PHP configuration: max_execution_time and max_input_time

Can I set the following PHP configuration parameters as follows?
max_execution_time = 360
max_input_time = 360
Is that safe and efficient?
I actually need my users to upload large videos with the PHP-based Content Management System, so each video upload takes a few minutes. Do I need to change both, and are those values good?
Thanks

As I understand it, you don't need to change either of them.
If you just store the video files using move_uploaded_file, you will not need to increase your max_execution_time as upload time does not count towards execution time.
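For illustration, a minimal sketch of such a handler is below; the form field name videofile and the uploads/ directory are assumptions, not anything from the question:
<?php
// upload.php: minimal sketch of storing an uploaded video with move_uploaded_file().
// The field name "videofile" and the uploads/ directory are assumptions for illustration.
$targetDir = __DIR__ . '/uploads';

if (!isset($_FILES['videofile']) || $_FILES['videofile']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed');
}

$target = $targetDir . '/' . basename($_FILES['videofile']['name']);

// The network transfer has already finished before this script starts running;
// moving the temporary file to its final location is quick.
if (move_uploaded_file($_FILES['videofile']['tmp_name'], $target)) {
    echo 'Stored ' . htmlspecialchars(basename($target));
} else {
    http_response_code(500);
    echo 'Could not move the uploaded file.';
}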
The manual says the following about max_input_time (emphasis mine):
This sets the maximum time in seconds a script is allowed to parse input data, like POST, GET and file uploads.
I have not tested this, but to me this sounds like it doesn't include the actual time the client spends uploading the file, just the time it takes to copy it to the temporary directory. I can't vouch for this though, and I can't find any info on it. The default of 60 seconds should be ample time to parse many hundreds of megabytes of files.
I'd recommend finding the right values through real-life tests. If your connection is too fast, use a tool to slow it down. See this SO question for suggestions:
Network tools that simulate slow network connection

In my case, max_input_time does affect my move_uploaded_file function. I failed to upload a 3GB file with default setting (max_input_time=60) but it succeeded with a larger value (max_input_time=300).
My PHP version is 7.2.19 on a LAMP environment.

By default my server has max_input_time as -1. I'm assuming that means infinite.

Related

Is there a reason for keeping max_file_uploads at a low value?

So while configuring a server for uploading files I noticed the default for max_file_uploads is 20 files. Is there any reason to keep this at a low value or is it safe to up it to 100 files?
That depends on your server resources (bandwidth, memory, CPU, etc.). If you have a powerful server and you really need to accept 100 uploaded files at the same time, go ahead and change it to 100; otherwise keep it as low as possible.
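If you do raise it, one way to sanity-check the setting at runtime is to compare what actually arrived in $_FILES against the configured limit; this is just a sketch, and the multi-file field name files[] is an assumption:
<?php
// Sketch: compare how many files arrived via a multi-file field
// (e.g. <input type="file" name="files[]" multiple>) with the configured limit.
// The field name "files" is an assumption for illustration.
$limit    = (int) ini_get('max_file_uploads');
$received = isset($_FILES['files']) ? count($_FILES['files']['name']) : 0;

echo "max_file_uploads is {$limit}, received {$received} file(s)\n";
// Files submitted beyond the limit are ignored by PHP, so $received
// should not exceed $limit.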

Increasing the max concurrent file uploads on a LAMP server

I have an issue where my client wants to be able to upload more than 2 large files at a time, and I was wondering if any of you have a solution to this. I'm pretty sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file that is less than or equal to 1 GB it works fine, and they can open a new tab and upload another large file, but when they open a third tab of that form and try to upload, I get an IO error from the form. So I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP) I can upload there as well, so it seems to be a limit on concurrent uploads per client/IP.
I think increasing the memory assigned to a PHP script can help you. I don't think this is the most appropriate solution, but when I was having problems handling lots of big files, I increased the memory available to the script. If you are developing this for a very busy site I don't recommend this method, but it is worth trying:
ini_set('memory_limit','128M');
For testing, if you use -1 instead of 128M, PHP will give the script unlimited memory. You can use this to check whether the problem is caused by the memory limit.

PHP file_get_contents() Timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, the target file being fetched is 200 MB.
Will this process time out if downloading to the server takes too long?
If so, is there a way to extend this timeout?
Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?
I am just trying to make sure that I know what my options and limitations are before I go much further.
Thank you for your time.
Yes, you can use set_time_limit(0) and the max_execution_time directive to cancel the time limit imposed by PHP.
You can open a stream of the file, and transfer it to the user seamlessly.
Read about fopen()
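As a rough sketch of that idea (the URL and filename are placeholders, and this assumes allow_url_fopen is enabled), you can read the remote file in chunks and echo them straight to the client so the whole 200 MB never sits in memory at once:
<?php
// Sketch: proxy a large remote file to the user in chunks instead of
// loading it all at once with file_get_contents(). URL and filename are placeholders.
set_time_limit(0); // remove PHP's script time limit for this long transfer

$remote = fopen('https://example.com/big-video.mp4', 'rb');
if ($remote === false) {
    http_response_code(502);
    exit('Could not open remote file');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-video.mp4"');

while (!feof($remote)) {
    echo fread($remote, 8192); // send 8 KB at a time
    flush();                   // push buffered output to the client
}
fclose($remote);
stream_copy_to_stream($remote, fopen('php://output', 'wb')) would do the same job as the loop; the point is that the transfer is chunked rather than buffered in full.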
If not a timeout you may well run into memory issues depending on how your PHP is configured. You can adjust a lot of these settings manually through code without much difficulty.
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');

PHP POST requests timeout

I'm currently working on an upload script supporting larger uploads (~50 MB) and I have very rapidly run into a problem! I'm using the traditional POST request with a form uploading the file to a temp location and later moving it with PHP. Naturally I've updated my php.ini file to support slightly larger than default files, and files around 15 MB upload really well!
The main problem is my hosting company. They let scripts time out after 60 seconds, meaning that POST requests taking longer than 60 seconds to complete will die before the temp file reaches the PHP script, which naturally yields an internal server error.
Not being able to crank up the timeout on the server (after heated debates), I'm considering the options. Is there a way to bump the request or somehow refresh it to notify the server and reset the timing? Or are there alternative upload methods that don't time out?
There are a few things you could consider. Each has a cost, and you'll need to determine which one is least costly.
1. Get a new hosting company. This may be your best solution.
2. Design a rather complex client-side system that breaks the upload into multiple chunks and submits them via AJAX. This is ugly, especially since it is only useful for getting around a host rule; see the rough sketch after this answer.
I'd really research #1.
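For completeness, a very rough sketch of the server side of option 2 follows; every name in it (the chunk field, the chunkIndex/totalChunks parameters, the target path) is a hypothetical example rather than the API of any particular uploader library:
<?php
// Sketch: receive one chunk per POST request and append it to a partial file.
// The client (e.g. an AJAX uploader) is assumed to send chunkIndex, totalChunks
// and the original filename with each chunk; all of these names are hypothetical.
$chunkIndex  = (int) ($_POST['chunkIndex'] ?? 0);
$totalChunks = (int) ($_POST['totalChunks'] ?? 1);
$name        = basename($_POST['filename'] ?? 'upload.bin');
$partial     = sys_get_temp_dir() . '/' . $name . '.part';

if (!isset($_FILES['chunk']) || $_FILES['chunk']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('chunk missing');
}

// Each request carries only one small chunk, so it finishes well inside
// the host's 60-second limit.
file_put_contents($partial, file_get_contents($_FILES['chunk']['tmp_name']), FILE_APPEND);

if ($chunkIndex + 1 === $totalChunks) {
    rename($partial, __DIR__ . '/uploads/' . $name); // last chunk: move into place
    echo 'complete';
} else {
    echo 'chunk ' . $chunkIndex . ' stored';
}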
With great difficulty. By far your easiest option is to dump the hard-headed host and pick one that actually lets you be productive. I personally use TSOHost - been with them for over a year and a half and have so far had absolutely no reason to complain (not even a slight annoyance).
Are you really sure it's a timeout issue? My first idea: the transfer failed due to a configuration limit set in the web server's php.ini file. You need to change it there, or override it locally where the directive allows it.
# find it in php.ini used by your configuration
memory_limit = 96M
post_max_size = 64M
upload_max_filesize = 64M
Or directly in your script (note that post_max_size and upload_max_filesize are PHP_INI_PERDIR directives, so ini_set() cannot change them at runtime; they have to go in php.ini or .htaccess, and only the memory_limit call below actually takes effect):
ini_set('memory_limit', '96M');
ini_set('post_max_size', '64M');
ini_set('upload_max_filesize', '64M');

Best approach for speedy debug and efficiently fixing large image files upload failure

Consider a normal PHP image upload feature (not using AJAX) where large image uploads fail occasionally, less frequently on one test server and more frequently on another. Assuming the debugging has not yet started and there are no file/folder permission issues, how should one proceed?
I am sure I have file_uploads on. I do not want to just blindly set some safe values or keep increasing the values until it works. Basically, I want the values to match exactly what the modules concerned actually need. I am ready to override the settings in those modules, if that is the best approach.
These are all the settings relevant/related to file uploads:
* file_uploads
* upload_max_filesize
* max_input_time
* memory_limit
* max_execution_time
* post_max_size
Finding parameters/values for the script concerned
To find out which of these limits my script is actually violating and causing the failure, I first need to measure the corresponding values for my script. How can I find the following values for my script:
Total uploaded files size
Input time
Memory usage
Script execution time
Posted data size
Which tool(s) can be used for this? Using PHP code, I think I can find out a few:
Script execution time - Difference between microtime(true) at script start and end.
Total Uploaded file size - Foreach loop on $_FILES to find the sum of ['size'] attribute
How can I find out the rest, like memory usage, input time, etc.?
Where/How to override
Finally, once I have found the violated setting(s), suppose I need to increase/override the values for two of them. Where should I apply the override? I guess it is not correct to set memory_limit etc. for all modules in .htaccess or in the PHP script; rather, applying it only in the module concerned is better. Am I correct?
Settings for less demanding modules
Also, for other modules where fewer resources are needed, is it good/wise to override the settings to reduce them, after carefully studying the modules' resource requirements? Would that cut unnecessary resource consumption? If so, how about having two or three combinations of these settings (depending on the project requirements, named e.g. normal-script and heavy-file-upload) and calling a single function to load one combination for each module?
memory_limit precautions
Regarding memory_limit, it is mentioned here that:
Setting too high a value can be very dangerous because if several uploads are being handled concurrently all available memory will be used up, and other unrelated scripts that consume a lot of memory might affect the whole server as well.
What general precautions to take about this?
Thanks,
Sandeepan
A few ideas for debugging:
For manual testing, I would prepare a series of images with different dimensions whose net size (width x height) increases in small steps: 100 x 100, 100 x 200, 100 x 300 .... and try them. At some point, they could start failing if the problem is the memory limit. You could turn on error_reporting() for yourself only (maybe using a debugging cookie of some sort) so you see what exactly fails.
If that's not an option, I would set up a mechanism of some sort for long-term logging that stores the image's dimensions into a log file or table before the resizing starts, and also the contents of the $_FILES array. At the successful end of the script, add an "OK" to that entry. That way, you will be able to find out more about the failed uploads, if they make it through to the script (and don't fail beforehand due to a timeout setting).
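A minimal sketch of such a logging wrapper, which also covers the question about measuring total upload size, peak memory and execution time, might look like this; the log path and the exact fields logged are assumptions for illustration:
<?php
// Sketch: log upload details before processing, then mark the entry OK at the end.
$log   = __DIR__ . '/upload-debug.log';          // hypothetical log location
$start = microtime(true);

$totalSize = 0;
foreach ($_FILES as $file) {
    $totalSize += (int) $file['size'];           // total bytes received
}

file_put_contents($log, date('c') . ' FILES=' . json_encode($_FILES)
    . ' total_bytes=' . $totalSize . "\n", FILE_APPEND);

// ... image resizing / move_uploaded_file() work happens here ...

// If the script dies from a memory or time limit, this line is never reached,
// so entries without a matching OK point at the failed requests.
file_put_contents($log, date('c') . ' OK peak_memory=' . memory_get_peak_usage(true)
    . ' elapsed=' . round(microtime(true) - $start, 2) . "s\n", FILE_APPEND);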
Also, for other modules, where much resources are not needed, is it good/wise to override the settings to reduce them
I think the answer is always "no". As far as I know, the memory limit is the maximum amount of memory a request may allocate, not an amount that is reserved for every request. I have never heard of anybody fine-tuning the memory limit in this way.
However, if some parts of the system (e.g. the image resizer) require an enormously high memory limit, it may be wise to apply specific memory_limit settings only to them, e.g. through a .htaccess setting.
