I'm looking into what the best default values are for PHP. I've seen many contradicting points about max_input_time.
The author of this answer believes that file uploads are not counted towards these timers:
https://stackoverflow.com/a/3758522/518169
While on the official PHP documentation, there is a huge red warning saying:
max_input_time sets the maximum time, in seconds, the script is
allowed to receive input; this includes file uploads. For large or
multiple files, or users on slower connections, the default of 60
seconds may be exceeded
Source: http://php.net/manual/en/features.file-upload.common-pitfalls.php, last updated: Fri, 06 Jul 2012
So from this it seems that max_input_time does affect file uploading, and to be sure that visitors can upload, say, 20 MB files even from slow or mobile connections, the default value of 60 is definitely not enough!
What do you recommend setting this value to? 300?
Also, is there any relationship between max_execution_time and max_input_time? For example like that max_execution_time needs to be bigger than max_input_time?
After some quick benchmarking I do not believe max_input_time has any bearing on handling large uploads by users with slow connections.
From http://us3.php.net/manual/en/info.configuration.php#ini.max-input-time
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. It is measured from the moment of receiving all data on the server to the start of script execution.
I'm using PHP 5.3.8 and used the following .htaccess config
php_value max_input_time 5
php_value max_execution_time 1
php_value upload_max_filesize "2048M"
php_value post_max_size "2048M"
My test script is:
<?php
if (!empty($_FILES)) {
    echo '<pre>';
    var_dump($_FILES);
    echo '</pre>';
}
?>
<form enctype="multipart/form-data" method="POST">
    File: <input name="userfile" type="file" />
    <input type="submit" value="Upload" />
</form>
Over several trials my 1.5 GB file takes around 16-17 seconds to upload, 4-5 seconds to process, and execution time is essentially 0.
With max_input_time 5 the script completes. With it set to 4 we get PHP Fatal error: Maximum execution time of 4 seconds exceeded in Unknown on line 0, referer: http://localhost/test-upload.php
It also seems max_execution_time has no bearing since we kept it at 1 throughout the tests.
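If you want to see that split for yourself, a minimal sketch like the following can help. It assumes PHP 5.4+ for $_SERVER['REQUEST_TIME_FLOAT'], and exactly which window it measures depends on how the web server hands the request to PHP, a point another answer below discusses:
<?php
// $_SERVER['REQUEST_TIME_FLOAT'] (PHP 5.4+) is the moment PHP was invoked
// for this request; the difference to "now" approximates the input/parse
// phase that max_input_time governs.
$inputPhase = microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'];
echo 'Input phase took roughly ', round($inputPhase, 3), " seconds\n";

// Everything from here on is the execution phase that max_execution_time covers.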
I did an extensive study on max_input_time. Network transfer time is not a factor. PHP as an Apache handler (mod_php) and an Nginx/PHP-FPM pair yielded similar results: PHP gets the uploaded file only once the transfer has completed and the web server hands the data over. In my tests a max_input_time of 2 seconds was enough to handle an 800 MiB upload.
All the details are at http://blog.hqcodeshop.fi/archives/185-PHP-large-file-uploads.html
It's going to depend on how PHP is bridged to the web server.
Technically it's possible for the web server to invoke PHP as soon as it has the request headers, in which case PHP is going to be twiddling its thumbs waiting for the POST data to come across the internet before it can populate the request variables (and it's quite possible that max_input_time will be exceeded). But more commonly, the web server will delay the invocation of PHP until it has the full request, in which case it's a lot less likely that max_input_time will be exceeded.
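As a rough way to see which kind of bridge you are dealing with, a small test page using only built-in functions can help:
<?php
// Which SAPI is handling the request: e.g. "apache2handler" (mod_php),
// "fpm-fcgi" (PHP-FPM), "cgi-fcgi" (CGI/FastCGI)
echo php_sapi_name(), "\n";

// The limits currently in effect for this request
echo ini_get('max_input_time'), "\n";
echo ini_get('max_execution_time'), "\n";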
As of PHP 5.4, PHP file uploads can definitely be affected by max_input_time. I was recently getting a 500 error on files that took longer than 60 seconds to upload. I changed this single value in my php.ini and it went away.
In addition, the wording in the manual is different now from what is quoted in the accepted answer. It now says:
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. Timing begins at the moment PHP is invoked at the server and ends when execution begins.
I was using PHP 5.4.16 nts and IIS 7.5. Apparently, PHP is invoked before the file is uploaded.
One interesting thing to note is that my PHP error logs gave the wrong error. They stated "PHP Fatal error: Maximum execution time of 10000 seconds exceeded in...". It didn't matter what I set max_execution_time to; it would give the same error with the new number.
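For reference, the single-value php.ini change described here would look something like this (300 is only an example figure, echoing the value floated in the question above, not something the manual prescribes):
max_input_time = 300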
In a PHP application I am uploading 20-30 files at once. Each file is around 100-200 MB, which means I am uploading more than 2 GB of data to the server.
Because the upload takes around 20-30 minutes, a general AJAX polling job gets cancelled after some time.
I have following configuration:
upload_max_filesize = 4096M
post_max_size = 4096M
max_input_time = 600
max_execution_time = 600
During this process my CPU consumption only goes up to 10-20%. I have a 32 GB RAM, 12-core Linux machine.
The application is running on PHP 8.0, Apache 2, MySQL 8 and Ubuntu 20.
Can anyone suggest what else I can check?
max_execution_time: This sets the maximum time in seconds a script is
allowed to run before it is terminated by the parser. This helps
prevent poorly written scripts from tying up the server. The default
setting is 30. When running PHP from the command line the default
setting is 0.
max_input_time: This sets the maximum time in seconds a script is
allowed to parse input data, like POST and GET. Timing begins at the
moment PHP is invoked at the server and ends when execution begins.
The default setting is -1, which means that max_execution_time is used
instead. Set to 0 to allow unlimited time.
I think you should change them to:
max_input_time = 1800
max_execution_time = 1800
(i.e. 30 minutes)
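After changing them, a quick way to confirm the new values are actually being picked up by the web SAPI (and not just shown on the CLI) is a tiny check script along these lines, using only standard ini_get() calls:
<?php
// Print the limits PHP is actually using for web requests.
// If these don't match what you put in php.ini, you are probably editing
// the wrong ini file or the web server hasn't been restarted yet.
foreach (['max_input_time', 'max_execution_time', 'upload_max_filesize', 'post_max_size'] as $directive) {
    echo $directive, ' = ', ini_get($directive), "\n";
}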
I have a WordPress blog and there is a plugin that performs around 20,000 SQL inserts on a user request. I noticed that the process takes a long time, which is normal, but the request times out at 30 seconds.
I checked the PHP settings and noticed that PHP's max_execution_time was 30 seconds, so I increased it to 90, but the request keeps timing out at 30 seconds (I even logged what ini_get('max_execution_time') returns and it says "30"). Then I checked whether there are any Apache directives that limit request time and found that there is a "Timeout" directive ( http://httpd.apache.org/docs/2.2/mod/core.html#timeout ).
Its value was 60 and I increased it to 90 as well, but the problem persists: the request still times out after 30 seconds, as it did before I changed anything.
As a note: I restart the server after I make any modification.
By modifying your PHP Settings
That’s not easy, as you need to have access to your server, or a way to change your PHP settings. If you have access to your php.ini, you need to look for the max_execution_time variable and set it to the number of seconds you would like, 60 seconds for example.
max_execution_time = 60
If that doesn’t work, or can’t access your php.ini, you can also try to set this variable by using the .htaccess (at the root of your WordPress install). You can add this line.
php_value max_execution_time 60
If you set the value to 0 (instead of 60, here), the process will be allowed to run forever. Don’t do this, you will run into much bigger issues, extremely difficult to resolve.
By calling a function in PHP
Doing this is generally not recommended. But if you know what you are doing, you can tell PHP that you would like the process to run for more time. For this, you could add this call in your theme, in functions.php for example.
set_time_limit(100);
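One thing worth knowing about set_time_limit() is that each call restarts the timeout counter, so for a batch job such as the 20,000 inserts above, a purely illustrative pattern like the following keeps any single iteration from hitting the limit (do_one_insert() is a hypothetical stand-in for the plugin's own insert logic, and this still won't help if Apache's own Timeout fires first):
<?php
foreach ($rows as $row) {
    // Restart the clock: each call gives the script another 60 seconds
    // measured from this point, so the whole batch may run much longer.
    set_time_limit(60);
    do_one_insert($row); // hypothetical helper performing one SQL insert
}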
I am using PHP to fetch a lot of data from several sites and write it to the server, creating files greater than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How can I adjust PHP's timeout so that the process runs until it is completed?
If you want to increase the maximum execution time for your scripts, then just change the value of the following setting in your php.ini file-
max_execution_time = 60
If you want more memory for your scripts, then change this-
memory_limit = 128M
One more thing: if you keep on processing the input (GET or POST), then you need to increase this as well-
max_input_time = 60
You have to adjust some settings in php.ini to solve this problem.
There are several options which could be causing the problem.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?
I am inserting a huge number of data values through a CSV file via an HTML form.
I have used set_time_limit(0); so that the script runs until the whole operation is completed.
Fatal error: Maximum execution time of 300 seconds exceeded in
C:\xampp\htdocs\clytics\include\clytics.database.php on line 135
Now, I am trying to catch this fatal error.
I have used set_time_limit(0); so that the script runs until the whole operation is completed.
Probably needs more memory as well. Basically your code is just a hog and you need to tame the way it is gobbling up resources.
But that is just a tangent on the overall architecture issues you might be facing.
Specific to the issue here, is there something in your code that would override that value of set_time_limit(0);?
Also, are you running this script via the command line or via PHP in Apache? Because the CLI php.ini config is 100% different from the Apache module's php.ini config.
For example, on Ubuntu the Apache PHP php.ini is here:
/etc/php5/apache2/php.ini
But the command line (CLI) php.ini is here:
/etc/php5/cli/php.ini
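If you're not sure which of those two files a given run is actually using, a throwaway check with built-in functions will tell you:
<?php
echo php_sapi_name(), "\n";        // "cli" vs. "apache2handler", etc.
echo php_ini_loaded_file(), "\n";  // full path of the php.ini actually loaded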
And if you want to brute force your script to run for as long as it needs regardless of your config settings, you can add this to the top of your PHP file:
ini_set('max_execution_time', 0);
If one reads up more on set_time_limit this comes up:
Set the number of seconds a script is allowed to run. If this is
reached, the script returns a fatal error. The default limit is 30
seconds or, if it exists, the max_execution_time value defined in the
php.ini.
Then reading up on max_execution_time this comes up:
This sets the maximum time in seconds a script is allowed to run
before it is terminated by the parser. This helps prevent poorly
written scripts from tying up the server. The default setting is 30.
When running PHP from the command line the default setting is 0.
But then, the magic 300 number shows up:
Your web server can have other timeout configurations that may also
interrupt PHP execution. Apache has a Timeout directive and IIS has a
CGI timeout function. Both default to 300 seconds. See your web server
documentation for specific details.
So now you know where the 300 comes from. But doing ini_set('max_execution_time', 0); should let you run the script without a PHP-side timeout.
And a final bit of info if none of that somehow works: look into max_input_time:
This sets the maximum time in seconds a script is allowed to parse
input data, like POST and GET. Timing begins at the moment PHP is
invoked at the server and ends when execution begins.
While max_input_time might not seem to be related, in some versions of PHP, there is a bug where max_input_time and max_execution_time are directly connected.
I am having a very common problem for which, it seems, none of the available solutions work.
We have a LAMP server which is receiving a high amount of traffic. Using this server, we perform a regular file submission upload. Small file uploads work perfectly, but uploads of around 4-5 MB fail intermittently (sometimes they work, but many times they fail).
We have the following configuration on our PHP:
max_input_time: 600
max_execution_time: 600
upload_max_filesize: 10M
post_max_size: 10M
Apache setting:
Timeout: 600
Keep-Alive Timeout: 15
Keep-Alive: On
Per Child: 1000
Max Conn: 100
Thus, I wonder if anyone can help me with this. We have found similar issues and solutions online, but none of them work in our case.
Thank you so much. Any input / feedback is much appreciated!
The connection could be terminated at several places:
Apache
Post size limit inside of php.ini
Memory limit inside of php.ini
Input time limit inside of php.ini
Execution time limit inside of php.ini or set_time_limit()
I would increase all of these and see if the problem still persists. But you will have to bounce Apache for the changes inside php.ini to take effect.
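As a rough sketch of what "increase all of these" might look like (the directive names are standard; the values are only examples bumped above the ones quoted in the question, not recommendations):
; php.ini
post_max_size = 50M
upload_max_filesize = 50M
memory_limit = 256M
max_input_time = 1200
max_execution_time = 1200

# Apache configuration (httpd.conf / apache2.conf)
Timeout 1200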
Whether these limits are hit also depends on the end user's connection speed: if uploads are failing only for certain users, it's because their connection is slower than others' and their connection with the server is being terminated.