I have a PHP script that fetches a large amount of data from several sites and writes it to files on the server, producing files greater than 500 MB. The process fails partway through with a 500 INTERNAL SERVER ERROR. How can I adjust PHP's timeout so that the process runs until it completes?
If you want to increase the maximum execution time for your scripts, change the value of the following setting in your php.ini file:
max_execution_time = 60
If you want more memory for your scripts, change this:
memory_limit = 128M
One more thing: if you do heavy processing of the input (GET or POST), you may need to increase this as well:
max_input_time = 60
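If you cannot edit php.ini, two of these can also be raised from inside the script at runtime; a minimal sketch (max_input_time is the exception: it cannot be changed from a running script, because input parsing finishes before the script starts):
<?php
// Run with no time limit (0 = unlimited); overrides max_execution_time for this request.
set_time_limit(0);
// Raise the memory ceiling for this request only.
ini_set('memory_limit', '1024M');
// ... long-running fetch-and-write work goes here ...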
You have to adjust some settings in php.ini to solve this problem. There are several options that could be the cause. Could you please post your php.ini config? Which kind of web server do you use? Apache?
In a PHP application I am uploading 20-30 files at once. Each file is around 100-200 MB, which means I am uploading more than 2 GB of data to the server.
Because the upload takes around 20-30 minutes, a general AJAX polling job gets cancelled after some time.
I have following configuration:
upload_max_filesize = 4096M
post_max_size = 4096M
max_input_time = 600
max_execution_time = 600
During this process my CPU consumption only goes up to 10-20%. I have a 32 GB RAM, 12-core Linux machine.
The application is running on PHP 8.0, Apache 2, MySQL 8, Ubuntu 20.
Can anyone suggest what else I can check?
max_execution_time: This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
max_input_time: This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. Timing begins at the moment PHP is invoked at the server and ends when execution begins. The default setting is -1, which means that max_execution_time is used instead. Set to 0 to allow unlimited time.
I would change them to:
max_input_time = 1800 & max_execution_time = 1800 (30 minutes)
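To confirm the new values are actually in effect for web requests (a php.ini change usually requires an Apache or PHP-FPM restart), a quick check with ini_get():
<?php
// Print the effective limits as PHP sees them for this request.
echo 'max_input_time: ', ini_get('max_input_time'), "\n";
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";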
I need to run a really slow PHP/MySQL script once, on my local server.
The problem is that Laravel times out after 60 seconds with the message "Maximum execution time of 60 seconds exceeded".
I have set
max_execution_time = 360
and
max_input_time = 360
in my php.ini. The settings are there (checked phpinfo()) but Laravel still times out after 60 seconds. Is there anything in Laravel that I can set as well?
I don't think Laravel overrides PHP settings. After changing the settings in your ini file, you have to restart the server for them to take effect.
So check whether you restarted your server after changing the settings.
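If the change is only needed for this one slow script, you can also lift the limit from inside the script itself instead of globally; a minimal sketch:
<?php
// Remove the execution time limit for this request only (0 = unlimited);
// equivalent to ini_set('max_execution_time', '0').
set_time_limit(0);
// ... run the slow PHP/MySQL work here ...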
I'm calling an MLS service that responds with 4000+ records ... and I need to process each and every one of them, as well as insert all of the meta data per listing.
I'm able to get to about 135 (* 150 meta records) and then the script apparently stops responding, or at least stops processing the rest of the data.
I've added the following to my .htaccess file:
php_value memory_limit 128M
But this doesn't seem to help me any. Do I need to process chunks of the data at a time, or is there another way to ensure that the script will indeed finalize?
You should probably enable display_errors and error_reporting to get a better analysis of why the script isn't processing.
However, you should also consider making sure the time limit isn't being hit by calling:
set_time_limit( 0 );
This will give you an unlimited time period. You can also just set it to something relatively high, like 600 (10 minutes).
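Putting both suggestions together, a minimal sketch for the top of the import script:
<?php
// Surface errors instead of failing silently.
error_reporting(E_ALL);
ini_set('display_errors', '1');
// Remove the execution time limit so all 4000+ records can be processed.
set_time_limit(0);
// ... fetch and insert the MLS records and their meta data here ...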
It isn't the memory; it's most likely the script execution time.
Try adding this to your htaccess, then restart apache:
php_value max_execution_time 259200
I'm looking into what is the best value to set for defaults in PHP. I've seen many contradicting points about max_input_time.
This answer suggests that file uploads are not counted towards these timers:
https://stackoverflow.com/a/3758522/518169
While on the official PHP documentation, there is a huge red warning saying:
max_input_time sets the maximum time, in seconds, the script is allowed to receive input; this includes file uploads. For large or multiple files, or users on slower connections, the default of 60 seconds may be exceeded
Source: http://php.net/manual/en/features.file-upload.common-pitfalls.php, last updated: Fri, 06 Jul 2012
So from this it seems that max_input_time does affect file uploads, and to be sure that visitors can upload, say, 20 MB files even from slow or mobile connections, the default value of 60 is definitely not enough!
What do you recommend setting this value to? 300?
Also, is there any relationship between max_execution_time and max_input_time? For example like that max_execution_time needs to be bigger than max_input_time?
After some quick benchmarking I do not believe max_input_time has any bearing on handling large uploads by users with slow connections.
From http://us3.php.net/manual/en/info.configuration.php#ini.max-input-time
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. It is measured from the moment of receiving all data on the server to the start of script execution.
I'm using PHP 5.3.8 and used the following .htaccess config
php_value max_input_time 5
php_value max_execution_time 1
php_value upload_max_filesize "2048M"
php_value post_max_size "2048M"
My test script is:
<?php
// Dump the upload metadata once a file has been received.
if (!empty($_FILES)) {
    echo '<pre>';
    var_dump($_FILES);
    echo '</pre>';
}
?>
<form enctype="multipart/form-data" method="POST">
    File: <input name="userfile" type="file" />
    <input type="submit" value="Upload" />
</form>
Over several trials, my 1.5 GB file takes around 16-17 seconds to upload, 4-5 seconds to process, and execution time is essentially 0.
With max_input_time 5 the script completes. With it set to 4 we get PHP Fatal error: Maximum execution time of 4 seconds exceeded in Unknown on line 0, referer: http://localhost/test-upload.php
It also seems max_execution_time has no bearing since we kept it at 1 throughout the tests.
I did an extensive study on max_input_time. Network transfer time is not a factor: PHP as an Apache handler (mod_php) and an Nginx/PHP-FPM pair yielded similar results. PHP gets the uploaded file once the transfer is completed and the web server hands the data over. In my tests, a max_input_time of 2 seconds was enough to handle an 800 MiB upload.
All the details are at http://blog.hqcodeshop.fi/archives/185-PHP-large-file-uploads.html
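If you want to observe this on your own setup, one rough check (available since PHP 5.4) is to compare the request start time recorded by the server API with the moment your script begins executing; a minimal sketch:
<?php
// REQUEST_TIME_FLOAT is set when PHP is invoked; by the time this line runs,
// input parsing (the phase max_input_time governs) has already finished.
$inputSeconds = microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'];
echo 'Time between PHP invocation and script start: ', round($inputSeconds, 3), " s\n";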
It's going to depend on how PHP is bridged to the web server.
Technically it's possible for the web server to invoke PHP as soon as it has the request headers, in which case PHP will be twiddling its thumbs waiting for the POST data to come across the internet before it can populate the request variables (and it's quite possible that max_input_time will be exceeded). But more commonly, the web server delays the invocation of PHP until it has the full request, in which case it's much less likely that max_input_time will be exceeded.
As of PHP 5.4, PHP file uploads can definitely be affected by max_input_time. I recently was getting a 500 error on files that took longer than 60 seconds to upload. I changed this single value in my php.ini and it went away.
In addition, the wording in the manual is different now from what is quoted in the accepted answer. It now says:
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. Timing begins at the moment PHP is invoked at the server and ends when execution begins.
I was using PHP 5.4.16 nts and IIS 7.5. Apparently, PHP is invoked before the file is uploaded.
One interesting thing to note is that my PHP error logs gave the wrong error: they stated "PHP Fatal error: Maximum execution time of 10000 seconds exceeded in...". No matter what I set max_execution_time to, the same error appeared with the new number.
I am having a very common problem for which none of the available solutions seem to work.
We have a LAMP server which receives a high amount of traffic. On this server we perform a regular file submission upload. Small file uploads work perfectly, but uploads of around 4-5 MB fail intermittently (sometimes they work, but many times they fail).
We have the following configuration on our PHP:
max_input_time: 600
max_execution_time: 600
upload_max_filesize: 10M
post_max_size: 10M
Apache setting:
Timeout: 600
Keep-Alive Timeout: 15
Keep-Alive: On
Per Child: 1000
Max Conn: 100
Thus, I wonder if anyone can help me with this. We have found issues and solutions online, but none of them work in our case.
Thank you so much. Any input / feedback is much appreciated!
The connection could be terminating at several places:
Apache
Post size limit inside of php.ini
Memory limit inside of php.ini
Input time limit inside of php.ini
Execution time limit inside of php.ini or set_time_limit()
I would increase all of these and see if the problem persists (see the sketch below); note that you will have to restart Apache for changes in php.ini to take effect.
Upload success is also affected by the end user's connection speed: if uploads fail only for certain users, it's because their connection is slower than others' and their connection with the server is terminating first.
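As a concrete starting point, here is a minimal sketch of the values to raise; the exact numbers are assumptions you should tune for your own traffic:
; php.ini - raise each limit named in the list above
post_max_size = 64M
upload_max_filesize = 64M
memory_limit = 256M
max_input_time = 600
max_execution_time = 600
# Apache httpd.conf - give slow connections more time to finish
Timeout 600
Remember to restart Apache after editing php.ini so the new values take effect.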