In a PHP application I am uploading 20-30 files at once, each around 100-200 MB, which means I am uploading more than 2 GB of data to the server.
Because the upload takes around 20-30 minutes, a general AJAX polling job gets cancelled after some time.
I have following configuration:
upload_max_filesize = 4096M
post_max_size = 4096M
max_input_time = 600
max_execution_time = 600
During this process my CPU consumption only goes up to 10-20%. I have a Linux machine with 32 GB RAM and 12 cores.
The application is running on PHP 8.0, Apache 2, MySQL 8, Ubuntu 20.
Can anyone suggest what else I can check?
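Before tuning further, it is worth confirming which limits the web SAPI actually sees at runtime, since a virtual host, FPM pool, or .htaccess override can silently replace the php.ini values. A minimal diagnostic sketch (drop it into any web-reachable script and compare against your php.ini):

```php
<?php
// Print the limits PHP actually sees for this SAPI.
// These can differ from php.ini if a vhost, pool config,
// .htaccess, or ini_set() call overrides them.
foreach (['upload_max_filesize', 'post_max_size', 'max_input_time',
          'max_execution_time', 'memory_limit'] as $key) {
    echo $key, ' = ', ini_get($key), PHP_EOL;
}
```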
max_execution_time: This sets the maximum time in seconds a script is
allowed to run before it is terminated by the parser. This helps
prevent poorly written scripts from tying up the server. The default
setting is 30. When running PHP from the command line the default
setting is 0.
max_input_time: This sets the maximum time in seconds a script is
allowed to parse input data, like POST and GET. Timing begins at the
moment PHP is invoked at the server and ends when execution begins.
The default setting is -1, which means that max_execution_time is used
instead. Set to 0 to allow unlimited time.
I think you should change these:
max_input_time = 1800 and max_execution_time = 1800 (30 minutes)
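In php.ini form, that change (assuming a 30-minute window is enough for your slowest uploads) would look like this:

```ini
; allow up to 30 minutes for request parsing and for script execution
max_input_time = 1800
max_execution_time = 1800
```

Remember to restart Apache (or PHP-FPM) afterwards so the new values take effect.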
Related
I need to run a really slow PHP/MySQL script once, on my local server.
The problem is that Laravel times out after 60 seconds with the message "Maximum execution time of 60 seconds exceeded".
I have set
max_execution_time = 360
and
max_input_time = 360
in my php.ini. The settings are there (checked phpinfo()) but Laravel still times out after 60 seconds. Is there anything in Laravel that I can set as well?
I don't think Laravel will override PHP settings. After changing the settings in your ini file, you have to restart the server for them to take effect.
So check whether you restarted your server after changing the settings.
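For reference, on a typical Ubuntu/Apache setup the restart and a quick check might look like this; the exact service names depend on your distribution and PHP version, so treat them as assumptions:

```shell
# restart Apache (mod_php) – use php-fpm's service instead if you run FPM
sudo systemctl restart apache2
# sudo systemctl restart php8.0-fpm

# confirm the CLI value; the web SAPI can still differ, so check phpinfo() too
php -i | grep max_execution_time
```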
I am using PHP to fetch a lot of data from several sites and write it to the server, producing files greater than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How do I adjust PHP's timeout so that the process runs until it completes?
If you want to increase the maximum execution time for your scripts, then just change the value of the following setting in your php.ini file:
max_execution_time = 60
If you want more memory for your scripts, then change this:
memory_limit = 128M
One more thing: if you spend a long time parsing input (GET or POST), then you need to increase this as well:
max_input_time = 60
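One caveat worth adding: of these three, only max_execution_time and memory_limit are PHP_INI_ALL and can be raised from inside the script itself; max_input_time (like upload_max_filesize and post_max_size) is PHP_INI_PERDIR and must be changed in php.ini or .htaccess. A minimal sketch of the runtime-settable pair:

```php
<?php
// PHP_INI_ALL settings can be raised per request:
ini_set('max_execution_time', '600');
ini_set('memory_limit', '512M');

// PHP_INI_PERDIR settings such as max_input_time,
// upload_max_filesize and post_max_size are already fixed
// by the time the script runs – ini_set() on them is a no-op.
```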
You have to change some settings in your php.ini to solve this problem; several options could be the cause.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?
I have developed a page to upload a big CSV file (more than 7000 rows) to the database. Sometimes it works fine, sometimes not; sometimes I get an error like "Data not received" in the middle of the upload, but the values are inserted into the database up to that moment. I have checked the php.ini on the server:
max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
memory_limit = 128M ; Maximum amount of memory a script may consume (128MB)
The server hosts a lot of projects, all of them using Zend Framework, but I am using plain PHP for my project. My system admin said the php.ini file is here: /usr/local/zend/etc/php.ini
Any suggestions?
I have a VPS that runs XAMPP and gives service to an iPhone App that I made.
I used ASIHTTPRequest to upload files to the server.
The App sends files to the server, but the server accepts only those lighter than 2 MB.
I also checked with Wireshark and found this warning:
PHP Fatal error: Maximum execution time of 60 seconds exceeded in c:/xxx/index.php in line 2
In line 2 I wrote: session_start();
My theory is that two things are blocking big files from entering my server:
Some kind of file size limit
Some kind of time limit per action
I really need help on this one. Thanks!
Check the settings in your php.ini file which, when running XAMPP, can be found in the *root*/php/ directory.
; Make sure file uploads are turned on
file_uploads = On
; Change the max upload size to 100 MB
upload_max_filesize = 100M
; Change the max post size to 100 MB
post_max_size = 100M
; Change the max upload time to 900 seconds
max_input_time = 900
; This is where you are seeing your problem, as the script execution is timing out.
; Change the max execution time of the script to 900 seconds
max_execution_time = 900
Check the following lines in your php.ini file:
upload_max_filesize = 2M
max_execution_time = 300
You might have to restart your server afterwards.
The error says: Maximum execution time of 60 seconds exceeded
This makes me think that your internet connection is slow, so the upload takes longer than max_execution_time.
To see what max_execution_time currently is:
$maxtime = ini_get('max_execution_time');
echo $maxtime;
To raise max_execution_time for the current page, put this line at the top of your PHP file:
ini_set("max_execution_time", 600);
Put this at the top of your index.php:
<?php
ini_set('max_execution_time', 180); // the number of seconds that you want
upload_max_filesize can't be changed at runtime, so you need to increase that value in your php.ini.
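If you can't edit php.ini, note that upload_max_filesize and post_max_size are PHP_INI_PERDIR, so under mod_php they can also be raised per directory in .htaccess (this is ignored under FPM/FastCGI, and the 200M/210M values below are placeholders, not recommendations):

```apacheconf
# .htaccess – honoured by mod_php only
php_value upload_max_filesize 200M
php_value post_max_size 210M
```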
I'm looking into what is the best value to set for defaults in PHP. I've seen many contradicting points about max_input_time.
This answer says that he believes file uploading is not counted towards timers:
https://stackoverflow.com/a/3758522/518169
While on the official PHP documentation, there is a huge red warning saying:
max_input_time sets the maximum time, in seconds, the script is
allowed to receive input; this includes file uploads. For large or
multiple files, or users on slower connections, the default of 60
seconds may be exceeded
Source: http://php.net/manual/en/features.file-upload.common-pitfalls.php, last updated: Fri, 06 Jul 2012
So from this it seems that max_input_time does affect file uploading, and to be sure that visitors can upload, say, 20 MB files even from slow or mobile connections, the default value of 60 is definitely not enough!
What do you recommend setting this value to? 300?
Also, is there any relationship between max_execution_time and max_input_time? For example like that max_execution_time needs to be bigger than max_input_time?
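As a rough sanity check on that 60-second default, here is a back-of-the-envelope calculation; the 20 MB file size and 256 kbit/s link speed are illustrative assumptions, not measurements:

```php
<?php
// How long does a 20 MB upload take over a 256 kbit/s connection?
$fileBytes = 20 * 1024 * 1024;     // 20 MB
$bytesPerSecond = 256 * 1000 / 8;  // 256 kbit/s = 32,000 bytes/s
echo ceil($fileBytes / $bytesPerSecond), " seconds\n"; // 656 seconds
```

So if your SAPI does count upload time against max_input_time, even 300 seconds would be too low for that case; something like 900 leaves headroom.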
After some quick benchmarking I do not believe max_input_time has any bearing on handling large uploads by users with slow connections.
From http://us3.php.net/manual/en/info.configuration.php#ini.max-input-time
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. It is measured from the moment of receiving all data on the server to the start of script execution.
I'm using PHP 5.3.8 and used the following .htaccess config
php_value max_input_time 5
php_value max_execution_time 1
php_value upload_max_filesize "2048M"
php_value post_max_size "2048M"
My test script is:
<?php
if (!empty($_FILES)) {
echo '<pre>';
var_dump($_FILES);
echo '</pre>';
}
?>
<form enctype="multipart/form-data" method="POST">
File: <input name="userfile" type="file" />
<input type="submit" value="Upload" />
</form>
With several trials my 1.5G file takes around 16-17 seconds to upload, 4-5 seconds to process, and execution time is essentially 0.
With max_input_time 5 the script completes. With it set to 4 we get PHP Fatal error: Maximum execution time of 4 seconds exceeded in Unknown on line 0, referer: http://localhost/test-upload.php
It also seems max_execution_time has no bearing since we kept it at 1 throughout the tests.
I did an extensive study on max_input_time. Network transfer time is not a factor. PHP as an Apache handler (mod_php) and an Nginx/PHP-FPM pair yielded similar results: PHP gets the uploaded file once the transfer is completed and the web server hands the data over. In my tests a max_input_time of 2 seconds was enough to handle an 800 MiB upload.
All the details are at http://blog.hqcodeshop.fi/archives/185-PHP-large-file-uploads.html
It's going to depend on how PHP is bridged to the web server.
Technically it's possible for the web server to invoke PHP as soon as it has the request headers, in which case PHP is going to be twiddling its thumbs waiting for the POST data to come across the internet before it can populate the request variables (so it's quite possible that max_input_time will be exceeded). But more commonly, the web server will delay the invocation of PHP until it has the full request (in which case it's a lot less likely that max_input_time will be exceeded).
As of PHP 5.4, PHP file uploads can definitely be affected by max_input_time. I recently was getting a 500 error on files that took longer than 60 seconds to upload. I changed this single value in my php.ini and it went away.
In addition, the wording in the manual is different now from what is quoted in the accepted answer. It now says:
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. Timing begins at the moment PHP is invoked at the server and ends when execution begins.
I was using PHP 5.4.16 NTS and IIS 7.5. Apparently, PHP is invoked before the file is uploaded.
One interesting thing to note: my PHP error logs gave the wrong error. They stated "PHP Fatal error: Maximum execution time of 10000 seconds exceeded in...". It didn't matter what I set max_execution_time to; the log reported the same error with the new number.