This question already has answers here:
Best way to manage long-running php script?
(16 answers)
Closed 5 years ago.
I'm trying to download a file from a remote server with file_put_contents. The script is called via AJAX. The problem I'm having is that the script times out when the file is large, e.g. 500 MB, and I get 504 Gateway Timeout - nginx.
download.php
// Local path where the downloaded file will be saved
$destination = "/home/mywebsite/public_html/wp-content/channels/videos/test.mp4";
// Remote file to fetch (roughly 500 MB)
$link = "http://www.example.com/videos/movie.mp4";
// Stream the remote file to disk; returns the number of bytes written or false on failure
$result = file_put_contents($destination, fopen($link, 'r'));
I'm using dedicated hosting. I've changed my php.ini and confirmed the new values with phpinfo():
max_execution_time 7200
max_input_time 7200
max_input_vars 1000
memory_limit -1
output_buffering 4096
post_max_size 1200M
upload_max_filesize 1000M
The script still times out. When I check the directory, the file has downloaded successfully, but the page times out before it finishes, so I can't return any data via AJAX. Is there another solution? How do I fix this?
You should also change the nginx FastCGI timeout values. The PHP script keeps executing, but your connection between nginx and PHP times out.
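For example, a minimal sketch of the relevant nginx directives (the socket path is an assumption, the 7200s values simply mirror the max_execution_time above, and nginx needs a reload afterwards):
# inside the server/location block that passes PHP requests to PHP-FPM
location ~ \.php$ {
    fastcgi_pass unix:/var/run/php-fpm.sock;   # assumed socket path; use your own
    fastcgi_connect_timeout 60s;               # time allowed to establish the connection
    fastcgi_send_timeout 7200s;                # time allowed to send the request to PHP
    fastcgi_read_timeout 7200s;                # time nginx waits for PHP to answer (the usual 504 culprit)
    include fastcgi_params;
}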
Alternatively, make the download asynchronous: one process only writes download requests to a database table or RabbitMQ, and another process consumes them (a cron job, for example).
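A minimal sketch of that idea, assuming a hypothetical downloads table with id, url and status columns (the table, columns and DB credentials are illustrative, not from the question):
<?php
// enqueue.php - called from the AJAX request: just record the job and return immediately
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass'); // assumed credentials
$stmt = $pdo->prepare("INSERT INTO downloads (url, status) VALUES (?, 'pending')");
$stmt->execute([$_POST['url']]);
echo json_encode(['queued' => true]);

<?php
// worker.php - run from cron, e.g. every minute: php /path/to/worker.php
set_time_limit(0); // no execution limit while copying large files
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass'); // assumed credentials
$dir = '/home/mywebsite/public_html/wp-content/channels/videos/';
foreach ($pdo->query("SELECT id, url FROM downloads WHERE status = 'pending'") as $job) {
    $ok = file_put_contents($dir . basename($job['url']), fopen($job['url'], 'r'));
    $pdo->prepare("UPDATE downloads SET status = ? WHERE id = ?")
        ->execute([$ok === false ? 'failed' : 'done', $job['id']]);
}
The AJAX call now returns immediately, and the browser can poll the downloads table (or another endpoint) to see when the file is ready.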
Related
I am using PHP to fetch a lot of data from several sites and write it to files on the server; the files grow larger than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How do I adjust PHP's timeout so that the process runs until it is completed?
If you want to increase the maximum execution time for your scripts, change the value of the following setting in your php.ini file:
max_execution_time = 60
If you want more memory for your scripts, change this:
memory_limit = 128M
One more thing: if you keep on processing input (GET or POST), you need to increase this as well:
max_input_time = 60
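If you can't edit php.ini, the same limits can usually be raised at runtime at the top of the script. A sketch (the values are illustrative, and some hosts lock these settings down):
<?php
ini_set('max_execution_time', '600'); // seconds the script may run
ini_set('memory_limit', '512M');      // memory ceiling for this request
set_time_limit(600);                  // equivalent way to reset the execution clock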
You have to adjust some settings in php.ini to solve this problem; there are several options that could be the cause.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?
This question already has answers here:
ini_set, set_time_limit, (max_execution_time) - not working
(3 answers)
Closed 8 years ago.
I am inserting a huge number of data values from a CSV file uploaded via an HTML form.
I have used set_time_limit(0); so that the script runs until the whole operation is complete.
Fatal error: Maximum execution time of 300 seconds exceeded in
C:\xampp\htdocs\clytics\include\clytics.database.php on line 135
Now, I am trying to catch this fatal error.
Probably needs more memory as well. Basically your code is a resource hog and you need to tame the way it is gobbling up resources.
But that is just a tangent on the overall architecture issues you might be facing.
Specific to the issue at hand: is there something in your code that would override that value of set_time_limit(0);?
Also, are you running this script via the command line or through PHP in Apache? Because the CLI php.ini config is 100% different from the Apache module's php.ini config.
For example, on Ubuntu the Apache PHP php.ini is here:
/etc/php5/apache2/php.ini
But the command line (CLI) php.ini is here:
/etc/php5/cli/php.ini
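If you're not sure which php.ini is actually in effect, PHP itself can tell you; a quick check (not specific to Ubuntu):
<?php
// Prints the path of the php.ini loaded for this SAPI (CLI and Apache usually give different answers)
echo php_ini_loaded_file(), PHP_EOL;
On the command line, php --ini prints the same information.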
And if you want to brute-force your script to run as long as it needs regardless of your config settings, you can add this to the top of your PHP file:
ini_set('max_execution_time', 0); // 0 means no limit; note the directive name must be lowercase
If one reads up more on set_time_limit this comes up:
Set the number of seconds a script is allowed to run. If this is
reached, the script returns a fatal error. The default limit is 30
seconds or, if it exists, the max_execution_time value defined in the
php.ini.
Then reading up on max_execution_time this comes up:
This sets the maximum time in seconds a script is allowed to run
before it is terminated by the parser. This helps prevent poorly
written scripts from tying up the server. The default setting is 30.
When running PHP from the command line the default setting is 0.
But then, the magic 300 number shows up:
Your web server can have other timeout configurations that may also
interrupt PHP execution. Apache has a Timeout directive and IIS has a
CGI timeout function. Both default to 300 seconds. See your web server
documentation for specific details.
So now you know where the 300 comes from. But doing ini_set('max_execution_time', 0); should let you run the script without a timeout.
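If it is the web server cutting you off, the fix lives in its own config; for Apache that is the Timeout directive in httpd.conf (a sketch, with an arbitrary value):
Timeout 600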
And a final bit of info if none of that works: look into max_input_time:
This sets the maximum time in seconds a script is allowed to parse
input data, like POST and GET. Timing begins at the moment PHP is
invoked at the server and ends when execution begins.
While max_input_time might not seem to be related, in some versions of PHP, there is a bug where max_input_time and max_execution_time are directly connected.
I have a site developed in PHP with a function that downloads a zip file, extracts it, parses the CSV inside, and inserts the rows into a database.
This script takes a very long time because the CSV file is large (many MB, depending on the content).
I have already tried with this:
ini_set('max_execution_time', 0);
ini_set('memory_limit', '500000M'); // the directive is memory_limit; 'memory_size' is not a real setting
set_time_limit(0);
ignore_user_abort(1);
but it isn't working; after a few minutes the page times out and the script is stopped.
The configuration of my server is:
max_input_time -1
max_execution_time 0
memory_limit 512M
safe_mode Off
How can I get around this problem?
I have seen many questions about it, but none of the answers work for me.
My PHP script executes a program on my server (IIS 7.5).
It takes about 10 minutes to run, but I get the error below in the browser.
How do I resolve this?
Error:
HTTP Error 500.0 - Internal Server Error
C:\php\php-cgi.exe - The FastCGI process exceeded configured request timeout
Module FastCgiModule
Notification ExecuteRequestHandler
Handler FastCGI
Error Code 0x80070102
php.ini settings:
fastcgi.impersonate = 1
fastcgi.logging = 0
cgi.fix_pathinfo=1
cgi.force_redirect = 0
max_execution_time = 0
upload_max_filesize = 20M
memory_limit = 128M
post_max_size = 30M
C:\Windows\System32\inetsrv\config\applicationHost.config settings for FastCGI:
<fastCgi>
<application
fullPath="C:\php\php-cgi.exe" activityTimeout = "3600" requestTimeout = "300" />
</fastCgi>
This is a quick explanation of what is going on. When you use a CGI/FastCGI configuration for PHP, the web server (in this case IIS) routes requests that require PHP processing to the PHP process, which runs separately from the web server.
Generally, to prevent connections from getting stuck open and waiting (if the PHP process happens to crash), the web server will only wait a set amount of time for the PHP process to return a result (usually 30-60 seconds).
In your configuration you have this:
requestTimeout = "300"
300 seconds = 5 minutes. IIS cancels the request because yours takes 10 minutes to complete. The simple fix: increase the timeout to 600 or greater.
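For example, a sketch of the same applicationHost.config entry with the timeout raised (660 seconds leaves a little headroom over the 10-minute run):
<fastCgi>
    <application fullPath="C:\php\php-cgi.exe" activityTimeout="3600" requestTimeout="660" />
</fastCgi>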
Now, running a script for 10 minutes with an http request is not a good design pattern. Generally, http works best with short lived requests. The reason is that timeouts can exist in any part of the process (server, proxy, or client) and the script could be accidentally interrupted.
So, when you have a web application that has a long running job like this, the best way to run it is via console or job queue.
There is one more setting I found from this source that helped me with the same problem I was having.
Copied and pasted:
Open Server Manager
At the server level (not the Default Web Site)
Double-click FastCGI Settings
Open the PHP.EXE entry listed there
Set "Monitor changes to file" to your php.ini
The activity timeout defaults to 60s - change it to 600s or whatever you need
This can be solved by adjusting the FastCGI configuration.
Go to "C:\Windows\System32\inetsrv\" and edit the "fcgiext.ini" file:
[PHP]
ExePath=C:\xampp\php\php-cgi.exe
MonitorChangesTo=C:\xampp\php\php.ini
ActivityTimeout=3600
IdleTimeout=3600
RequestTimeout=3600
Make sure to place ActivityTimeout, IdleTimeout and RequestTimeout inside the [PHP] section as shown above.
This question already has answers here:
Best way to manage long-running php script?
(16 answers)
Closed 9 years ago.
I have a PHP script that will surely take a very long time to process. Is it good to use set_time_limit(0)? I have seen cases where, even with the time limit set to 0, we get a 500 Internal Server Error. What is the best way to handle such a long-running script in PHP?
If you try to execute a long running script from the browser, you have a chance of running into all sorts of timeouts, from php, proxies, web servers and browsers. You may not be able to control all of them.
Any long-running script should be run from the command line. That way you take out all of the problems except for the PHP execution time limit, which is easy to override.
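For example, a sketch of kicking the script off from a shell instead of the browser (the paths are illustrative):
php -d max_execution_time=0 /path/to/long-script.php
nohup php /path/to/long-script.php > /tmp/long-script.log 2>&1 &
The first form overrides the execution limit for that run only; the second detaches the script so it keeps running after you close the terminal.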
If you have access to php.ini file then set max_execution_time in it like
max_execution_time = 300
or you can use .htaccess to handle it, like
php_value max_execution_time 300
A script that runs for more than 360 seconds (max_execution_time, I assume) will throw an Internal Server Error automatically, even when set_time_limit is applied.
You can have this in your page.
<?php
ini_set('max_input_time', '300');     // seconds allowed to parse input; use whatever value you need
ini_set('max_execution_time', '300'); // seconds the script may run; use whatever value you need
?>
or in php.ini
set max_input_time and max_execution_time to whatever you like.
or in .htaccess
php_value max_input_time 300
php_value max_execution_time 300