PHP set_time_limit for a long-running script [duplicate]

This question already has answers here:
Best way to manage long-running php script?
(16 answers)
Closed 9 years ago.
I have a PHP script that will surely take a very long time to process. Is it good practice to use set_time_limit(0)? I have seen cases where, even when we set the time limit to 0, we still get a 500 Internal Server Error. What is the best way to handle such a long-running script in PHP?

If you try to execute a long-running script from the browser, you have a chance of running into all sorts of timeouts: from PHP itself, proxies, web servers, and browsers. You may not be able to control all of them.
Any long-running script should be run from the command line. That way you take out all of the problems except for the PHP execution time limit, which is easy to override.
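A minimal sketch of that pattern, assuming a hypothetical worker.php (the actual work is a placeholder):
<?php
// worker.php -- run from a shell or cron with: php worker.php
// Under the CLI SAPI, max_execution_time already defaults to 0,
// but setting it explicitly documents the intent.
set_time_limit(0);

// ... the long-running work goes here, ideally in a loop that
// processes one item per iteration and logs progress ...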

If you have access to the php.ini file, then set max_execution_time in it, like
max_execution_time = 300
or you can use .htaccess to handle it, like
php_value max_execution_time 300

A script that runs for more than 360 seconds (presumably the max_execution_time value) will throw an Internal Server Error automatically, even when set_time_limit() is applied.
You can have this in your page.
<?php
ini_set('max_input_time', 300);     // time in seconds
ini_set('max_execution_time', 300); // time in seconds
?>
or in php.ini
set max_input_time and max_execution_time to whatever values you like.
or in .htaccess
php_value max_input_time 300
php_value max_execution_time 300

Related

PHP long running script alternative? [duplicate]

This question already has answers here:
Best way to manage long-running php script?
(16 answers)
Closed 5 years ago.
I'm trying to download a file from a remote server with file_put_contents. This script is called via AJAX. The problem I'm having is that the script times out when the file is large, e.g. 500 MB, and I get "504 Gateway Timeout - nginx".
download.php
$destination = "/home/mywebsite/public_html/wp-content/channels/videos/test.mp4";
$link = "http://www.example.com/videos/movie.mp4"; // 500 MB
// fopen() streams the remote file; file_put_contents() writes the stream to disk
$result = file_put_contents($destination, fopen($link, 'r'));
I'm using dedicated hosting. I've changed my php.ini and confirmed the values with phpinfo():
max_execution_time 7200
max_input_time 7200
max_input_vars 1000
memory_limit -1
output_buffering 4096
post_max_size 1200M
upload_max_filesize 1000M
This script keeps timing out. When I check the directory, the file has downloaded successfully, but the page times out, so I can't return any data via AJAX.
Is there another solution? How do I solve this?
You should also change the nginx FastCGI timeout values (e.g. fastcgi_read_timeout). The PHP script continues executing, but the connection between nginx and PHP times out.
Alternatively, make the download asynchronous: one process only fills a database table or RabbitMQ queue with download requests, and another process (maybe a cron job) consumes them, as sketched below.
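A minimal sketch of that queue approach, assuming a hypothetical downloads table with id, url, destination, and status columns (the DSN and credentials are placeholders):
<?php
// enqueue.php -- called from the AJAX request; returns immediately
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare(
    "INSERT INTO downloads (url, destination, status) VALUES (?, ?, 'pending')"
);
$stmt->execute([$link, $destination]);
echo json_encode(['queued' => true]);

<?php
// worker.php -- run by cron (e.g. every minute); does the slow download
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$job = $pdo->query("SELECT * FROM downloads WHERE status = 'pending' LIMIT 1")->fetch();
if ($job) {
    file_put_contents($job['destination'], fopen($job['url'], 'r'));
    $pdo->prepare("UPDATE downloads SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}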

WordPress times out at 30 seconds

I have a WordPress blog, and there is a plugin that does around 20,000 SQL inserts on a user request. I noticed that the process takes a long time, which is normal, but the request times out at 30 seconds.
I checked the PHP settings and noticed that max_execution_time was 30 seconds, so I increased it to 90, but the request keeps timing out at 30 seconds (I even logged what ini_get('max_execution_time') returns, and it says "30"). Then I checked whether there are any Apache directives that limit request time and found the "Timeout" directive ( http://httpd.apache.org/docs/2.2/mod/core.html#timeout ).
Its value was 60, and I increased it to 90 as well, but the problem persists: the request times out after 30 seconds, just as it did before I changed anything.
As a note: I restart the server after any modification.
By modifying your PHP Settings
That’s not easy, as you need to have access to your server, or a way to change your PHP settings. If you have access to your php.ini, you need to look for the max_execution_time variable and set it to the number of seconds you would like, 60 seconds for example.
max_execution_time = 60
If that doesn’t work, or you can’t access your php.ini, you can also try to set this variable using .htaccess (at the root of your WordPress install). You can add this line:
php_value max_execution_time 60
If you set the value to 0 (instead of 60, here), the process will be allowed to run forever. Don’t do this; you will run into much bigger issues that are extremely difficult to resolve.
By calling a function in PHP
This is generally not recommended, but if you know what you are doing, you can tell PHP that you would like the process to run for more time. For this, you could place this call in your theme, in functions.php for example:
set_time_limit(100);
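Since ini_get('max_execution_time') still reported 30 in the question, it is worth verifying that the call actually took effect for the request. A quick sanity check (where and how you log is up to you):
<?php
set_time_limit(100);
// If this still logs 30, something else (host config, another plugin)
// is resetting the limit after this call runs.
error_log('max_execution_time is now: ' . ini_get('max_execution_time'));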

How to prevent a timeout when running a time-consuming PHP script

I am using a PHP script that gets a lot of data from several sites and writes it to the server, producing files greater than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How do I adjust PHP's timeout so that the process runs until it is completed?
If you want to increase the maximum execution time for your scripts, then just change the value of the following setting in your php.ini file:
max_execution_time = 60
If you want more memory for your scripts, then change this:
memory_limit = 128M
One more thing: if you keep on processing the input (GET or POST), then you need to increase this as well:
max_input_time = 60
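For files in the 500 MB range, it also helps not to hold the whole payload in memory at once. A rough sketch (the URL and output path are placeholders, and allow_url_fopen is assumed to be enabled) that streams to disk in 1 MB chunks:
<?php
$in  = fopen('http://www.example.com/big-feed.dat', 'r'); // placeholder URL
$out = fopen('/path/to/output.dat', 'w');                 // placeholder path
while (!feof($in)) {
    // 1 MB per read keeps memory usage flat regardless of file size
    fwrite($out, fread($in, 1048576));
}
fclose($in);
fclose($out);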
You have to change some settings in php.ini to solve this problem.
There are several options that could be the cause.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?

How do I handle Fatal Error in PHP? [duplicate]

This question already has answers here:
ini_set, set_time_limit, (max_execution_time) - not working
(3 answers)
Closed 8 years ago.
I am inserting a huge number of data values from a CSV file via an HTML form.
I have used set_time_limit(0); so that the script runs until the whole operation is complete.
Fatal error: Maximum execution time of 300 seconds exceeded in
C:\xampp\htdocs\clytics\include\clytics.database.php on line 135
Now, I am trying to catch this fatal error.
I have used set_time_limit(0); so that the script runs until the whole
operation is complete.
It probably needs more memory as well. Basically, your code is a resource hog, and you need to tame the way it gobbles up resources.
But that is just a tangent on the overall architecture issues you might be facing.
Specific to this issue: is there something in your code that would override that set_time_limit(0) value?
Also, are you running this script via the command line or via PHP in Apache? The CLI php.ini config is 100% different from the Apache module's php.ini config.
For example, on Ubuntu the Apache PHP php.ini is here:
/etc/php5/apache2/php.ini
But the command line (CLI) php.ini is here:
/etc/php5/cli/php.ini
And if you want to brute-force your script to run without a time limit regardless of your config settings, you can add this to the top of your PHP file (note that ini option names are lowercase and case-sensitive, so 'MAX_EXECUTION_TIME' would silently fail):
ini_set('max_execution_time', -1);
If one reads up more on set_time_limit this comes up:
Set the number of seconds a script is allowed to run. If this is
reached, the script returns a fatal error. The default limit is 30
seconds or, if it exists, the max_execution_time value defined in the
php.ini.
Then reading up on max_execution_time this comes up:
This sets the maximum time in seconds a script is allowed to run
before it is terminated by the parser. This helps prevent poorly
written scripts from tying up the server. The default setting is 30.
When running PHP from the command line the default setting is 0.
But then, the magic 300 number shows up:
Your web server can have other timeout configurations that may also
interrupt PHP execution. Apache has a Timeout directive and IIS has a
CGI timeout function. Both default to 300 seconds. See your web server
documentation for specific details.
So now you know where the 300 comes from. But doing ini_set('max_execution_time', -1); should let you run the script without a timeout.
And a final bit of info if none of that somehow works: look into max_input_time:
This sets the maximum time in seconds a script is allowed to parse
input data, like POST and GET. Timing begins at the moment PHP is
invoked at the server and ends when execution begins.
While max_input_time might not seem related, in some versions of PHP there is a bug where max_input_time and max_execution_time are directly connected.
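Back to the title question, "How do I handle Fatal Error in PHP?": a timeout fatal cannot be caught with try/catch, but a shutdown function still runs after it, so you can at least log it and mark the import as incomplete. A minimal sketch:
<?php
register_shutdown_function(function () {
    $error = error_get_last();
    // E_ERROR covers fatals like "Maximum execution time of N seconds exceeded"
    if ($error !== null && $error['type'] === E_ERROR) {
        error_log('Fatal during CSV import: ' . $error['message']);
        // ... mark the import as incomplete, notify someone, etc. ...
    }
});

set_time_limit(0);
// ... CSV import work goes here ...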

PHP: Processing thousands of entries, script dies after certain amount

I'm calling an MLS service that responds with 4000+ records ... and I need to process each and every one of them, as well as insert all of the metadata per listing.
I'm able to get to about 135 listings (× 150 meta records each), and then the script apparently stops responding, or at least stops processing the rest of the data.
I've added the following to my .htaccess file:
php_value memory_limit 128M
But this doesn't seem to help. Do I need to process the data in chunks, or is there another way to ensure that the script will indeed finish?
You should probably enable display_errors and error_reporting to get a better analysis of why the script isn't processing (see the snippet after this answer).
However, you should also consider making sure the time limit isn't being hit by calling:
set_time_limit( 0 );
This will give you an unlimited time period. You can also just set it to something relatively high, like 600 (10 minutes).
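For the error-reporting suggestion above, a minimal debugging snippet for the top of the script (not something to leave enabled in production):
<?php
ini_set('display_errors', '1');
error_reporting(E_ALL);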
It isn't the memory; it's most likely the script execution time.
Try adding this to your .htaccess, then restart Apache:
php_value max_execution_time 259200
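If raising the limits is not enough, processing the 4000+ records in batches keeps any single request short. A rough sketch, where fetch_mls_listings() and insert_listing_with_meta() are hypothetical helpers standing in for your MLS call and insert logic:
<?php
$batchSize = 200;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$listings = fetch_mls_listings($offset, $batchSize); // hypothetical helper
foreach ($listings as $listing) {
    insert_listing_with_meta($listing);              // hypothetical helper
    set_time_limit(30); // restarts the timeout counter after each record
}

if (count($listings) === $batchSize) {
    // More to do: re-trigger with the next offset (redirect, JS, or cron).
    header('Location: import.php?offset=' . ($offset + $batchSize));
}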
