I have a PHP script that is executed by requests from the application admins. It does a lot of work and takes at least 20 minutes (depending on the database size).
The Apache TimeOut directive is set to 300 (5 minutes), which closes the connection and returns a 500 status code when my PHP script takes longer than that to execute, even though the script itself eventually finishes.
Setting a long max_execution_time in the PHP ini for this script is useless:
<?php
// long script
ini_set("max_execution_time", 3600);// 1 hour
// Apache still responses with the same "Connection: close" header and 500 status code
And I don't want to raise the Apache TimeOut directive globally just for these couple of scripts, because then any request could take a very long time, which opens the door to DoS vulnerabilities. Is that right?
Is there any way, at the Apache level, to allow only this script to run longer?
Have you tried PHP's set_time_limit() function?
https://www.php.net/manual/en/function.set-time-limit.php
In addition to setting the initial execution time, the manual says that each call resets the time already spent to zero and restarts the counter with the limit provided.
So if you want to be sure, you can call set_time_limit(0) (0 == no limit) regularly throughout your script so that you never hit a limit (even though passing 0 already sets an unlimited limit).
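A minimal sketch of that approach; the loop and usleep() here are just stand-ins for the real batch work:
<?php
set_time_limit(0); // 0 == no limit

for ($chunk = 1; $chunk <= 10; $chunk++) {
    set_time_limit(0); // restart the counter before each chunk, just to be safe
    // ... do one chunk of the long-running work here ...
    usleep(200000);    // stand-in for real work (0.2 s)
}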
Related
We've inherited a platform with a cronjob that, every minute, curls a local PHP script three times with different parameters (curl -s -o --url https://localhost/myscript.php?option=XYZ -k). The script runs for about one minute, so it's possible for multiple instances with the same option to overlap for a short time. The script logs to a different file per option, and each log entry starts with the timestamp at which that script instance started, which acts as an instance identifier.
The script has this skeleton:
<?php
$option = 'XYZ';                                 // placeholder for the real option
$scriptId = time();                              // start timestamp doubles as instance id
$file = "log_$option.txt";
file_put_contents($file, "\n$scriptId: Start\n", FILE_APPEND);
session_start();
$expires = time() + 60;
file_put_contents($file, "\n$scriptId: Expires at $expires\n", FILE_APPEND);
while (time() < $expires) {
    file_put_contents($file, "\n$scriptId: Not expired at " . time() . "\n", FILE_APPEND);
    switch ($option) {
        case 'X':
            do_db_stuff();
            break;
        // ... other options ...
    }
    file_put_contents($file, "\n$scriptId: Will sleep at " . time() . "\n", FILE_APPEND);
    sleep(13);
    file_put_contents($file, "\n$scriptId: Woke up at " . time() . "\n", FILE_APPEND);
}
file_put_contents($file, "\n$scriptId: Finished at " . time() . "\n", FILE_APPEND);
Normally this script runs fine (even when instance A is still in its last sleep as instance B starts), but on occasion we have two issues that we can confirm with the logs:
sometimes it sleeps for less than 13 seconds (a variable amount of time, always less than 13);
sometimes the script stops (no more logging after the "Will sleep" one, and we can verify that no db stuff is being done). [Update on this in Edit 2]
We've looked into the possible causes but couldn't find any:
php max_execution_time is set to 240 seconds and the script never takes more than a minute and a half;
the sleep documentation says it is per session, but curl isn't using cookies, so each instance should get its own session (and if they did share one, it would always block, since we always execute three script instances, which it doesn't);
the hosting tech team says there are no errors in either the server error log or the PHP error log at the timestamps where these issues happen.
I can't reproduce the issues at will, but they happen at least once a day.
What I'd like to know is: what could be interfering with the sleep behaviour, and how can I detect or fix it?
Additional information:
linux system
mysql 5.5
apache
php 5.3
php max_execution_time set to 240
Edit 1: Just to clarify: we actually have 3 options, so the script writes to 3 log files, one for each option. At any given time there can be up to two instances per option running (consecutive instances of the same option only overlap for a short time).
Edit 2: As per @Jan's suggestion, I added logging of the sleep() return value. The script has already stopped once since adding that log:
[2016-01-05, 13:11:01] Will sleep at 2016-01-05, 13:11:29
[2016-01-05, 13:11:01] Woke up at 2016-01-05, 13:11:37 with sleep return 5
[2016-01-05, 13:11:01] Not expired at 2016-01-05, 13:11:37
[2016-01-05, 13:11:01] Will sleep at 2016-01-05, 13:11:37
[2016-01-05, 13:11:01] Woke up at 2016-01-05, 13:11:38 with sleep return 13
... no more log from instance [2016-01-05, 13:11:01] ...
[2016-01-05, 13:12:01] Start
According to the sleep documentation:
If the call was interrupted by a signal, sleep() returns a non-zero value. On Windows, this value will always be 192 (the value of the WAIT_IO_COMPLETION constant within the Windows API). On other platforms, the return value will be the number of seconds left to sleep.
So, according to the documentation and the log, it seems that the sleep is being cut short by an interrupting signal.
How can I find out which signal caused this (pcntl_signal?), where it came from, and whether there is any way to avoid it?
Edit 3: I've added code to handle signals with pcntl_signal (trying to register handlers for signals 1 through 255) and log them; the issue still happens but the signal log remains empty.
You can define signal handlers with pcntl_signal.
With those handlers you can log when an interruption happens, but AFAIK you can't detect where it comes from.
You can also use pcntl_alarm for your delayed jobs.
Check PHP Manual - PCNTL Alarm
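A minimal sketch of the logging idea, assuming the pcntl extension is available (note that pcntl is generally only usable from the CLI SAPI, not under Apache/mod_php, and the log file name here is made up):
<?php
declare(ticks = 1); // needed on PHP 5.x so the registered handlers actually get dispatched

$sigLog = 'signals.log'; // hypothetical log file
foreach (array(SIGHUP, SIGINT, SIGTERM, SIGALRM, SIGUSR1, SIGUSR2) as $signo) {
    pcntl_signal($signo, function ($signo) use ($sigLog) {
        file_put_contents($sigLog, date('c') . " caught signal $signo\n", FILE_APPEND);
    });
}

pcntl_alarm(13); // optional: have SIGALRM fire in 13 seconds instead of relying on sleep()
sleep(30);       // if a signal arrives, sleep() returns early with the seconds left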
Given your stack, the interrupting signal could come from Apache. Depending on how Apache communicates with PHP on your stack, there are several timeout options in the Apache configuration that could interfere with the script execution.
Since your cron calls curl on localhost, maybe it could call the PHP file directly instead, and so avoid tying up Apache processes to keep the request alive. You would then have to set the max_execution_time value in your dedicated CLI php.ini file.
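Something like this, assuming the cron line becomes e.g. * * * * * /usr/bin/php /path/to/myscript.php XYZ (paths are illustrative) and the script reads its option from $argv instead of the query string:
<?php
// Accept the option either from the CLI argument or, as before, from the query string.
$option = isset($argv[1]) ? $argv[1] : (isset($_GET['option']) ? $_GET['option'] : null);
// Under the CLI the CLI php.ini applies, where max_execution_time is typically 0 (unlimited).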
Scenario: a PHP script with some pre-code, a require 'xxxx.php'; and some post-code. I want to believe that, because I'm using require and not include, the script will stop at the require statement if xxxx.php times out. Is that correct?
Or asked in another way: is a timeout the same as the script throwing an error?
From the documentation:
max_execution_time integer
This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
The maximum execution time is not affected by system calls, stream operations etc. Please see the set_time_limit() function for more details.
You can not change this setting with ini_set() when running in safe mode. The only workaround is to turn off safe mode or by changing the time limit in the php.ini.
Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. Both default to 300 seconds. See your web server documentation for specific details.
Official Documentation for set_time_limit()
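To make that concrete, here is a minimal sketch (slow.php is a hypothetical file whose code runs past the limit; run it through a web SAPI, since the CLI has no limit). The timeout is a fatal error that ends the whole request wherever execution happens to be, so require versus include makes no difference here:
<?php
ini_set('max_execution_time', 2); // deliberately short limit

require 'slow.php';               // hypothetical file that loops for longer than 2 seconds

echo 'post-code';                 // never reached: the timeout is a fatal error that stops
                                  // the script inside the require, exactly as it would
                                  // have done with include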
First of all, the script takes some time to execute and it stops after 30 seconds; the error says the maximum execution time is about 30 seconds (I know we can change this parameter in the server configuration), but I don't know in advance how long it needs; it may be 1 hour or more.
So I want to know: is there a way to have no execution time limit in PHP?
Second question: how can I show the content in the browser as soon as it is ready and keep the execution going? Right now I always have to wait for the script to end before the content is displayed in the browser.
Next time you post a question, make sure to research it better. Both questions are well documented on stackoverflow and the internet.
now to your answer...
To let it run until finished regardless of the time it takes:
http://www.php.net/manual/en/function.set-time-limit.php
set_time_limit(0);
To display content right after it is generated:
http://www.php.net/manual/en/function.flush.php
ob_flush();
flush();
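A minimal sketch of how those two calls are typically used together (output buffering, compression, or proxies may still delay what the browser actually sees):
<?php
header('Content-Type: text/plain');

for ($i = 1; $i <= 5; $i++) {
    echo "chunk $i done\n";
    if (ob_get_level() > 0) {
        ob_flush();   // flush PHP's output buffer, if one is active
    }
    flush();          // ask the web server / SAPI to push the output to the client
    sleep(1);         // stand-in for real work
}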
A script that might run that long should be run from the console; browser timeouts could also interfere when running it via the browser.
If you use a console script, you could route the output into a database and query it from there.
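For example, a rough sketch of that idea (the table name, columns, and connection details are assumptions, not something from your setup):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$insert = $pdo->prepare(
    'INSERT INTO job_progress (step, message, created_at) VALUES (?, ?, NOW())'
);

foreach (array('export', 'transform', 'import') as $i => $step) {
    // ... do the real work for this step here ...
    $insert->execute(array($i + 1, "finished $step")); // the browser can poll this table
}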
You have this setting in php.ini. I am not sure you can check it from anywhere other than php.ini.
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 30
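For what it's worth, ini_get() can also read the effective value at runtime, which is sometimes easier than locating the right php.ini:
<?php
echo ini_get('max_execution_time'); // e.g. "30" under a web SAPI, "0" under the CLI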
You can't do that in php. The script runs until it finishes execution (normal, or error).
I have an upload form that uploads mp3s to my site. I have some intermittent issues with some users which I suspect to be slow upload connections...
But anyway the first line of code is set_time_limit(0); which did fix it for SOME users that had connections that were taking a while to upload, but some are still getting timed out and I have no idea why.
The error says the script has exceeded the execution limit of 60 seconds. The script has no loops, so it's not some kind of infinite loop.
The weird thing is that no matter what code is on the first line, it always says "error on line one, two, etc.", even when that line is set_time_limit(0);. I tried removing it, and the very first line of code is always reported as the error; it doesn't even give me a hint of why it can't execute the PHP page.
Only a few users are experiencing this issue; no one else seems to be affected. Could anyone throw out some ideas as to why this could be happening?
set_time_limit() only affects the actual execution of the PHP code on the page. You want to set the PHP directive max_input_time, which controls how long the script will accept input (like file uploads). The catch is that you need to set this in php.ini: if the default max_input_time is exceeded, execution never reaches the script that is attempting to change it with ini_set().
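For example, hypothetical php.ini values for slow uploads (adjust to your situation):
; max_input_time covers the time spent receiving the request body (the upload itself);
; max_execution_time only covers the PHP code that runs afterwards.
max_input_time = 600
max_execution_time = 60
upload_max_filesize = 20M
post_max_size = 25M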
Sure, a couple of things noted in the PHP Manual.
Make sure PHP is not running in safe mode; set_time_limit() has no effect when PHP is running in safe_mode.
Second, and this is where I assume your problem lies.....
Note: The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
So your stream may be the culprit.
Can you post a little of your upload script? Are you calling a separate file to handle the upload using headers?
Try ini_set('max_execution_time', 0); instead.
How can I rewrite this into a cron job that will run every day for longer than 30 seconds? Also, do I need to edit the .htaccess or php.ini file in the cron.php directory to say something? Over the browser it runs just fine for longer than 30 seconds; over the shell it runs just fine too. But as a cron task, it dies after 30 seconds. I'm on 1and1 shared hosting.
0 12 * * * php5 /this/is/the/file/cron.php
There are several things that could be terminating your script. One could be the maximum execution time set in the php.ini file. If that's the case, you can override it in your script with set_time_limit(0); where zero means no limit and any number greater than zero is the number of seconds to allow the script to run for before being terminated. It's important to note that this time does NOT include the time it takes for the browser to make the request, so file upload time wouldn't count here.
If you're in a shared hosting environment (like Dreamhost), they have process watchers that will kill off any PHP process after a set time limit. You cannot get around these; you would need to contact the hosting provider to see what you need to do to get permission to run the script for longer (for Dreamhost, they want you to be on their PS offering).
Use this syntax to start php:
php -c /path/to/another/php.ini /this/is/the/file/cron.php
Then you can specify a different timeout (or no timeout) in a different php.ini file.
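For example, with the crontab entry changed to 0 12 * * * php5 -c /path/to/another/php.ini /this/is/the/file/cron.php, that alternate php.ini only needs the overrides you care about (values are illustrative):
; /path/to/another/php.ini
max_execution_time = 0   ; 0 = no limit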
ini_set('max_execution_time', 600);
Add this to the top of your PHP file and it will run for 600 seconds. Anything more is not recommended, but you can have a go if you want.
You don't need to set a higher max_execution_time if you use PHP CLI:
CLI SAPI default value for "max_execution_time" is set to unlimited.
http://nl3.php.net/manual/en/features.commandline.differences.php
I would just use wget http://path.to.myscript.php
If it's dying after 30 seconds, you may need to set max_execution_time = 60 in your php.ini to allow the script to run longer than 30 seconds.
You could also use ini_set('max_execution_time', 60)
But as the manual page says, in some cases (i.e. running in safe mode) this will have no effect at all: http://uk.php.net/manual/en/info.configuration.php#ini.max-execution-time
It is also possible that the php.ini used for the command line has different max execution values than the one used for the browser; I have seen that sometimes.
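A quick way to compare the two is to run the same tiny script once from the command line and once through the browser:
<?php
echo 'php.ini: ' . php_ini_loaded_file() . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";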