I'm running a PHP script from the command line. I've set max_execution_time to 2400, 24000, and 0 to no effect: at 20 minutes the script ends.
I've also tried passing the execution time on the command line via -d max_execution_time=#### - again, it ends after 20 minutes.
I thought when running it in CLI mode it defaults to unlimited execution time.
Anyone have any ideas why this keeps ending after 20 minutes even when I specifically set it for a longer period?
This is on a Windows 2012 server using PHP 5.3.
Nothing shows up in the PHP error log.
Try adding this line to your PHP script:
set_time_limit(0);
That sets the script's execution time limit to unlimited.
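For example, at the top of a long-running CLI script (a minimal sketch; run_long_job() is a hypothetical placeholder for the actual work):
<?php
set_time_limit(0);  // 0 = no execution time limit
run_long_job();     // hypothetical: the long-running work goes here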
I have a Rundeck job with a script workflow step that has some simple commands, like the following.
echo "Some text"
pwd
ls -la
echo "Starting script now . . ."
# This script can take a while, maybe 20 to 30 minutes
php ./some_long_running_script.php arg1 arg2
echo "Finished!"
The issue I'm having is that when the job runs, it kills this bash script after some time - usually after 8 to 10 minutes, but it's not consistent.
When it kills, the following is output to the log:
/var/lib/rundeck/tmp-jobs/372376-1775772-my-server-com-staging2-dispatch-script.tmp.sh: line 13: 15425 Killed
Where line 13 corresponds to the line number of the long running PHP script.
When I manually run the script on the same server node, I do not see this issue and the script runs to completion.
Does Rundeck kill jobs/scripts that it thinks are taking too long? The job level timeout is currently set to blank, which according to the documentation means no timeout.
I assume you didn't set a timeout in your job definition.
There is another timeout setting in the Rundeck SSH plugin. You can set it at different levels (node, project, Rundeck).
At the node level:
ssh-connection-timeout - connection timeout
ssh-command-timeout - command timeout
The default value is 0 (no timeout).
At the Rundeck level, the config file is framework.properties under the Rundeck base directory. See:
Specifying SSH Timeout options
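For example, the framework-level entries would look something like this (the property names are inferred from the node-level names above - verify them against the "Specifying SSH Timeout options" documentation for your Rundeck version):
framework.ssh-connection-timeout=0
framework.ssh-command-timeout=0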
Any idea why a simple infinite loop stops on the very first minute of every hour? For instance, 21:01, 22:01, 23:01, etc.
The server is running Ubuntu 12.04 and the PHP script is launched using command: "php -f test.php"
while (1 == 1) {
echo "test";
sleep(30);
}
Any help is appreciated.
UPDATE: It doesn't matter whether I run the script at 16:05 or 16:49, it will stop at 17:01, so the issue is not related to the set_time_limit value.
UPDATE: If the script has to sleep for an hour and then echo "test", the "test" will not be echoed. The script stops its work without finishing the loop itself.
UPDATE: It seems that I've found out what's killing the script. I set the PHP script to report all errors, and right before the *:01 time comes up, I get the text: Terminated. I've Googled that the script might be getting terminated by the OOM killer and, unfortunately, I don't have permission to change its settings on my current VPS. I'm switching to a VDS and will try to modify the OOM killer settings.
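To confirm it is the OOM killer, the kernel log can be checked; it usually records each kill (log locations vary by distro):
dmesg | grep -i "killed process"
grep -i "out of memory" /var/log/syslog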
I guess it's because of the maximum execution time limit in PHP.
If you want to repeat a script every day, hour, or minute, you should use a cron job to call your script as often as you need.
Cron jobs are available in popular web hosting panels like DirectAdmin and cPanel.
For example, you can use the following crontab entry to run your script every minute:
*/1 * * * * /usr/local/bin/php -q /home/user/domains/domain.com/public_html/script.php
If you want to run the script every second, you can set the maximum execution time to one minute, repeat the code every second inside the script, and use a cron job to call it every minute, as sketched below.
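A minimal sketch of that pattern (do_work() is a hypothetical placeholder for the actual task):
<?php
set_time_limit(60);       // cap this run at roughly one minute
for ($i = 0; $i < 60; $i++) {
    do_work();            // hypothetical: whatever needs to run each second
    sleep(1);
}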
If you are running the PHP file through Apache, i.e. as a URL, then you should do:
ini_set('max_execution_time', 0); // 0 = no limit
If you are running it through the CLI, this should generally not be happening.
UPDATE
Below are a few notes from php.net; I hope they help:
Warning This function has no effect when PHP is running in safe mode.
There is no workaround other than turning off safe mode or changing
the time limit in the php.ini.
Note: The set_time_limit() function and the configuration directive
max_execution_time only affect the execution time of the script
itself. Any time spent on activity that happens outside the execution
of the script such as system calls using system(), stream operations,
database queries, etc. is not included when determining the maximum
time that the script has been running. This is not true on Windows
where the measured time is real.
So it is possible that the PHP code itself doesn't actually break, but that some other connection/call is getting broken.
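A quick way to see this on Linux, where only CPU time is counted: the following sketch survives a 5-second limit because sleep() burns almost no CPU time (on Windows, where real time is measured, it would be killed):
<?php
set_time_limit(5);     // 5-second limit
sleep(10);             // 10 seconds of real time, almost no CPU time
echo "still alive\n";  // reached on Linux, not on Windows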
On my Ubuntu system there is a script in /etc/cron.d/php5:
# /etc/cron.d/php5: crontab fragment for php5
# This purges session files older than X, where X is defined in seconds
# as the largest value of session.gc_maxlifetime from all your php.ini
# files, or 24 minutes if not defined. See /usr/lib/php5/maxlifetime
# Look for and purge old sessions every 30 minutes
09,39 * * * * root [ -x /usr/lib/php5/maxlifetime ] && [ -x /usr/lib/php5/sessionclean ] && [ -d /var/lib/php5 ] && /usr/lib/php5/sessionclean /var/lib/php5 $(/usr/lib/php5/maxlifetime)
That script, /usr/lib/php5/maxlifetime, will check for the parameter
session.gc_maxlifetime
in any php.ini file located at /etc/php5/*/php.ini
Check for that value via
cd /etc/php5
grep -ri gc_maxlifetime .
If there is a value of 3600, you've got your culprit.
I have a PHP cron job that is failing after running for 29 minutes. The error in the log (/var/log/php_errors.log) is:
[01-Mar-2012 00:32:57 UTC] PHP Fatal error: Maximum execution time of 60 seconds exceeded in /path/file.php on line 2079
The crontab entry that triggers the cron is:
00 00 * * * /usr/bin/php /path/file.php
From my research I don't think this is related to the max_execution_time config setting because:
I know for a fact it ran for 29:18 mins (i.e. much more than the 60s in the error message).
From the PHP docs - When running PHP from the command line the default setting is 0.
Q: Why is the script terminating early?
Notes:
The script is very heavy, and does run many thousands of DB queries, but I was running top and the CPU load wasn't high.
The line from the error log is a mysql_query call:
$sql = "SELECT SUM(amount) FROM mytab WHERE mem = '$id' AND validto > '$now'";
$res = mysql_query($sql);
> php -v
PHP 5.3.10 (cli) (built: Feb 2 2012 17:34:38)
Copyright (c) 1997-2012 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies
with Suhosin v0.9.33, Copyright (c) 2007-2012, by SektionEins GmbH
> cat /etc/redhat-release
Red Hat Enterprise Linux Server release 5.7 (Tikanga)
Update - I found out why the script can run for 29 minutes of real time while PHP exits quoting a much lower execution time.
Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running.
(from the set_time_limit() docs, but also mentioned in the max_execution_time docs). This was relevant for me because most of the script's time went into long-running DB queries and payment API calls that wouldn't have been clocking up execution time.
Well, you can set a bigger value for the time limit, or you can make it unlimited using set_time_limit():
<?php set_time_limit(0); ?>
Actually, I use this too at the start of the script:
ignore_user_abort(1);
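Combined, a minimal preamble for a long-running CLI script looks like this:
<?php
set_time_limit(0);     // no execution time limit
ignore_user_abort(1);  // keep running even if the invoking connection drops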
If you get PHP Fatal error: Maximum execution time of 60 seconds exceeded, then some piece of running PHP code is calling set_time_limit(60) somewhere. PHP CLI mode may default to no time limit, but if any code path ever sets a time limit, it will be honored. The reason PHP ran for close to half an hour is that set_time_limit sets a limit on CPU time; if the process is I/O-bound or waiting on other processes, total CPU usage hits the 60-second mark much later on the real-time clock.
Try searching all your source code for set_time_limit. If you don't find anything, then add set_time_limit(0) at the start of the script to make sure the 60-second limit does not come from a locally modified configuration file. For example, on Ubuntu LTS the PHP CLI configuration is defined in /etc/php5/cli/php.ini.
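For example, from a shell (adjust the paths to your setup):
grep -rn "set_time_limit" /path/to/project
grep -rn "max_execution_time" /etc/php5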
Unfortunately I can't write a comment, so my question here would be: what happens if you run this manually? Will it also time out?
If it does not time out when you run it manually, I would suggest that you call a little shell script which actually starts a shell and runs "/usr/bin/php /path/file.php" within it.
00 00 * * * /usr/local/scripts/start_php_job.sh
File: /usr/local/scripts/start_php_job.sh
#!/bin/bash
date                        # log the start time
/usr/local/bin/php /path/to/script
date                        # log the end time
I ran a script for 2 hours via the command line on Mac OS X. It queried the database for a list of jobs, generated a PDF, and sent an e-mail, over and over again. It kept going and didn't fail. I thought it only had 30 seconds.
PHP CLI vs. PHP CGI
max_execution_time: CLI default is 0 (unlimited)
(for the record: CLI - command line interface)
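You can verify the CLI default directly from a shell:
php -r 'var_dump(ini_get("max_execution_time"));'
Under the CLI this prints string(1) "0".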
You can also set the max execution time in your script:
void set_time_limit (int $seconds)
Note that this works only when safe mode is off.
This function has no effect when PHP
is running in safe mode. There is no
workaround other than turning off safe
mode or changing the time limit in the
php.ini.
I have a command that, when run directly on the command line, works as expected: it runs for over 30 seconds and does not throw any errors. When the same command is called through the PHP function exec() (in a script called by a cron), it throws the following error:
Maximum execution time of 30 seconds exceeded
We have a number of servers, and I have run this command on a very similar server with the exact same dataset without any issues, so I'm happy there is no script-level issue. I'm becoming more inclined to think this is related to something at the server level - either in the PHP setup or the server setup in some way - but I'm really not sure where to look. For those that are interested, both servers have a max execution time of 30 seconds.
The command itself is called like this.
From the command line:
root#server>php -q /path/to/file.php
This works...
And via cron, within a PHP file:
exec("php -q /path/to/file.php");
This throws the max execution time error. It was always my understanding that there was no execution time limit when PHP is run from the command line.
I should point out that the script that is called calls a number of other scripts, and it is one of these scripts that is erroring. Looking at my logs, the max execution time error actually occurs before 30 seconds have even elapsed! So, less than 30 seconds after being called, a script - called by a cron script, and which appears to be running as CLI - is throwing a max execution error.
To check that the script is running as I expected (as CLI with no max execution time), I performed the following check:
A PHP script containing this code:
// test.php
echo exec("php test2.php");
where test2.php contains:
echo ini_get('max_execution_time');
and this script is run like this:
root#server> php test.php
// returns 0
This proves that a script called in this way runs under the CLI with a max execution time of 0, which confirms my assumptions - I really cannot see why this script is failing on max execution time!
It seems that your script takes too much time to execute. Try to set a time limit: http://php.net/manual/en/function.set-time-limit.php
Or check this post:
Asynchronous shell exec in PHP
Does the command take over 30 seconds on the command line? Have you tried increasing the execution timeout in php.ini?
You can temporarily set the timeout by including this at the top of the script. This will not work when running in safe mode, as is specified in the documentation for setting max_execution_time with ini_set().
<?php
ini_set('max_execution_time', 60); // set higher than 60 seconds if needed

// Rest of script...
?>
One thing of note in the docs is this:
When running PHP from the command line
the default setting is 0.
What does php -v | grep cli show when run from the shell, and from the exec command in the cron-loaded PHP file?
Does explicitly typing /usr/bin/php (modify as appropriate) make any difference?
I've actually found what the issue is (kinda). It seems that it's maybe a bug in PHP, reporting max_execution_time as exceeded when the error is actually with max_input_time, as described here.
I tried changing the exec call to php -d max_execution_time=0 -q /path/to/file.php and I got the error "Maximum execution time of 0 seconds exceeded", which makes no sense. I changed the command to php -d max_input_time=0 -q /path/to/file.php and the code ran without erroring. Unfortunately, it's still running 10 minutes later. At least this proves that the issue is with max_input_time, though.
I'm surprised that no one above has actually timed the completed exec call. The problem is that exec(x) takes a much longer time than running x on the command line. I have a very complex Perl script (with 8 levels of internal recursion) that takes about 40 sec to execute from the command line. Using exec inside a PHP script to call the same Perl program takes about 300 sec, i.e. a factor of about 7X longer. This is such an unexpected effect that people aren't increasing their max execution time sufficiently to see their programs complete. As a result, they are mystified by the timeout. (BTW, I am running on WAMP on a fast machine with nominally 8 CPUs, and the rest of my PHP program is essentially trivial, so the time difference must be entirely in the exec.)
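A simple way to time the call yourself (the Perl script path is a placeholder):
<?php
$start = microtime(true);
exec('perl /path/to/script.pl', $output, $status);
printf("exec took %.1f s (exit code %d)\n", microtime(true) - $start, $status);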
Create a wrapper.sh file as below:
export DISPLAY=:0
xhost + 2>>/var/www/err.log
/usr/bin/php "/var/www/read_sms1.php" 2>>/var/www/err.log
and put it in cron as below:
bash /var/www/wrapper.sh
My read_sms1.php contains:
$ping_ex = exec("/usr/local/bin/gnokii --getsms SM 1 end ", $exec_result, $pr);
The above solution worked fine for me on Ubuntu 12.04.