PHP has a function called set_time_limit() (and a config option, max_execution_time) that lets you specify a time limit after which the script exits. However, this does not take into account time spent on external calls, which means that if (for example) the PHP script makes a database call that does something dumb and runs for hours, the time limit will never trigger.
If I want to guarantee that my PHP script's execution time will not exceed a specific limit, is there some way to do this? I'm running the script directly with PHP (rather than via Apache or similar), so I'm considering writing a bash script to monitor ps and kill the process after a certain time. It occurs to me that I can't be the first person with this problem, though, so is there some already-built solution available?
Note: This question is NOT about the PHP error from exceeding the max execution time. This is about the case where PHP exceeds a time limit without triggering that error. This question is NOT just about limiting the time of a PHP script; it is about limiting the time allowed to a PHP script that makes external calls.
When running PHP from the command line, the default max_execution_time setting is 0 (no limit).
You can try something like the following to solve your problem.
$cmd = "(sleep 10 && kill -9 ".getmypid().") > /dev/null &";
exec($cmd); //issue a command to force kill this process in 10 seconds
$i=0;
while(1){
echo $i++."\n";
sleep(1); //simulating a query
}
Related
I have a Rails app which initiates a PHP script using:
php -f aa.php
The PHP script has this at the beginning:
set_time_limit(60); // 1 minute
ini_set('max_execution_time',60);
But the PHP script does not die, even after two minutes.
I verified using ps -ef | grep php and the process is still running.
Does it not die because the parent (the Rails app) is alive?
I have verified that safe_mode is off for PHP.
Okay, see this note from php.net: the script won't stop because of set_time_limit as long as there is, e.g., a DB query running.
NOTE:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
You might add a timeout check to your loop so that it stops after a minute or whatever time you wish, as sketched below.
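For instance, a minimal sketch of such an in-loop check (the 60-second limit and the run_query() stub are assumptions for illustration):

<?php
// Enforce a wall-clock limit from inside the loop.
function run_query() {
    sleep(1); // stands in for one unit of real work, e.g. a DB query
}
$start = time();
$limit = 60; // seconds of real (wall-clock) time allowed
while (true) {
    if (time() - $start >= $limit) {
        echo "Wall-clock limit reached, exiting.\n";
        break;
    }
    run_query();
}

Note that the check only fires between iterations; a single call that hangs will still overrun the limit, which is why the kill-based watchdog shown earlier is the more robust option.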
I have a php-cli script that is run by cron every 5 minutes. Because this interval is short, multiple processes end up running at the same time. That's not what I want, since this script has to write a numeric ID into a text file, incremented on each run. It happens that several writers hit the text file at the same time, and the value written ends up incorrect.
I have tried to use PHP's flock() function to block writing to the file while another process is writing to it, but it doesn't work.
$fw = fopen($path, 'r+');
if (flock($fw, LOCK_EX)) { // acquire an exclusive lock
    ftruncate($fw, 0);     // clear the old contents
    fwrite($fw, $latestid);
    fflush($fw);           // flush the write before releasing the lock
    flock($fw, LOCK_UN);   // release the lock
}
fclose($fw);
So I suppose the solution is to create a bash script that verifies whether an instance of this PHP script is already running and, if so, waits until it has finished. But I don't know how to do that. Any ideas?
The solution I'm using with a bash script is this:
exec 9>/path/to/lock/file
if ! flock -n 9 ; then
echo "another instance is running";
exit 1
fi
# this now runs under the lock until 9 is closed (it will be closed automatically when the script ends)
A file descriptor (9) is opened on /path/to/lock/file, and flock makes any newly started instance exit immediately if another instance of the script is already running.
How can I ensure that only one instance of a script is running at a time (mutual exclusion)?
I don't really understand how incrementing a counter every 5 minutes will result in multiple processes trying to write the counter file at the same time, but...
A much simpler approach is to use a lock-file mechanism similar to the one below:
<?php
$lock_filename = 'nobodyshouldincrementthecounterwhenthisfileishere';
if(file_exists($lock_filename)) {
return;
}
touch($lock_filename);
// your stuff...
unlink($lock_filename);
Being this simple, the approach will not deal with the situation where the script dies before it can remove the lock file, in which case the script would never run again until the file is removed manually.
More sophisticated approaches are also possible, as you suggest: e.g. fork the job into its own process, write the PID into a file, and then, before running the job, check whether that PID is still running.
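A minimal sketch of that PID-file check (the file path is an assumption, and posix_kill() requires the posix extension):

<?php
$pidfile = '/tmp/myjob.pid'; // assumed path
if (file_exists($pidfile)) {
    $pid = (int) file_get_contents($pidfile);
    // Signal 0 sends nothing; it only tests whether the process exists.
    if ($pid > 0 && posix_kill($pid, 0)) {
        exit; // the previous run is still alive, so bail out
    }
}
file_put_contents($pidfile, getmypid());
// ... increment the counter / do the real work here ...
unlink($pidfile);

Unlike the plain lock file above, a stale PID file left behind by a crash does not block future runs, because the process check fails and the script carries on.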
To prevent the next session of a program (such as the next cron job) from starting while the previous session is still running, I recommend a check for a running process of that program, either built into the program itself or external to it. Just execute this before starting your program:
ps -ef|grep <process_name>|grep -v grep|wc -l
and check whether the result is 0. Only in that case should your program be started.
I suppose you must guarantee the absence of third-party processes with a similar name (for this purpose, give your program a longer, unique name), and the name of your program must not contain the pattern "grep".
This works well in combination with the normal regular starting of your program via an entry in the cron table, handled by the cron daemon.
If your check is written as an external script, the crontab entry might look like:
<time_specification> <your_starter_script> <your_program> ...
Two important remarks: the exit code of your_starter_script must be 0 when it does not start your program, and it is better to completely prohibit the script from writing to stdout or stderr.
Such a starter is very short and a simple programming exercise, so I don't feel the need to provide its complete code.
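For illustration only, here is one possible minimal sketch of such a starter, written as a PHP CLI script (the job name and path are assumptions):

#!/usr/bin/env php
<?php
$name = 'my_unique_job.php'; // unique name; must not contain "grep"
$cmd  = "php /path/to/$name";
// Count running instances; "grep -v grep" also filters out the shell
// running this pipeline, since its command line contains "grep".
$running = (int) shell_exec("ps -ef | grep '$name' | grep -v grep | wc -l");
if ($running === 0) {
    // Start the job detached, silencing stdout/stderr as recommended above.
    exec("$cmd > /dev/null 2>&1 &");
}
exit(0); // always exit 0, even when nothing was started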
Instead of using cron to run your script every 5 minutes, how about using at to schedule your script to run again 5 minutes after it finishes? Near the end of your script, you can use shell_exec() to run an at command that schedules the script to run again in 5 minutes, like so:
echo "/path/to/script" | at now + 5 minutes
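From PHP, that could look like the following (the script path is a placeholder):

<?php
// Near the end of the script: re-schedule this same script to run
// again in 5 minutes, via at. The path is a placeholder.
shell_exec('echo "php /path/to/script.php" | at now + 5 minutes');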
Or, perhaps even simpler than my previous answer (using at to schedule the script to run again in 5 minutes): make your script a daemon by using a non-terminating loop, like so:
while (1) {
    // whatever your script does goes here....
    sleep(300); // wait 5 minutes
}
Then, you can do away with scheduling by way of cron or at altogether. Just simply run your script in the background from the command line, like so:
/path/to/your/script &
Or, add /path/to/your/script to /etc/rc.local to make your script start automatically when the machine boots.
I have an issue which stops my (slow) background process. I start this slow process from a PHP page with a button, as follows:
<form id="trial" method="post" action=""><input name="trial" value="Start!" type="submit">
<?php
set_time_limit(0);
if (isset($_POST['trial'])) {
system("/srv/www/cgi-bin/myscript.sh");
}
?>
At some point, after about 1.5 days, the process stops. I have modified php.ini and the Apache config file, inserting a very high number in the timeout directive, but it seems not to work, or there is some other process that is stopping myscript.sh. Do you have any suggestions?
thanks!
I'm assuming you have access to the server via SSH based on your post.
If the real goal is to get your script to run continuously, why not log in and
nohup myscript.sh &
Provided your script behaves, it will continue to run for as long as it needs to after you close the terminal.
Check the Logs
To determine why your script is failing, you'll definitely want to check /var/log/kern.log and /var/log/syslog. Look for any entries containing your script or any of its children. Your script may be getting killed off by the kernel (exceeding limits) or erroring out at runtime.
Running the script continuously can cause problems, so set up a cron job to run it every 30 minutes on your system:
set_time_limit(30);
system("/srv/www/cgi-bin/myscript.sh");
Cron setup:
*/30 * * * * php /path/to/your/php/file.php
I have a command that, when run directly on the command line, works as expected. It runs for over 30 seconds and does not throw any errors. When the same command is called through the PHP function exec() (contained in a script called by a cron job), it throws the following error:
Maximum execution time of 30 seconds exceeded
We have a number of servers, and I have run this command on a very similar server with the exact same dataset without any issues, so I'm happy there is no script-level issue. I'm becoming more inclined to think this is related to something at the server level, either in the PHP setup or the server setup in some way, but I'm really not sure where to look. For those that are interested, both servers have a max execution time of 30 seconds.
The command itself is called like this:
from command line as:
root#server>php -q /path/to/file.php
this works...
and via cron within a PHP file as:
exec("php -q /path/to/file.php");
This throws the max execution time error. It was always my understanding that there was no execution time limit when PHP is run from the command line.
I should point out that the script that is called calls a number of other scripts, and it is one of these that is erroring. Looking at my logs, the max execution time error actually occurs before 30 seconds have even elapsed! So, less than 30 seconds after being called, a script, run by a cron script, that appears to be running as CLI is throwing a max execution time error.
To check that the script was running as I expected (as CLI, with no max execution time), I performed the following check:
A PHP script containing this code:
// test.php
echo exec("php test2.php");
where test2.php contains:
echo ini_get('max_execution_time');
and this script is run like this:
root#server> php test.php
// returns 0
This proves that a script called in this way runs under CLI with a max execution time of 0, which just confirms my thinking. I really cannot see why this script is failing on max execution time!
It seems that your script takes too much time to execute. Try setting the time limit: http://php.net/manual/en/function.set-time-limit.php
or check this post:
Asynchronous shell exec in PHP
Does the command take over 30 seconds on the command line? Have you tried increasing the execution timeout in php.ini?
You can temporarily set the timeout by including this at the top of the script. This will not work when running in safe mode, as noted in the documentation for setting max_execution_time with ini_set().
<?php
ini_set('max_execution_time', 60); // Set to be longer than 60 seconds if needed
// Rest of script...
?>
One thing of note in the docs is this:
When running PHP from the command line the default setting is 0.
What does php -v | grep cli show, run both from the shell and via the exec command in the cron-loaded PHP file?
Does explicitly typing /usr/bin/php (modify as appropriate) make any difference?
I've actually found what the issue is (kind of). It seems to be a bug in PHP, reporting max_execution_time as exceeded when the error is actually with max_input_time, as described here.
I tried changing the exec call to php -d max_execution_time=0 -q /path/to/file.php and got the error "Maximum execution time of 0 seconds exceeded", which makes no sense. I changed the call to php -d max_input_time=0 -q /path/to/file.php and the code ran without erroring. Unfortunately, it's still running 10 minutes later. At least this proves that the issue is with max_input_time, though.
I'm surprised that no one above has actually timed the completed exec call. The problem is that exec(x) takes a much longer time than running x on the command line. I have a very complex Perl script (with 8 levels of internal recursion) that takes about 40 seconds to execute from the command line. Using exec inside a PHP script to call the same Perl program takes about 300 seconds, i.e. a factor of about 7x longer. This is such an unexpected effect that people aren't increasing their max execution time sufficiently to see their programs complete; as a result, they are mystified by the timeout. (BTW, I am running WAMP on a fast machine with nominally 8 CPUs, and the rest of my PHP program is essentially trivial, so the time difference must be entirely in the exec.)
Create a wrapper.sh file as below:
export DISPLAY=:0
xhost + 2>>/var/www/err.log
/usr/bin/php "/var/www/read_sms1.php" 2>>/var/www/err.log
and put it in cron as below:
bash /var/www/wrapper.sh
My read_sms1.php contains:
$ping_ex = exec("/usr/local/bin/gnokii --getsms SM 1 end ", $exec_result, $pr);
The above solution worked fine for me on Ubuntu 12.04.
I'm building a spider which will traverse various sites and mine them for data.
Since I need to get each page separately this could take a VERY long time (maybe 100 pages).
I've already set set_time_limit to 2 minutes per page, but it seems Apache will kill the script after 5 minutes regardless.
This isn't usually a problem, since it will run from cron or something similar which does not have this time limit. However, I would also like the admins to be able to start a fetch manually via an HTTP interface.
It is not important that Apache is kept alive for the full duration; I'm going to use AJAX to trigger a fetch and check back once in a while with AJAX.
My problem is how to start the fetch from within a PHP-script without the fetch being terminated when the script calling it dies.
Maybe I could use system('script.php &'), but I'm not sure it will do the trick.
Any other ideas?
$cmd = "php myscript.php $params > /dev/null 2>/dev/null &";
# when we call this particular command, the rest of the script
# will keep executing, not waiting for a response
shell_exec($cmd);
What this does is send all the STDOUT and STDERR to /dev/null, and your script keeps executing. Even if the 'parent' script finishes before myscript.php does, myscript.php will finish executing.
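Since the plan is to poll with AJAX, one simple pattern is to have myscript.php record its progress in a status file that a separate endpoint can read back; a sketch (the file path and page count are assumptions):

<?php
// Inside myscript.php: record progress so an AJAX endpoint can poll it.
$statusFile = '/tmp/spider_status.txt'; // assumed path
$totalPages = 100;                      // assumed page count
for ($page = 1; $page <= $totalPages; $page++) {
    // ... fetch and mine one page here ...
    file_put_contents($statusFile, "$page/$totalPages");
}
file_put_contents($statusFile, 'done');

The AJAX handler then only needs to echo the contents of that file.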
If you don't want to use exec, you can use a PHP built-in function!
ignore_user_abort(true);
This will tell the script to keep running even if the connection between the browser and the server is dropped ;)
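In practice this is usually paired with set_time_limit(0) at the top of the script, for example:

<?php
// Keep running after the browser disconnects, and lift PHP's own
// script time limit (the external-call caveats above still apply).
ignore_user_abort(true);
set_time_limit(0);
// ... long-running fetch here ...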