Execution time in CLI and included files - PHP

I'm trying to run a PHP script from the CLI like this:
php -q /var/www/script.php
As far as I know, there is no max_execution_time when you run PHP from the CLI, but when I use functions from required/included files, after 5-10 minutes I get a fatal error:
PHP Fatal error: Maximum execution time of 60 seconds exceeded in /var/www/include.php on line 10
So max_execution_time does not apply to included files?
Is it possible to avoid this without adding set_time_limit(0) to every included file?

Perhaps one of the included files calls (for whatever reason) set_time_limit(60)? If that's the case, you could probably work around it by calling set_time_limit(0) after every include in your PHP CLI script... Or edit the files containing the set_time_limit(60) call, which might of course lead to unwanted side effects...
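A minimal sketch of that workaround, assuming the include path from the question (the second include is a hypothetical stand-in for any other required file):

<?php
// Reset the limit after each include, in case an included file
// called set_time_limit(60) somewhere inside.
require '/var/www/include.php';
set_time_limit(0); // 0 = no limit

require '/var/www/other_include.php'; // hypothetical second include
set_time_limit(0);

// ... rest of the long-running CLI work ...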

Try this?
php -d max_execution_time=5 script.php
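Passing 0 instead removes the limit for that run entirely:
php -d max_execution_time=0 /var/www/script.php
Note that -d only overrides the ini default for that invocation; a set_time_limit() call inside an included file will still replace it at runtime.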

Related

How to exit script if external call exceeds time limit

PHP has a function called set_time_limit (and a config option max_execution_time) which lets you specify a time limit that triggers the script to exit when reached. However, this does not take into account time spent on external calls. That means that if (for example) the PHP script makes a database call that does something dumb and runs for hours, the set time limit will not trigger.
If I want to guarantee that my PHP script's execution time will not exceed a specific time, is there some way to do this? I'm running the script directly in PHP (rather than via Apache or similar), so I'm considering writing a bash script to monitor ps and kill the process after a certain time. It occurs to me that I can't be the first person with this problem, so is there some already-built solution available?
Note: This question is NOT about the PHP error from exceeding its max execution time. This is about the case where PHP exceeds a time limit without triggering this error. This question is NOT just about limiting the time of a PHP script; it is about limiting the time allowed to a PHP script that makes external calls.
When running PHP from the command line, the default max_execution_time is 0 (no limit).
You can try something like this to solve your problem:
$cmd = "(sleep 10 && kill -9 " . getmypid() . ") > /dev/null &";
exec($cmd); // schedule a forced kill of this process in 10 seconds

$i = 0;
while (1) {
    echo $i++ . "\n";
    sleep(1); // simulating a long-running query
}
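One caveat on this approach: kill -9 sends SIGKILL, which gives the script no chance to clean up (close connections, flush buffers, remove temp files). Sending a plain kill (SIGTERM) first, and only escalating to -9 if the process survives, would let a shutdown handler run.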

Server terminating the process before completing full execution in PHP

I am trying to run a PHP file through a cron job. It starts running fine, but after a certain period the server terminates its execution.
Because of that I can't get my desired output. I am writing the output to a text file; after the cron starts, it stores some output in the text file, but the process is terminated before execution completes.
I also called the mail function at the beginning and the end of the file, but I only got the beginning message.
I set max_execution_time and memory_limit to unlimited and checked the settings via phpinfo().
Everything looks fine there, but the file never finishes executing. I am able to run the file through the browser, but it takes a very long time.
So I have to do it another way, like cron. If anyone can provide a better solution in this regard, I will be grateful.
Turn off safe_mode in php.ini and remove the execution time limit by adding
set_time_limit(0)
at the beginning of your script.
php set_time_limit function
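A minimal sketch of the top of such a cron script (the memory_limit line is an optional extra, since the question also mentions memory settings):

<?php
set_time_limit(0);             // 0 = no execution time limit
ini_set('memory_limit', '-1'); // optional: -1 = no memory limit

// ... long-running work ...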

limit execution time of shell_exec

I have an exe file which has the following code:
while (1)
    printf("hello\n");
I'm executing this exe through PHP using shell_exec:
$output = shell_exec('C:/Users/thekosmix/Desktop/hello.exe 2>&1');
echo $output;
Now the script executes for a very long time, until I kill the process from Task Manager, at which point it gives a fatal error:
(Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 133693440 bytes) in C:\xampp\htdocs\shell\index.php on line 7)
I want the script (or this function) to run for a given time duration and print whatever output is generated during that time, not a fatal error. set_time_limit() does not solve the problem either.
You can't do this from within PHP - set_time_limit() is only checked after each PHP statement is executed. Under Linux you'd use ulimit, but it looks like you're using Windows.
You'll need to either modify the hello.exe executable to build in a timeout, or write a wrapper that you call from PHP, which calls hello.exe and handles the timeout. In any language that can easily fork, this is trivial.
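One way to sketch such a wrapper in PHP itself is with proc_open, enforcing a wall-clock deadline and collecting whatever output arrived before it. The timeout value is arbitrary and the command path is the asker's; note that non-blocking pipes have historically been unreliable on Windows, so treat this as a starting point rather than a guaranteed solution:

<?php
$cmd     = 'C:/Users/thekosmix/Desktop/hello.exe';
$timeout = 5; // seconds of wall-clock time to allow

$proc = proc_open($cmd, [1 => ['pipe', 'w'], 2 => ['pipe', 'w']], $pipes);
stream_set_blocking($pipes[1], false); // don't block while the child runs

$output  = '';
$started = time();
while (proc_get_status($proc)['running']) {
    $output .= stream_get_contents($pipes[1]); // drain what's available
    if (time() - $started >= $timeout) {
        proc_terminate($proc); // deadline reached: stop the child
        break;
    }
    usleep(100000); // poll ten times per second
}
$output .= stream_get_contents($pipes[1]); // collect any remainder
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($proc);

echo $output;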
To sum it up: do the following if you use a Linux server:
shell_exec('ulimit -t 120 && ' . $path_to_program);
This terminates the program after two minutes of CPU usage, which can be a lot more than two minutes of real time, depending on how resource-hungry your program is.
You could alter the script like so:
while (shouldKeepRunning())
shouldKeepRunning() could, in its most crude form, check whether a certain file exists to determine if it should keep running. Then you just create or delete a file to shut the script down gracefully. On Linux you could use a signal handler for this, but not on Windows. Anyway, you get the point.
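A minimal sketch of the file-based approach (the marker path /tmp/keep_running is an arbitrary choice):

<?php
// Keep running only while a marker file exists; delete the file
// from another shell to stop the loop gracefully.
function shouldKeepRunning(): bool
{
    return file_exists('/tmp/keep_running');
}

touch('/tmp/keep_running'); // create the marker before starting

while (shouldKeepRunning()) {
    printf("hello\n");
    sleep(1);
}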
You could use ticks if the outer while loop isn't granular enough, without massive code modification: http://php.net/manual/en/control-structures.declare.php

PHP CLI script not timing out

We have a Node.js script that runs the following command:
/usr/local/bin/php -q /home/www/441.php {"id":"325241"}
This script does a lot of things; however, it does not seem to respect the time limit. The first line of the file is:
set_time_limit(1800);
Yet if we check what processes are running on the server (ps -aux | grep php), we see a lot of these commands that have been open since last week.
Any ideas on how we can clean this up?
I found the following comment on the PHP user guide for max_execution_time:
Keep in mind that for CLI SAPI max_execution_time is hardcoded to 0. So it seems to be changed by ini_set or set_time_limit but it isn't, actually. The only references I've found to this strange decision are deep in bugtracker (http://bugs.php.net/37306) and in php.ini (comments for 'max_execution_time' directive).
So it would seem that there's a bug in the CLI module that means max_execution_time is effectively ignored.
The commenter mentioned a page in the bug tracker about this at http://bugs.php.net/37306 but the tracker seems to be down.
set_time_limit only applies to the PHP part of the program. If a database query takes 5 hours to finish, those 5 hours are not counted by PHP, so they fall outside the scope of the set_time_limit limitation. That said, it seems weird that a PHP process is still running after a week if it is not calling another program that runs forever (and in that case, set_time_limit would not affect the calling script anyway).
Also, what does the -q flag do? I can't find it in man php, php --help, or PHP's command line options.
If you start the script in Node.js, why not kill it there too, after 1800 seconds?
var pid = startPHPProcess();
setTimeout(function() {
    killPHPProcess(pid);
}, 1800 * 1000); // setTimeout takes milliseconds, not seconds
(startPHPProcess and killPHPProcess are placeholders for however you spawn and kill the process, e.g. child_process.spawn and process.kill.)

Shell command works on command line but not in PHP exec

I have a command that, when run directly on the command line, works as expected: it runs for over 30 seconds and does not throw any errors. When the same command is called through a PHP script via the PHP function exec() (which is contained in a script called by a cron), it throws the following error:
Maximum execution time of 30 seconds exceeded
We have a number of servers, and I have run this command on a very similar server with the exact same dataset without any issues, so I'm satisfied there is no script-level issue. I'm becoming more inclined to think this is related to something at the server level - either in the PHP setup or the server setup in some way - but I'm really not sure where to look. For those that are interested, both servers have a max execution time of 30 seconds.
The command itself is called like this -
from the command line as:
root#server> php -q /path/to/file.php
This works...
and via cron within a PHP file as:
exec("php -q /path/to/file.php");
This throws the max execution time error. It was always my understanding that there is no execution time limit when PHP is run from the command line.
I should point out that the script that is called calls a number of other scripts, and it is one of those scripts that is erroring. Looking at my logs, the max execution time error actually occurs before 30 seconds have even elapsed! So, less than 30 seconds after being called, a script called by a cron script that appears to be running as CLI is throwing a max execution error.
To check that the script is running as I expected (as CLI with no max execution time), I performed the following check.
A PHP script containing this code:
<?php // test.php
echo exec("php test2.php");
where test2.php contains:
<?php // test2.php
echo ini_get('max_execution_time');
and this script is run like this:
root#server> php test.php
// returns 0
This proves a script called in this way runs under CLI with a max execution time of 0, which confirms my thinking. I really cannot see why this script is failing on max execution time!
It seems that your script takes too much time to execute; try setting a time limit:
http://php.net/manual/en/function.set-time-limit.php
or check this post:
Asynchronous shell exec in PHP
Does the command take over 30 seconds on the command line? Have you tried increasing the execution timeout in php.ini?
You can temporarily set the timeout by including this at the top of the script. This will not work when running in safe mode, as noted in the documentation for setting max_execution_time with ini_set().
<?php
ini_set('max_execution_time', 60); // Set to be longer than
                                   // 60 seconds if needed
// Rest of script...
?>
One thing of note in the docs is this:
When running PHP from the command line the default setting is 0.
What does php -v | grep cli show, run both from the shell and via the exec command in the cron-loaded PHP file?
Does explicitly typing /usr/bin/php (modify as appropriate) make any difference?
I've actually found what the issue is (kind of). It seems to be a bug where PHP reports max_execution_time as exceeded when the error is actually with max_input_time, as described here.
I tried changing the exec call to php -d max_execution_time=0 -q /path/to/file.php and I got the error "Maximum execution time of 0 seconds exceeded", which makes no sense. I changed the command to php -d max_input_time=0 -q /path/to/file.php and the code ran without erroring. Unfortunately, it's still running 10 minutes later. At least this proves that the issue is with max_input_time.
I'm surprised that no one above has actually timed the completed exec call. The problem is that exec(x) takes much longer than running x on the command line. I have a very complex Perl script (with 8 levels of internal recursion) that takes about 40 seconds to execute from the command line. Using exec inside a PHP script to call the same Perl program takes about 300 seconds, i.e., a factor of about 7x longer.
This is such an unexpected effect that people aren't increasing their max execution time enough to see their programs complete; as a result, they are mystified by the timeout. (BTW, I am running WAMP on a fast machine with nominally 8 CPUs, and the rest of my PHP program is essentially trivial, so the time difference must be entirely in the exec.)
Create a wrapper.sh file as below:
export DISPLAY=:0
xhost + 2>>/var/www/err.log
/usr/bin/php "/var/www/read_sms1.php" 2>>/var/www/err.log
and put it in cron as below:
bash /var/www/wrapper.sh
My read_sms1.php contains:
$ping_ex = exec("/usr/local/bin/gnokii --getsms SM 1 end ", $exec_result, $pr);
The above solution worked fine for me on Ubuntu 12.04.
