PHP Cron job execution time limits fail

I've got a CronJob scheduled to run every 7 minutes. The script runs a loop over a number of users and sends email via SMTP or does some API calls via curl.
Thus, most of the execution time is apparently spent outside the realm tracked by max_execution_time on Linux. This matters because I was experiencing hangs in the script that were always fixed by restarting it (I'm also looking for the cause of the hangs, but haven't been successful so far).
Because the script sometimes still ran for 30 minutes even with set_time_limit set to 6 minutes, I now check microtime(true) after each round of the loop and break out of it if it has been running for more than 6 minutes.
Still, the script sometimes runs for 37 minutes (even though I can see that emails, each of which corresponds to one round of the loop, are still going out).
The only trick I have left is pcntl_fork. I am reluctant to employ it because of platform dependence, and because I figured that a loop with microtime(true) should track time spent outside the process too; I'd like to understand why that isn't the case here.

The max_execution_time setting limits script execution time, but it does not count time spent in system calls. See the manpage.
You can also use set_time_limit() to properly set the maximum amount of time a script can run. This only changes the time limit for the script in scope. Have you tried this as well?
Another question that has been asked may help slightly; you can find it at this link.

I ended up going outside PHP to get the time. This is simple, it works, and it doesn't require me to learn to deal with the pcntl functions.
$time_comp = microtime(true); // save the start time

// Ask MySQL how many seconds have passed since $time_comp,
// so the measurement does not depend on PHP's own timer.
function check_time_against_mysql($dbh, $time_comp)
{
    $time = $dbh->prepare("SELECT UNIX_TIMESTAMP() - :created AS time_in_seconds");
    $time->bindValue(":created", $time_comp);
    $time->execute() or die("fail time");
    $time_passed = $time->fetch(PDO::FETCH_ASSOC);
    return floatval($time_passed['time_in_seconds']);
}
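For context, this is roughly how I call it from the cron loop; a minimal sketch, where the DSN, $users, and send_mail_for_user() are placeholders for my actual setup and per-user work:

$dbh = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN

$time_comp = microtime(true); // save the start time

foreach ($users as $user) {           // hypothetical list of users
    send_mail_for_user($user);        // hypothetical per-user SMTP/curl work

    // Break out once more than 6 minutes (360 s) have passed,
    // as measured by MySQL rather than by PHP's own timer.
    if (check_time_against_mysql($dbh, $time_comp) > 360) {
        break;
    }
}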

Related

Should I avoid a one hour sleep()?

I'm using the DigitalOcean API to create droplets when my web application needs the extra resources. Because of the way DigitalOcean charges (min. one hour increments), I'd like to keep a created server on for an hour so it's available for new tasks after the initial task is completed.
I'm thinking about formatting my script this way:
<?php
createDroplet($dropletid);
$time = time();
// run resource heavy task
sleep($time + 3599);
deleteDroplet($dropletid);
Is this the best way to achieve this?
It doesn't look like a good idea, but the code is so simple that nothing can compete with it. You would need to make sure your script can run at least that long.
Note that sleep() should not have $time in its argument. It sleeps for the given number of seconds, and $time holds a Unix timestamp, which is many, many seconds.
I am worried that the script might get interrupted, and you will never delete the droplet. Not much you can do about that, given this script.
Also, the sleep() itself might get interrupted, causing it to sleep much shorter than you want. A better sleep would be:
$remainingTime = 3590;
do {
    $remainingTime = sleep($remainingTime);
} while ($remainingTime > 0);
This will catch an interrupt of sleep(). (On error, sleep() returns FALSE, which also ends the loop, since FALSE is not greater than zero.) See the manual on sleep(): http://php.net/manual/en/function.sleep.php
Then there's the problem that you want to sleep for exactly 3599 seconds, so that you're only charged one hour. I wouldn't make it so close to one hour. You have to leave some time for DigitalOcean to execute stuff and log the time. I would start with 3590 seconds and make sure that always works.
Finally: What are the alternatives? Clearly this could be a cron job. How would that work? Suppose you execute a PHP script every minute, and you have a database entry that tells you which resource to allocate at a certain start time and deallocate at a certain expire time. Then that script could do this for you with an accuracy of about a minute, which should be enough. Even if the server crashes and restarts, as long as the database is intact and the script runs again, everything should go as planned. I know this is far more work to implement, but it is the better way to do it.
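As a rough illustration of that cron approach, the minute-by-minute script might look like the sketch below. The droplets table, its columns, and the DSN are assumptions; createDroplet() and deleteDroplet() are the functions from the question:

// Hypothetical schema: droplets(droplet_id, active, expires_at)
$dbh = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN

// Deallocate every droplet whose paid hour is about to end.
$rows = $dbh->query(
    "SELECT droplet_id FROM droplets WHERE active = 1 AND expires_at <= NOW()"
);
foreach ($rows as $row) {
    deleteDroplet($row['droplet_id']);
    $upd = $dbh->prepare("UPDATE droplets SET active = 0 WHERE droplet_id = :id");
    $upd->execute(array(':id' => $row['droplet_id']));
}

Allocation would work the same way, selecting rows whose start time has arrived and calling createDroplet().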

prevent php timeout in block of code

I wanted to know if there is a way to prevent a PHP timeout from occurring once a part of the code has started being processed.
Let me explain:
I have a script whose execution takes way too long for me to simply use
ini_set('max_execution_time', 0);
set_time_limit(0);
The code is built to allow it to time out and restart where it was, but I have two lines of code that need to be executed together for that to happen:
$email->save();
$this->_setDoneWebsiteId($user['id'], $websiteId);
Is there a way in PHP to tell it that it has to finish executing both lines, even if the timeout is reached?
I got an idea as I was writing this: I could use a timeout of 120 seconds, start a timer, and stop if there are fewer than 20 seconds left before the timeout. I just wanted to know if I was missing something.
Thank you for your inputs.
If your code is not synchronous and some task takes more than 100 seconds, you won't be able to check the execution time.
I see only one real HACK (be careful: test it with php -f in a console so that you are able to kill the process):
<?php
// Any preparations here
register_shutdown_function(function () {
    if (error_get_last()) { // there was a timeout-exceeded error
        // Call the rest of your system.
        // But note: you have no stack, no context, no valid previous frame - nothing!
    }
});
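If you want the fallback to fire only on a real timeout rather than on any fatal error, you can inspect the error text. A sketch, relying on the fact that PHP reports timeouts as an E_ERROR whose message begins with "Maximum execution time":

register_shutdown_function(function () {
    $error = error_get_last();
    if ($error !== null
        && $error['type'] === E_ERROR
        && strpos($error['message'], 'Maximum execution time') === 0) {
        // Timeout-specific cleanup goes here.
    }
});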
One thing you could do is use the date/time features to monitor your average execution time (it builds up with each iteration, assuming you have a loop).
If the average iteration time becomes longer than however much time you have left (counting the time already taken against your maximum execution time), you would trigger a restart and let the script pick up from where it left off.
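A sketch of that averaging idea follows; $items and process_item() are hypothetical stand-ins for the loop's real work:

$limit     = (int) ini_get('max_execution_time'); // e.g. 120
$start     = microtime(true);
$completed = 0;

foreach ($items as $item) {
    $elapsed = microtime(true) - $start;
    $average = $completed > 0 ? $elapsed / $completed : 0;

    // If one more average-length iteration would not fit in the
    // remaining budget, stop cleanly so the next run can resume.
    if ($limit > 0 && $elapsed + $average > $limit - 5) {
        break;
    }

    process_item($item);
    $completed++;
}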
However, if you are experiencing timeouts, you might want to look at ways to make your code more efficient.
No, you can't abort the timeout handler, but I'd say that 20 seconds is quite a lot of time if you're not parsing something huge. However, you can do the following:
Get the time at the start of execution ($start = $_SERVER['REQUEST_TIME'], or just $start = microtime(true); at the beginning of your controller).
Assert that the execution time is less than 100 seconds before running $email->save(), and halt/skip the code if necessary. This is as easy as if (microtime(true) - $start < 100) { $email->save()... }. You would probably like more abstraction on this, however; see the sketch after this list.
(If necessary) check the execution time again after both methods have run and halt execution if it has timed out.
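That abstraction could be as small as a deadline helper; a minimal sketch under the question's 120-second budget:

class Deadline
{
    private $start;
    private $budget;

    public function __construct($budgetSeconds)
    {
        $this->start  = microtime(true);
        $this->budget = $budgetSeconds;
    }

    // True while a step expected to take $needed seconds still fits.
    public function allows($needed)
    {
        return (microtime(true) - $this->start) + $needed < $this->budget;
    }
}

$deadline = new Deadline(100);
if ($deadline->allows(20)) { // both calls are expected to fit in ~20 s
    $email->save();          // the two lines from the question (controller context)
    $this->_setDoneWebsiteId($user['id'], $websiteId);
}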
This will require time_limit to be set to a big value or even turned off, plus tedious work to prevent overly long execution. In most cases the timeout is your friend: it tells you that your work is taking too much time and that you should rethink your architecture and inner processes. If you're running out of 120 seconds, you'll probably want to put that work in a daemon.
Thank you for your input; as I thought, the timer solution is the best way.
What I ended up doing was the following. This is not the actual code, as it's too long to make a good answer, but it gives the general idea:
ini_set('max_execution_time', 180);
set_time_limit(180);
$startTime = date('U'); // the current Unix timestamp

while (true) {
    // gather all the data I need for the email

    // I know it's overkill to break the loop with 1 min remaining,
    // but I really don't want to take chances.
    if (date('U') < ($startTime + 120)) {
        $email->save();
        $this->_setDoneWebsiteId($user['id'], $websiteId);
    } else {
        return false;
    }
}
I could not use the idea of measuring the average time of each cycle, as it varies too much.
I could have made the code more efficient, but the number of cycles is based on the number of users and websites in the framework. It should grow big enough to need multiple runs to complete anyway.
I'll have to do some research to understand register_shutdown_function, but I will look into it.
Again Thank you!

How do I detect that a PHP CLI script is in a Hung state

I am using supervisor (http://supervisord.org/) to daemonize a fairly standard PHP script. The script is structured something like:
while (1) {
    // Do a SQL select
    // for any matching rows, do something
    // if I have been running for longer than 60 mins, exit
}
Today, this script (which has been fairly stable for some time now) hung. It did not crash (i.e., issue SIGHUP or SIGTERM signals), which would have alerted supervisord to restart the process. It did not encounter any errors in its processing, which would have either been caught by the script or at least triggered a fatal error and an exit. Instead of these "catchable" scenarios, it just sat there. We do have a cron job set up to run every hour to restart the script through the supervisorctl hook, because it seems to be generally accepted that PHP scripts leak memory and would do well to be restarted when running long. The script resumed operations normally after that restart.
My question: how can I detect that this script has hung? I can't even begin to diagnose or troubleshoot the problem of why it hangs if I am not somehow alerted to that state. I am looking for either a software solution to this, or some approach I can take to author a solution myself (in PHP, Python, Perl, or shell).
The script is written in PHP 5.2.6 and runs on an up-to-date RHEL 5 server.
Please let me know if I can share any additional information if it will help with a more awesome solution.
Thank you!
Shaheeb R.
Since this is a case where the script is hanging, PHP may not process any additional code that could detect the hang. For this reason, I suggest modifying the script to keep a log. This would let anything outside the main script know it is still running, and with some well-placed updates it can also help pinpoint where things have gone awry.
The log can be written to a file or database and should contain at least an indicator of the script's status, such as a last-modified date. If the script is not constantly running, something should also indicate whether it is running or has stopped. In the example you gave, the log writing would occur within the while loop at least once, possibly more. It costs time/resources to open file pointers or DB connections, so I recommend logging only what is needed. (Note: if using the text-file approach, the file would need to be closed right after each write.)
Example:
while (1) {
    write_log('Running SQL select');
    // Do a SQL select
    write_log('Results retrieved');
    // for any matching rows, do something
    // (check log) if I have been running for longer than 60 mins, exit
}

// Named write_log() because log() is already PHP's built-in logarithm function.
function write_log($msg) {
    // Write timestamp, $msg to log
}
A separate script would need to check the log and report any errors, which could be problematic if it's affected by what's making the main script hang, but I can't think of an alternative.
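For what it's worth, that separate checker can be very small if the log is a file: compare its modification time against a staleness threshold and restart through supervisorctl. A sketch, where the path and the supervisor program name are assumptions:

// watchdog.php - run from cron every few minutes
$logFile    = '/var/log/worker.log'; // assumed log location
$staleAfter = 300;                   // seconds without a write = presumed hung

if (!file_exists($logFile) || time() - filemtime($logFile) > $staleAfter) {
    // Restart the daemonized script via supervisor (program name assumed).
    exec('supervisorctl restart worker');
}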
In regards to memory: if you are not already using mysql_free_result, you should give it a try.
My suggestion would be similar to what #Shroder described, but taking it a little further. With each run you would create a log/DB entry; it would be timestamped and transaction-aware (you would update the entry to processing at the start of the run, and then, when done, sign off the entry with completed).
On the side you would run a simple cron check to see whether the current time exceeds your trigger (60 minutes, etc.), using the timestamp and transaction state. At that point you throw an alert, etc.
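In SQL terms, that cron check might look like the sketch below; the job_runs table, its columns, the DSN, and the alert address are all assumptions:

// Hypothetical schema: job_runs(id, state, updated_at)
$dbh = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$row = $dbh->query(
    "SELECT state, UNIX_TIMESTAMP() - UNIX_TIMESTAMP(updated_at) AS age
       FROM job_runs ORDER BY id DESC LIMIT 1"
)->fetch(PDO::FETCH_ASSOC);

// A run stuck in 'processing' past the 60-minute trigger means it hung.
if ($row && $row['state'] === 'processing' && $row['age'] > 3600) {
    mail('ops@example.com', 'Worker appears hung',
         'Last heartbeat was ' . $row['age'] . ' seconds ago.');
}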
It's quite simple! Just calculate the difference in time from the start of the loop to the current execution point.
$starttime = microtime(true);

while (1)
{
    // Do your stuff here
    // More SQL, whatever you need

    // Put this at the end of the loop
    $curtime   = microtime(true);
    $timetaken = $curtime - $starttime;

    if ($timetaken > (60 * 60))
    {
        break;
    }
}
microtime(true) returns the seconds since the Unix epoch, so if we subtract the start time from the current time, we get the elapsed time and exit the loop if it's over 60*60 seconds.

Repeat php script after time out

My PHP script creates thumbnails of images. Sometimes, when it handles a lot of images, the script has to run for a long time and is cut off after 60 seconds because of the time limit on my server.
Can I tell the script to stop after 59 seconds and then repeat itself?
I need some ideas.
Edit:
I don't think my web hosting allows me to change max_execution_time
I can't believe this is my answer..
loopMe.php:
<meta http-equiv="refresh" content="65;url=loopMe.php">
<?php
/* code that is timing out here */
?>
Your best bet, though, may be to look at set_time_limit(), as others have suggested.
Important note: this loop will never end on its own. You could, however, have a session variable, say $_SESSION['stopLoop'], to flag when the script successfully finishes. Before printing the meta-refresh line, you can check its value.
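Putting the flag check together, loopMe.php might be structured like this sketch, where create_remaining_thumbnails() is a hypothetical stand-in for the timing-out code (returning true once nothing is left to do):

<?php
session_start();

// Keep refreshing only while the work is unfinished.
if (empty($_SESSION['stopLoop'])) {
    echo '<meta http-equiv="refresh" content="65;url=loopMe.php">';
}

if (create_remaining_thumbnails()) {
    $_SESSION['stopLoop'] = true; // done; the next load won't refresh
}
?>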
If you don't want to modify your code, use set_time_limit(0), which removes the time limit entirely.
set_time_limit(0)
http://php.net/manual/en/function.set-time-limit.php
I wouldn't recommend using it, though: if your system grows, processing could take a very long time. It is better to recode things so that the job runs only when your server has low traffic, and to control how much time it spends processing data.
For instance, you could store the data you need to process in a queue and process as much of it as you can in a time window once per night.
set_time_limit(0);
Yeah, as suggested above, use set_time_limit. It will increase the script execution timeout.
If you are in a loop processing multiple files, then call set_time_limit each time before you process a file, set to the expected duration; i.e., if a file is assumed to take at most 60s to process, use set_time_limit(60) before processing it.
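In loop form, that advice might look like this sketch ($files and process_file() are hypothetical); each set_time_limit() call restarts the timeout counter for the next file:

foreach ($files as $file) {
    set_time_limit(60);  // allow up to 60 s for this one file
    process_file($file); // hypothetical per-file work
}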
If your server is not running PHP in safe mode, you can use set_time_limit($time);, where $time is the time you need to execute the script; for example, set_time_limit(240); sets the timeout to 4 minutes.
Or, for example, you can find out in your script how much time has passed and, before the timeout expires, add some more seconds (set_time_limit(10);) until the script finishes. (Each call to set_time_limit() restarts the timeout counter, so repeated calls effectively add up.)
This way you can calculate the elapsed time and, after each image, compare it against the server timeout.
You can also modify your script to log the images that have already been processed and, in case of a timeout, carry on with the task from the same point, as sketched below.
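That resume idea could be sketched with a plain text log of finished images; the path and the helpers are assumptions:

$doneFile  = '/tmp/thumbs-done.log'; // assumed log path
$processed = file_exists($doneFile)
    ? array_flip(file($doneFile, FILE_IGNORE_NEW_LINES))
    : array();

foreach ($images as $image) {        // hypothetical list of source images
    if (isset($processed[$image])) {
        continue;                    // already done on a previous run
    }
    create_thumbnail($image);        // hypothetical thumbnail helper
    file_put_contents($doneFile, $image . "\n", FILE_APPEND);
}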
You can run your script from a cron job on your server (if it allows you to), to make it run periodically. Take a look at this article.

Avoid PHP execution time limit

I need to create a script in PHP which performs permutations of numbers. But PHP has an execution time limit set to 60 seconds. How can I run the script so that, if it needs to run for more than 60 seconds, it is not interrupted by the server? I know I can change the maximum execution time limit in PHP, but I want to hear another approach that does not require knowing the execution time of the script in advance.
A friend suggested that I sign in and out of the server frequently, but I have no idea how to do this.
Any advice is welcome; example code would be useful.
Thanks.
First I need to enter a number, let's say 25. After this the script is launched and it needs to do the following: for every number <= 25 it will create a file with the numbers generated at the current stage; for the next number it will open the previously created file and create another file based on the lines of the opened file, and so on. Because this takes too long, I need to avoid the script being interrupted by the server.
#emanuel:
I guess when your friend suggested that you "sign in and log out frequently from the server", he/she must have meant "split your script's computation into x pieces of work and run them separately".
For example, you can execute this script 150 times to compute 150! (a factorial) and show the result:
// script name: calc.php
<?php
session_start();
if (!isset($_SESSION['times'])) {
    $_SESSION['times']  = 1;
    $_SESSION['result'] = 1; // must start at 1, not 0, or the product stays 0
    header('Location: calc.php');
} elseif ($_SESSION['times'] < 150) {
    $_SESSION['times']++;
    $_SESSION['result'] = $_SESSION['result'] * $_SESSION['times'];
    header('Location: calc.php');
} elseif ($_SESSION['times'] == 150) {
    echo "The Result is: " . $_SESSION['result'];
    die();
}
?>
BTW (#Davmuz), you can only use the set_time_limit() function on Apache servers; it's not a valid function on Microsoft IIS servers.
set_time_limit(0)
You could try putting the calls you want to make in a queue, which you serialize to a file (or a memory cache?) when an operation is done. Then you could use a cron daemon to execute this queue every sixty seconds, so it keeps doing the work until it finishes the task.
The drawbacks of this approach are problems with adding to the queue, with file locking and such; and if you need the results immediately, this can prove troublesome. If you are adding stuff to a DB, it might work out. Also, it is not very efficient.
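A minimal sketch of that idea, with the queue serialized to a locked file and drained by a cron run each minute; the file path and run_job() are assumptions:

// queue-worker.php - run from cron every minute
$queueFile = '/var/tmp/jobs.queue'; // assumed queue location
$fp = fopen($queueFile, 'c+');
if ($fp === false || !flock($fp, LOCK_EX)) {
    exit; // another run is still busy
}

$raw   = stream_get_contents($fp);
$queue = $raw === '' ? array() : unserialize($raw);
if (!is_array($queue)) {
    $queue = array(); // corrupt or empty queue file
}

$deadline = time() + 55; // stay under the 60 s window
while (!empty($queue) && time() < $deadline) {
    $job = array_shift($queue);
    run_job($job); // hypothetical job runner
}

// Write back whatever is left for the next cron run.
ftruncate($fp, 0);
rewind($fp);
fwrite($fp, serialize($queue));
flock($fp, LOCK_UN);
fclose($fp);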
Use set_time_limit(0) but you have to disable the safe_mode:
http://php.net/manual/en/function.set-time-limit.php
I suggest using a fixed limit (set_time_limit(300)) so that, if there is a problem in the script (an endless loop or a memory leak), it cannot become a source of further problems.
The web server, such as Apache, also has a maximum time limit of 300 seconds, so you have to change that as well. If you want to build a Comet application, it may be better to choose a web server other than Apache that can handle long request times.
If you need a long execution time for a heavy algorithm, you can also implement parallel processing: http://www.google.com/#q=php+parallel+processing
Or store the input data and compute it with another external script run from cron or whatever else.
