Every-minute cron job sometimes not running - PHP

I have several cron jobs executing PHP scripts. The PHP scripts sometimes do heavy work, such as updating hundreds of records at a time in a MySQL table.
The problem is that the job should run every minute, but it randomly misses runs and so does not execute every minute. Sometimes it executes every 4-6 minutes, then goes back to every minute, misses 2-3 more runs, and then is normal again.
I am on CentOS 6.5.
Please note that the PHP runs correctly and there is no problem whatsoever with the scripts themselves: whenever a run does happen, I get the expected results. There are also about 10 other similar scripts running at the same time (every minute, or every 5 minutes for the other scripts).
Job:
/usr/bin/php "/var/www/html/phpScriptToExecute.php" >> /var/www/html/log/phpScriptLog.log 2>&1
My take is that the problem may be too many scripts running concurrently and accessing the database at the same time.
One last piece of information: there are no errors in /var/log/cron or in phpScriptLog.log.

The reason could be that your cron job takes more than one minute to execute. Print out the start time and end time in the script to verify it, as in the sketch below.
If overlapping runs are prevented (for example by a lock in the script or in a wrapper), a long run will make the next minute's invocation look like a miss; note that plain cron by itself will start a new instance every minute even if the previous one is still running.
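A minimal sketch of that check, assuming the same log file as in the question:

<?php
// Log start and end timestamps so the log shows how long each run takes
// and which minutes were actually missed.
$log = '/var/www/html/log/phpScriptLog.log';
file_put_contents($log, 'start ' . date('Y-m-d H:i:s') . "\n", FILE_APPEND);

// ... the heavy work (e.g. the MySQL updates) goes here ...

file_put_contents($log, 'end   ' . date('Y-m-d H:i:s') . "\n", FILE_APPEND);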

My guess is it's caused by a PHP fatal error, but your PHP probably isn't configured to send error messages to stderr, which is why your phpScriptLog.log is empty. You could check your php.ini (or just use ini_set(), as in the sketch after this list) for the following:
display_errors: set it to On if you want errors written to the script's output (which your cron line redirects into the log)
log_errors: set it to On if you want error messages sent to a file
error_log: point it to the file you want the errors stored in
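A minimal sketch of setting these at the top of the script; the error_log path is an assumption (note that ini_set() cannot catch parse errors, which happen before it runs):

<?php
error_reporting(E_ALL);
ini_set('display_errors', '1');  // runtime errors go to output, which cron redirects to your log
ini_set('log_errors', '1');      // also write errors to a file
ini_set('error_log', '/var/www/html/log/php_errors.log'); // assumed path

// ... rest of the script ...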
Or, if you want a solution to avoid overlapping cron jobs, there are plenty here on SO.

Related

CRON script timing out on AWS Ubuntu -- php.ini files look good, where else should I check? (timeout)

I have a cron script which calls a PHP file every minute. That file checks to see if any actions need to be taken and, if so, takes those actions. When the execution time is long, however, the script doesn't always finish. I'm not seeing any error codes, and my built-in logging simply stops, since it only logs while the file is executing.
The thing is, the runtime at which it dies varies by a dozen seconds or so, so I'm not sure whether I'm hitting a 300-second timeout when the script sometimes seems to quit after only 280.
This is being run on an AWS Ubuntu server.
Is there a list of all the places I might check for errors or timeouts? The two php.ini files are both set to a half-hour, and my script doesn't run for anywhere near that long.
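One diagnostic sketch that may help locate the cause: a shutdown hook runs even after a fatal error or timeout and can record why the script stopped. The log path here is an assumption:

<?php
register_shutdown_function(function () {
    $err = error_get_last();
    $msg = $err ? print_r($err, true) : 'clean shutdown';
    // Append the reason (or lack of one) for every termination.
    file_put_contents('/tmp/cron_shutdown.log', date('c') . ' ' . $msg . "\n", FILE_APPEND);
});

// ... the rest of the script ...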

Use PHP and Windows Task Scheduler to try again on exit(1)

I use Windows Task Scheduler to start a PHP script, and it works perfectly fine. Basically: C:\php.exe -f C:\myscript.php
In my script, some work happens that sometimes makes me want to run the script again in 5 minutes.
I tried to implement this by changing the task's settings to restart every 5 minutes if the task fails, and having my PHP code exit(1). Task Scheduler seems to know that I exited with an error code of 1, but it does not run the script again.
Does anyone know how I can make Task Scheduler try again in 5 minutes when I signal it from my code somehow?
Not an answer to the question as phrased, but it might serve as a fallback if you can't get that working: make your job run every 5 minutes regardless, and track "last success"/"last failure" yourself, in a database or file.
Before doing anything else, the script checks the logged status: if there was a failure last time, it tries again (up to a limited number of tries, presumably); if there was a success last time, it exits immediately, unless it's time for the next job anyway (e.g. if the original schedule was daily, check whether $last_success is longer ago than 24 hours). A sketch follows.
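A minimal sketch of that bookkeeping, assuming a hypothetical status file and a daily schedule:

<?php
$statusFile = 'C:\\myscript.status';
$status = json_decode(@file_get_contents($statusFile), true) ?: array();

// Run if the last attempt failed, or if the daily slot has come around again.
// (A retry counter could be added here to cap the number of attempts.)
$lastSuccess = isset($status['last_success']) ? $status['last_success'] : 0;
$lastFailure = isset($status['last_failure']) ? $status['last_failure'] : 0;
$due   = (time() - $lastSuccess) > 24 * 60 * 60;
$retry = $lastFailure > $lastSuccess;

if (!$due && !$retry) {
    exit(0); // nothing to do this cycle
}

try {
    // ... the actual work goes here ...
    $status['last_success'] = time();
} catch (Exception $e) {
    $status['last_failure'] = time();
}
file_put_contents($statusFile, json_encode($status));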

Running multiple instances of cron at the same time

I have a set of entries in my database that schedule PHP scripts to run. A cron file runs every few minutes to check whether any scripts are due in the next few minutes and, if so, triggers them. What I am wondering is: if a script scheduled at 12:10 takes 5 minutes to execute, what happens when there is another request to run the same script at 12:12, while the first execution is not finished yet? Is the file locked until the execution is done, or what?
It is the same file being executed at different times.
I think I could do this with crontab, but I'd rather not.
They will have no problems running simultaneously, but bear in mind you could have issues if they both write to the same file or database table.
Cron will run the script twice, three times, or as many times as it is told to. If you only want one instance of this script running at a time, you can create a lock file and check whether it exists. Example:
if (file_exists('/tmp/script.lock')) {
    exit();
}
file_put_contents('/tmp/script.lock', '');
// execute code
unlink('/tmp/script.lock');
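A caveat with the file_exists() check: it is not atomic, and the lock file is left behind if the script crashes before unlink(). A sketch using flock(), which avoids both problems because the OS releases the lock when the process dies:

<?php
$fp = fopen('/tmp/script.lock', 'c'); // 'c' creates the file if it doesn't exist
if (!$fp || !flock($fp, LOCK_EX | LOCK_NB)) {
    exit; // another instance already holds the lock
}

// ... execute code ...

flock($fp, LOCK_UN);
fclose($fp);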

Heavy CRON Tasks

I have to run a pretty heavy PHP task once a week (a script that curls to various locations (websites, APIs), gathers and sorts data, and inserts it into a DB). The whole script takes about 10 to 15 minutes to run on my Mac (localhost); I'm guessing it'll run a bit faster on a server. Nevertheless, I'm currently looping through it with AJAX, so when each task is finished, the next one is launched. Now I need to run it weekly and automatically, so I don't think I can do it with AJAX anymore.
Do I just have to set php.ini to let a script run for 30 minutes, or is there a better way to do it?
When the script runs through a webserver, the gateway in front of PHP may kill it after a period in which no output has been generated, so writing data to STDOUT (e.g. to a logfile) can keep the script alive.
However, if you're running the script from the command line, max_execution_time defaults to zero anyway, and as already suggested, I'd start the script with a cron job instead of an AJAX request or similar methods. I actually do that for most of my PHP scripts performing administrative tasks, like synchronizing data across several databases.
php.ini has nothing to do with scheduling jobs; it simply defines PHP's startup settings. What you want is a cron job, as your title says.
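For example, a weekly crontab entry on the server might look like this (the schedule and paths are assumptions):

# every Sunday at 03:00
0 3 * * 0 /usr/bin/php /var/www/html/weekly_import.php >> /var/log/weekly_import.log 2>&1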
For OSX cron setup, see http://hintsforums.macworld.com/showthread.php?s=&threadid=39005

Long-running cron job via wget/curl?

I am working on cron jobs for my PHP app and planning to run them via wget/curl from cron.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours from crontab? Is it good practice to run such long jobs via cron and wget/curl? Does anyone have experience with this? I also have an email queue that needs to be processed every 10 seconds, but crontab is minute-level. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a webserver. Every webserver has a timeout period, so the request might time out.
Also, some hosting providers may stop processes that run beyond a certain number of minutes (basically done to control rogue threads).
So it is not advisable to schedule jobs using wget/curl if they take more than a few minutes.
Try scheduling them with an actual scheduler instead. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the PATH.
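For example, a crontab entry calling the CLI binary directly, with no webserver timeout in front of it (paths and schedule are assumptions):

# nightly at 02:00
0 2 * * * /usr/bin/php -f /var/www/html/longjob.php >> /var/log/longjob.log 2>&1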
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser is, if crontab is going to run the script every 24 hours, to set the timeout to 24 hours so that you don't get two copies running at the same time:
set_time_limit(24*60*60);
Crontab only allows minute-level execution because that's the most often you should really be launching a script: there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires the queue to be checked every 10 seconds, a better strategy might be a single long-running script that does the checking and uses sleep() to stop itself from hogging system resources; see the sketch below.
On a UNIX system such a script should really run as a daemon; have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
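A minimal sketch of such a long-running checker; the queue function is a hypothetical stand-in for your own logic:

<?php
set_time_limit(0); // never time out

function process_email_queue() {
    // ... fetch pending mails from the database and send them ...
}

while (true) {
    process_email_queue();
    sleep(10); // wake up every 10 seconds instead of busy-waiting
}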
I've just run into this problem with a long-running PHP script executed via cron and wget: the script terminated after 15 minutes due to the default timeout in wget.
However, I believe this method does work if you set the correct timeouts. The PHP script you run needs an unlimited run time; it's best to set this at the start of your long-running script:
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.
wget -q -T 0 http://yourhost/job.php
Be very careful not to overload your server with long-running scripts.
If you're using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation: by default it will retry up to --tries 20 times if no data is read within the --read-timeout of 900 seconds. So set tries to 1 to ensure the cron job is only called once, since you of course wouldn't want a long-running script called multiple times while it's still running.
wget -t 1 https://url/
If you're on cPanel and don't want any output emails after each cron job run, also use -O /dev/null to discard the downloaded output and -q to silence wget's own messages, and redirect errors with 2>&1:
wget -O /dev/null -q -t 1 https://url/ 2>&1
