running multiple instances of cron at the same time - php

I have a set of entries in my database that schedule PHP scripts to run. A cron file runs every few minutes, checks whether any scripts are due in the next few minutes, and triggers them. What I'm wondering is: if a script scheduled at 12:10 takes 5 minutes to execute, what happens when another request to run the same script comes in at 12:12, while the first execution is not finished yet? Is the file locked until the execution is done, or what?
It's the same file being executed at different times.
I think I can do this with crontab, but I'd prefer not to.

They will have no problem running simultaneously, but bear in mind you could have issues if they write to the same file or database table.

Cron will run the script twice, three times, or as many times as it is scheduled. If you only want one instance of this script running at a time, you can create a lock file and check whether it exists. Example:
if (file_exists('/tmp/script.lock')) {
    exit; // a previous run is still in progress
}
file_put_contents('/tmp/script.lock', '');
// execute code
unlink('/tmp/script.lock');
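Note that the check-then-create above has a small race window between file_exists() and file_put_contents(), and a crash before unlink() leaves a stale lock behind. On Linux you can sidestep both from the crontab itself with flock(1) from util-linux (a sketch; the lock path and the command after it are placeholders):

```shell
# flock takes an exclusive lock on the lock file; -n makes it exit
# immediately instead of waiting when a previous run still holds it.
# In a real crontab the command would be the PHP script, e.g.:
#   * * * * * flock -n /tmp/script.lock /usr/bin/php /path/to/script.php
flock -n /tmp/script.lock echo "got the lock, running the job"
```

The kernel releases the lock automatically when the process exits, so a crashed run cannot block future ones.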

Related

every minute cron job sometimes not running

I have several cron jobs executing PHP scripts. The PHP scripts sometimes do heavy work, like updating hundreds of records at a time in a MySQL table.
The problem is that the job should run every minute. However, it randomly misses runs and as a result does not execute every minute. Sometimes it executes every 4-6 minutes, then goes back to every minute, misses 2-3 more times, and then is normal again.
I am on CentOS 6.5.
Please note that the PHP runs correctly and there is no problem with the scripts themselves: when the job does run, I get the expected results. About 10 other similar scripts run at the same time (every minute, or every 5 minutes for some of them).
Job:
/usr/bin/php "/var/www/html/phpScriptToExecute.php" >> /var/www/html/log/phpScriptLog.log 2>&1
My take is that it may be a problem with too many scripts running concurrently and accessing the database at the same time.
Last information: No error in the /var/log/cron file or in the phpScriptLog.log file.
The reason could be that your cron job takes more than 1 minute to execute; print the start time and end time at the end of the script to verify. Note that cron itself will happily start a new instance while the previous one is still running, so long runs pile up and contend with each other unless you guard against overlap yourself (e.g. with a lock).
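If you'd rather not touch the PHP, one way to record those start/end timings is to wrap the entry in GNU time, which appends the elapsed wall-clock time of each run to a log (a crontab sketch reusing the paths from the question; the timing-log path is an assumption):

```shell
* * * * * /usr/bin/time -a -o /var/log/phpScriptTiming.log /usr/bin/php /var/www/html/phpScriptToExecute.php >> /var/www/html/log/phpScriptLog.log 2>&1
```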
My guess is it's caused by a PHP fatal error, but your PHP probably isn't configured to send error messages to stderr, which is why your phpScriptLog.log is empty. You could check your php.ini (or just use ini_set()) for the following:
display_errors: set it to On if you want errors to show on stderr
log_errors: set it to On if you want error messages sent to a file
error_log: point it to the file you want the errors stored in
Or, if you want a solution to avoid overlapping cron jobs, there are plenty here on SO.
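For a CLI cron job you can also force those settings per-run with php -d, without editing php.ini (a crontab sketch; the error-log path is an assumption):

```shell
* * * * * /usr/bin/php -d log_errors=1 -d error_log=/var/log/phpScriptErrors.log /var/www/html/phpScriptToExecute.php >> /var/www/html/log/phpScriptLog.log 2>&1
```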

Creating a File-Based Cron Job in Linux

I am trying to do what I think is a pretty complicated task.
I would like to execute a PHP script, but only if a file named import_products.csv exists in /var/import. The reason is that the script imports the aforementioned CSV file, so I don't want it to error out when cron runs it every minute and the file isn't there.
On the other hand, I DO want the script to run every minute (or even 2 or 4 times per minute) if there IS a file. So what I am envisioning is something like this.
I'd create the cron job to run every minute, which does the following:
Check if /var/import/import_products.csv exists
If it does not exist, exit the script now.
Execute product_import.php (this would be stored in the same folder as my cronjob file)
After execution completes, delete /var/import/import_products.csv
I will be using a separate daemon to generate these CSV files; all that is imperative here is that the PHP script only runs if the file exists.
If there is a way to do this without using cron, that would work too. Basically I just want it to run that PHP script every time a new import_products.csv is written and delete the file afterwards. Cron jobs are the only way I know for sure to do this.
There are a couple of ways I could envision you doing this. The first is, if possible, the easiest, which would be to add in checks to the PHP script itself to see whether or not the file is present. If your product imports will take longer than one minute you'll also have to consider what happens if the file is still there while another import is happening already.
The second way would be to create a bash or shell script of some kind that will check for the existence of the file and then run the command to execute the PHP script if so, then add the shell script to your cron instead of the PHP script itself.
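A minimal sketch of that wrapper, assuming the paths from the question (product_import.php's location is a placeholder). Deleting the CSV only on a successful exit (&&) is deliberate: a failed import leaves the file in place for the next minute's attempt:

```shell
#!/bin/sh
# Run the importer only when the CSV exists; remove the CSV only if
# the import exits with status 0.
CSV=/var/import/import_products.csv
IMPORT_CMD="/usr/bin/php /path/to/product_import.php"   # placeholder
if [ -f "$CSV" ]; then
    $IMPORT_CMD && rm -f "$CSV"
fi
```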
You can also do the file check inside the PHP script itself, with exception handling around the import, at the cost of a little PHP-side overhead.

Run script only once for 5 minutes (Linux)

I have a php script that I'd like to run for a specific amount of time (e.g. 5 minutes), but only once. With cron jobs this will run indefinitely. Is there another way?
The way to handle this is:
When something triggers the need for the cron job to run, put a flag somewhere that the cron job will read.
Have a cron job that runs all the time asking "do I need to run?". It checks the flag. If it sees it, it deletes the flag and runs for the specified time. If not, it just waits until the next run to check for the flag again.
In the actual job, cap the run at 5 minutes. Note that max_execution_time is unlimited by default for CLI scripts, and on Linux it counts CPU time rather than wall-clock time, so to be precise include a wall-clock timer in your script's main loop.
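If GNU coreutils is available, the 5-minute cap can also be enforced from outside PHP with timeout(1), which kills the command after a wall-clock limit and exits with status 124 when it does. In the crontab the command would be the PHP script; the 1-second demo below just illustrates the behaviour:

```shell
# e.g. in the crontab:  timeout 300 /usr/bin/php /path/to/script.php
timeout 1 sleep 5 || echo "stopped after 1s, exit status $?"
```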

Setup cron job from php script to run another php script?

Is it possible to set up a cron job from within a PHP script on a LAMP server to run another PHP script? I'm a newb with cron jobs but I do know PHP quite well. Do I have to access the shell from PHP to set up the cron job? How would I code it if, for example, I want execute.php to run automatically 5 minutes after it is called from call.php?
Edit: Just to be clear, I only want to run execute.php once, 5 minutes after it is called, not have it repeat daily or anything like that. So, after it is executed by the cron job, the cron job should be discarded.
Cron doesn't work exactly like that, but you can set something up to create the functionality you want.
I would first set up a cron entry to execute execute.php every minute.
Then, when call.php is run, call.php makes an entry in a database table or flat file with the time that execute.php should be called.
When the cron entry runs, execute.php checks the database table or flat file to see whether it's supposed to run at that time, and if so, runs the code.
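In shell terms, the every-minute entry could run something like this check (the marker path is an assumption; call.php would write the target Unix timestamp into it, e.g. with file_put_contents):

```shell
#!/bin/sh
# One-shot scheduler check: run the job once its due time has passed,
# then delete the marker so it never repeats.
MARKER=/tmp/execute_php_due          # written by call.php (Unix timestamp)
if [ -f "$MARKER" ] && [ "$(date +%s)" -ge "$(cat "$MARKER")" ]; then
    rm -f "$MARKER"
    /usr/bin/php /path/to/execute.php   # placeholder path
fi
```

Deleting the marker before running the script is what makes the job fire only once, matching the "discard after execution" requirement.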
Use sleep at the beginning of execute.php
sleep(5*60);
//Rest of the code
It should be called like this:
require_once("execute.php");
However, call.php will not send any response for 5 minutes, because the require blocks until execute.php finishes.

Does a cron job kill last cron execution?

I have a cron job that executes a PHP script. The cron is set up to run every minute; this is done only for testing purposes. The PHP script it executes is designed to convert videos uploaded to the server by users to a Flash format (e.g. .flv). The script executes fine when run manually via the command line; however, when executed via cron it starts fine but just stops after one minute.
It seems that when the next cron is executed it "kills" the last cron execution.
I added the following PHP function:
ignore_user_abort(true);
In hopes that it would keep the last execution from being aborted. I tested setting the cron to run every 5 minutes, which works fine; however, a video conversion may take over 5 minutes, so I need to figure out why it's stopping when another cron run starts.
Any help would be appreciated.
Thank you!
EDIT:
My cron looks like:
*/1 * * * * php /path_to_file/convert.php
I don't think cron kills any processes. However, cron isn't really suitable for long running processes. What may be happening here is that your script tramples all over itself when it is executed multiple times. For example, both PHP processes may be trying to write to the same file at the same time.
First, make sure you not only look in the PHP error log but also capture output (including stderr) from the PHP file itself, e.g.:
*/1 * * * * php /path/to/convert.php >> /var/log/convert.log 2>&1
You could also use a simplistic lockfile to ensure that convert.php isn't executed multiple times. Something like:
if (file_exists('/tmp/convert.lock')) {
    exit; // a previous conversion is still running
}
touch('/tmp/convert.lock');
// convert here
unlink('/tmp/convert.lock');
cron itself won't stop a previous instance of a job, so if there's a problem, there's almost certainly something in your PHP doing it. You'll need to post that code.
No, it will not. You can keep a second process from running by creating a lock file that the script checks for on each run. If the file exists, it does not run. Where appropriate, this should be used in conjunction with a maximum execution time so that one stalled process does not block future executions indefinitely. The lock file can just be an empty plain-text file, e.g. /tmp/foo.lock.
