Creating a File-Based Cron Job in Linux - php

I am trying to do what I think is a pretty complicated task.
I would like to execute a PHP script, but only if a file named import_products.csv exists in /var/import. The reason is that the script attempts to import that CSV file, and I don't want it to error out every minute when the file isn't there.
On the other hand, I DO want the script to run every minute (or even 2 or 4 times per minute) if there IS a file. So what I am envisioning is something like this.
I'd create the cron job to run every minute, which does the following:
Check if /var/import/import_products.csv exists
If it does not exist, exit the script now.
Execute product_import.php (this would be stored in the same folder as my cronjob file)
After execution completes, delete /var/import/import_products.csv
I will be using a separate daemon to generate these CSV files; all that matters here is that the PHP script runs only if the file exists.
If there is a way to do this without using cron, that would work too. Basically I just want it to run that PHP script every time a new import_products.csv is written and delete the file afterwards. Cron jobs are the only way I know for sure to do this.

There are a couple of ways I can envision you doing this. The first, and probably the easiest, is to add a check to the PHP script itself to see whether the file is present, as sketched below. If a product import can take longer than one minute, you will also have to consider what happens when the file is still there while another import is already in progress.
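A minimal sketch of that check at the top of product_import.php (paths taken from the question):

<?php
$csv = '/var/import/import_products.csv';

// No pending import file: let this cron run be a no-op.
if (!file_exists($csv)) {
    exit(0);
}

// ... import $csv here ...

// Delete the file so the next minute's run does nothing.
unlink($csv);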
The second way would be to create a bash or shell script that checks for the existence of the file and, if it is there, runs the PHP script; then add that shell script to your cron instead of the PHP script itself.
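A rough version of such a wrapper (the script and PHP paths are illustrative):

#!/bin/sh
# Run the importer only when a pending CSV exists.
CSV=/var/import/import_products.csv

if [ -f "$CSV" ]; then
    # Delete the CSV only if the import exited successfully.
    php /path/to/product_import.php && rm -f "$CSV"
fi

The crontab entry would then be something like * * * * * /path/to/import_wrapper.sh. Using && means a failed import leaves the CSV in place for the next run instead of silently discarding it.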

You can also do the file check within the PHP script itself through exception handling, at the cost of a little PHP-side overhead.
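A sketch of that variant (the exception type and message chosen here are arbitrary):

<?php
$csv = '/var/import/import_products.csv';

try {
    if (!is_readable($csv)) {
        // Treat a missing file as "nothing to do" rather than a failure.
        throw new RuntimeException('no import file present');
    }
    // ... import the CSV ...
    unlink($csv); // remove it so the next run is a no-op
} catch (RuntimeException $e) {
    exit(0);
}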

Related

What is the best way to run a continuous loop in php that only runs if it's not already running?

LAMP stack on Ubuntu 18.04:
I have a PHP loop that I want to run continuously. It is my understanding that Apache will spawn a thread per executing script and give it one core, and that's effectively what I'd like: a PHP script that constantly loops, processing things as it needs to.
I've seen methods where a cron runs every minute, creates a file, and deletes it when done. But what happens when the script takes 61 seconds to execute and then just sits around doing nothing for 59 seconds? Or when the script crashes and never deletes the file?
Before doing this the wrong way, I wanted to find out what the right way is.
The "check if file exists" is an "ok" approach... You can replace it by a row in some database using the same logic.
You can avoid the timeout problem by checking if the file date (or database record date) is older than 1 minute. If it is, then the last process failed to delete the file and you can do it and start the process again.
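A sketch of that staleness check in PHP (the lock path and the 60-second threshold are assumptions):

<?php
$lock = '/tmp/import.lock';

// A lock younger than 60 seconds means a run is (probably) still active.
if (file_exists($lock) && time() - filemtime($lock) < 60) {
    exit(0);
}

// Either no lock, or a stale one from a crashed run: (re)claim it.
touch($lock);

// ... do the work ...

unlink($lock);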
To me it sounds like you want to reinvent an event-driven model on top of Apache and mod_php, which were not created for event-driven models.
If you want to stick with PHP, I would suggest just running a php_cli script (not tied to Apache) in the console, perhaps as a systemd service. When it starts, it should verify that it is not already running; a .lock or .pid file works for that.
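A rough outline of such a CLI worker with a .pid-file guard (the pid-file path and sleep interval are illustrative; posix_kill() requires the posix extension):

<?php
$pidFile = '/tmp/worker.pid';

if (file_exists($pidFile)) {
    $pid = (int) file_get_contents($pidFile);
    // Signal 0 sends nothing; it only tests whether the process is alive.
    if ($pid > 0 && posix_kill($pid, 0)) {
        exit(0); // another instance is already running
    }
}
file_put_contents($pidFile, getmypid());

while (true) {
    // ... process whatever is pending ...
    sleep(1); // avoid a busy loop when idle
}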

Deleted process continues creating a file

On my website I had a process that creates a file (executed with a cronjob). However, I removed both the script and the cronjob, yet the file is still being created and the process from the cronjob still executes. I know this sounds really unbelievable, but is there any way a process can remain stuck in the server's memory and loop? Can you think of any other reason for this problem?
I had the same issue: the process was executed endlessly, and changing or even removing the code did not help, because the already-running process keeps executing the code it loaded.
What I have done:
Log in to the server over SSH, use the top command to find the PID of the PHP process, then use kill to terminate that process (some source about it).
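Roughly, from the shell (substitute the PID that top or ps reports):

ps aux | grep php    # find the PID of the runaway PHP process
kill PID             # send SIGTERM; kill -9 PID if it will not die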
To prevent this:
I created a marker file on the server, and inside every loop iteration (and just before the recursive function starts) I check whether that file exists (or whether its content is valid). If it is not found, the process does not execute.
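In code, that guard could look something like this (the marker path is made up for the example):

<?php
$marker = '/tmp/process.enabled';

// Deleting the marker file stops the loop on its next iteration.
while (file_exists($marker)) {
    // ... one unit of work (or one recursion step) ...
}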

Prevent a job, called in code, from overlapping with the same job running in the background in Laravel

In Laravel, I'm running a job in my code like this:
Artisan::queue('mediaserver:process-queue', ['index' => $indexName]);
The same job is scheduled to run in the background every hour. Sometimes I want to trigger it via my UI (so via code) to speed things up. What I need to prevent is the job starting while it is already running in the background.
What is the best working method for this?
Laravel's task scheduler handles overlapping jobs by creating a temp file before the command runs, and destroying it after.
The name of the file is based on the cron schedule expression and command name.
The existence of this file causes subsequent attempts to run the same command to be skipped.
I would do something similar to this. At the top of your mediaserver:process-queue task, check whether a temp file named something like mediaserver__process_queue exists in storage/. If it does, quit. If it doesn't, create it, and destroy it when the task ends.
You would just have to be careful how you handle scenarios where the task quits unexpectedly from an uncaught exception, leaving mediaserver__process_queue behind even though the task is no longer running.
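A sketch of that pattern in the command's handle() method, using try/finally so the lock is released even when the task throws:

public function handle()
{
    $lock = storage_path('mediaserver__process_queue');

    if (file_exists($lock)) {
        return; // another instance is already running
    }
    touch($lock);

    try {
        // ... process the media queue ...
    } finally {
        unlink($lock); // runs even on an uncaught exception
    }
}

For the scheduled run itself, Laravel's scheduler also ships withoutOverlapping(), which implements the temp-file mutex described above.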

Set up a cron job from a PHP script to run another PHP script?

Is it possible to set up a cron job from within a PHP script on a LAMP server to run another PHP script? I'm a newb with cron jobs, but I do know PHP quite well. Do I have to access the shell from PHP to set up the cron job? How would I code it if, for example, I want execute.php to run automatically 5 minutes after it is called from call.php?
Edit: Just to be clear, I only want to run execute.php once, 5 minutes after it is called, and not have it repeat daily or anything like that. So, after it is executed by the cron job, the cron job should be discarded.
Cron doesn't work exactly like that, but you can set something up to create the functionality you want.
I would first set up a cron entry to execute execute.php every minute.
Then, when call.php runs, it makes an entry in a database table or flat file with the time that execute.php should be run.
When the cron entry fires, execute.php checks the database table or flat file to see whether it is supposed to run at that time, and if so, runs its code.
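A minimal flat-file version of that scheme (the /tmp/execute_at path is an assumption):

<?php
// call.php: record that execute.php should fire five minutes from now.
file_put_contents('/tmp/execute_at', time() + 5 * 60);

<?php
// execute.php: invoked by cron every minute.
$file = '/tmp/execute_at';
if (!file_exists($file) || time() < (int) file_get_contents($file)) {
    exit(0); // nothing scheduled, or not due yet
}
unlink($file); // one-shot: discard the schedule so it runs only once
// ... the actual work goes here ...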
Use sleep at the beginning of execute.php
sleep(5*60);
//Rest of the code
It should be called like this:
require_once("execute.php");
However, call.php will then block and not send any response for 5 minutes, which makes this approach impractical for web requests.

Does a cron job kill the last cron execution?

I have a cron job that executes a PHP script. The cron is set up to run every minute; this is done only for testing purposes. The PHP script it executes converts videos uploaded to the server by users to a Flash format (e.g. .flv). The script runs fine when invoked manually via the command line, but when executed via cron it starts fine and then just stops after one minute.
It seems that when the next cron is executed it "kills" the last cron execution.
I added the following PHP function:
ignore_user_abort(true);
In the hope that it would stop the last execution from being aborted. I tested setting the cron to run every 5 minutes, which works fine. However, a video conversion may take over 5 minutes, so I need to figure out why it stops when another cron run starts.
Any help would be appreciated.
Thank you!
EDIT:
My cron looks like:
*/1 * * * * php /path_to_file/convert.php
I don't think cron kills any processes. However, cron isn't really suitable for long-running processes. What may be happening here is that your script tramples all over itself when it is executed multiple times; for example, both PHP processes may be trying to write to the same file at the same time.
First, make sure you not only look in the PHP error log but also try to capture output from the PHP script itself, e.g.:
*/1 * * * * php /path/to/convert.php >> /var/log/convert.log 2>&1
You could also use a simplistic lockfile to ensure that convert.php isn't executed multiple times. Something like:
if (file_exists('/tmp/convert.lock')) {
    // A previous run is still going (or crashed without cleaning up).
    exit();
}
touch('/tmp/convert.lock');
// ... convert here ...
unlink('/tmp/convert.lock');
cron itself won't stop a previous instance of a job running so, if there's a problem, there's almost certainly something in your PHP doing it. You'll need to post that code.
No, it will not. You can keep a second process from running by creating a lock file that the script checks for on each run: if the file exists, it does not run. Where appropriate, combine this with a maximum execution time so that one stalled process does not block future executions indefinitely. The lock file can just be an empty plain-text file called /tmp/foo.lock.
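A variant that avoids stale lock files entirely is an flock()-based lock, which the kernel releases automatically when the process dies (the lock path is illustrative):

<?php
// 'c' mode: create the file if missing, never truncate it.
$fp = fopen('/tmp/foo.lock', 'c');

// LOCK_NB makes the attempt non-blocking: if another instance holds
// the lock, bail out immediately instead of queueing up behind it.
if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
    exit(0);
}

// ... long-running work here ...

flock($fp, LOCK_UN);
fclose($fp);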
