I have a users table that I need to update continuously, so as soon as one run finishes I would like to relaunch the command directly.
Currently I'm using Laravel's scheduler with a cron entry that runs every minute ($schedule->command('update:users')->everyMinute();), but I'm losing time if the job takes less than a minute, or I will overload my server if it takes more than one minute.
I was thinking of maybe using a queue so that, once the script terminates, it relaunches itself, like this:
// Do my stuff
Queue::push(new UpdateUsers());
But if the script crashes, it will not relaunch itself, and I need to launch it at least once anyway. I know that I could use the pcntl_fork function, but I would like a turnkey solution in Laravel. How should I do this?
I would suggest running a command from the CLI,
and in the command place a
while (true)
loop so it will run forever. After you have created this script you can run it with supervisord:
this service runs the command you tell it to, and when the command fails it relaunches it automatically. Just be aware that after X failures it will stop; that depends on how you configure it.
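A minimal sketch of such a command (the class name and signature here are placeholders, not taken from the question):

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class UpdateUsersForever extends Command
{
    protected $signature = 'update:users-forever'; // hypothetical name
    protected $description = 'Continuously update the users table';

    public function handle()
    {
        while (true) {
            // ... do one update pass here ...
            sleep(1); // small pause so a tight failure loop cannot peg the CPU
        }
    }
}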
An example conf file would be:
/etc/supervisord/conf.d/my_infinite_script.conf
and its contents could be:
[program:laravel_queue]
command=php artisan your_command:here
directory=/path/to/laravel
autostart=true
autorestart=true
stderr_logfile=/var/log/your_command.err.log
stdout_logfile=/var/log/your_command.out.log
I've used the approach suggested by Tzook Bar Noy in some cases, but I have also used a slightly uglier method that avoids the problems a forever-looping script can cause. It can be called every minute from a cronjob:
$runForSeconds = 55;
$runMinute = date('i');
do {
    // ....code....
} while (date('i') == $runMinute && intval(date('s')) < $runForSeconds);
But the best solution would be to use a jobs queue and run it under supervisor with:
command=php artisan queue:listen
I am facing a serious problem with the Laravel queue system; please help me fix it.
I queue my mail using
$mailer = Mail::to($email_to)->queue(new ContactGeneral($data));
It is stored in the database, and when I run php artisan queue:listen from the terminal it works fine; but once I close my terminal it no longer listens to the queue.
To get around that, I set up a schedule in the Kernel.php file that runs every minute:
protected function schedule(Schedule $schedule)
{
    $schedule->command('queue:listen')->everyMinute();
}
I set this line in a cronjob and it works fine:
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
The problem is that, since it runs every minute, it does not kill the previous process but starts another one the next minute, which slows down my server.
Can you please let me know the best way to implement this?
Thanks in advance
The best way is to use supervisor. However, if you are running the application in a shared hosting environment, you can process the queue once and then exit the process, freeing up the memory, by using the following command:
php artisan queue:work --once
Depending on how many jobs you have, set this to run once every 1, 2 or 3 minutes so that the previous process has time to consume the queue and the runs won't interfere with each other too often. You can use the following crontab entry:
* * * * * cd /path-to-your-project && php artisan queue:work --once
No, you don't need to schedule this process.
As long as a queue:work process is running, it will watch your "jobs" table and run the tasks one by one.
What you need is something to make sure the process doesn't end when you close the console. As user8555937 and Webinion said, you need supervisor and its configuration file; once you run it, it will run in the background and you can forget about it.
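For reference, a minimal supervisor program block for a queue worker could look like this (paths and the program name are placeholders):

[program:laravel_queue_worker]
command=php artisan queue:work
directory=/path/to/laravel
autostart=true
autorestart=true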
I am currently following a tutorial that teaches how to create a queue in PHP. An infinite loop is created in a PHP script. I have simplified the code in order to focus on the question at hand:
while (1) {
    echo 'no jobs to do - waiting...', PHP_EOL;
    sleep(10);
}
I use PuTTY (over an SSH connection) to connect to the Linux terminal of my shared hosting account (GoDaddy). If I run php queuefile.php, I know it will run with no problems (I already tested the code with a finite for loop instead of the infinite while loop).
QUESTION: How can I exit the infinite loop once it has started? I have already read about the option of adding code that "checks" whether it should continue looping, something like the following:
$bool = TRUE;
while ($bool)
{
    if (!file_exists('allow.txt')) { $bool = FALSE; }
    // ... the rest of the code
}
though I am curious whether there is a command I can type in the terminal, or a key combination I can press, that will cause the script to terminate. If there is any way of terminating the script, or a better way to perform the previous "check", I would love your feedback!
Pressing Ctrl+C will stop the program that is running in the foreground.
You could also kill it from another session: log in again, run ps aux | grep my-php-script.php to confirm that it is your program, and then use pkill -f my-php-script.php to kill the process.
I understand that you want to set up a cron job on your server. To do that, log in to your server via PuTTY and create the cron job.
For example, after logging in:
crontab -e
Then add a line of the form (the five fields are minute, hour, day of month, month, and day of week, followed by the command to run):
1 2 3 4 5 /path/to/command arg1 arg2
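For instance, a hypothetical entry that runs a PHP script every five minutes would be:

*/5 * * * * php /path/to/script.php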
I have a php-cli script that is run by cron every 5 minutes. Because this interval is short, multiple processes end up running at the same time. That's not what I want, since the script writes an incrementing numeric id into a text file. When several writers write to this file at the same time, the value written is incorrect.
I have tried to use PHP's flock function to block writing to the file while another process is writing to it, but it doesn't work:
$fw = fopen($path, 'r+');
if (flock($fw, LOCK_EX)) {   // acquire an exclusive lock
    ftruncate($fw, 0);       // truncate the file
    fwrite($fw, $latestid);
    fflush($fw);             // flush output before releasing the lock
    flock($fw, LOCK_UN);     // release the lock
}
fclose($fw);
So I suppose the solution is to create a bash script that verifies whether an instance of this php script is already running and, if so, waits until it has finished. But I don't know how to do that; any ideas?
The solution I'm using with a bash script is this:
exec 9>/path/to/lock/file
if ! flock -n 9 ; then
    echo "another instance is running"
    exit 1
fi
# this now runs under the lock until 9 is closed (it will be closed automatically when the script ends)
A file descriptor (9) is opened on /path/to/lock/file, and flock makes any newly started instance exit immediately unless no other instance of the script is running.
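Since the lock is taken at the top of the script, it can be scheduled as often as you like; a crontab entry such as the following (path assumed for illustration) will simply exit straight away while a previous run still holds the lock:

*/5 * * * * /path/to/my_script.sh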
I don't really understand how incrementing a counter every 5 minutes can result in multiple processes trying to write the counter file at the same time, but...
A much simpler approach is to use a basic locking mechanism similar to the one below:
<?php
$lock_filename = 'nobodyshouldincrementthecounterwhenthisfileishere';
if (file_exists($lock_filename)) {
    return;                 // a lock file exists: another run is in progress
}
touch($lock_filename);      // take the lock
// your stuff...
unlink($lock_filename);     // release the lock
Simple as it is, this approach will not handle the situation where the script breaks before it can remove the lock file, in which case the script would never run again until the file is removed manually.
More sophisticated approaches are also possible, as you suggest: e.g. fork the job into its own process, write the PID into a file, and then, before running the job, check whether that PID is still running.
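A rough sketch of that PID-file idea (the file path is an assumption, and the /proc check is Linux-specific):

<?php
$pid_file = '/tmp/counter_job.pid'; // hypothetical location

if (file_exists($pid_file)) {
    $old_pid = (int) file_get_contents($pid_file);
    // On Linux, /proc/<pid> exists only while that process is alive.
    if ($old_pid > 0 && file_exists('/proc/' . $old_pid)) {
        return; // the previous run is still active, so do nothing
    }
}

file_put_contents($pid_file, getmypid()); // record our own PID

// your stuff...

unlink($pid_file);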
To prevent the next session of a program (such as the next cron job) from starting while the previous session is still running, I recommend a check for a running process of this program, either built into your program or external. Just execute, before starting your program:
ps -ef|grep <process_name>|grep -v grep|wc -l
and check whether its result is 0. Only in that case should your program be started.
I suppose you must guarantee the absence of third-party processes with a similar name (for this purpose, give your program a longer, unique name), and the name of your program must not contain the pattern "grep".
This works well in combination with the normal, regular starting of your program configured in a cron table and run by the cron daemon.
If your check is written as an external script, the crontab entry might look like:
<time_specification> <your_starter_script> <your_program> ...
Two important remarks: the exit code of your_starter_script must be 0 when it does not start your program, and it is better for this script to write nothing at all to stdout or stderr.
Such a starter is very short and a simple programming exercise, so I don't feel the need to provide its complete code.
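For illustration, though, here is one way such a starter could look as a PHP CLI script (a sketch under this answer's assumptions: the program has a unique name that does not contain "grep"; everything here is hypothetical):

#!/usr/bin/env php
<?php
// Starter sketch: silent, exit code 0 whenever nothing is started.
$program = isset($argv[1]) ? $argv[1] : '';
if ($program === '') {
    exit(0);
}
// Count matching processes. The starter's own command line also
// matches the pattern, so a previous instance exists only when the
// count exceeds 1.
$count = (int) shell_exec('ps -ef | grep ' . escapeshellarg($program) . ' | grep -v grep | wc -l');
if ($count > 1) {
    exit(0); // already running: write nothing, exit 0
}
// Launch detached so cron is not kept waiting for the program.
exec($program . ' > /dev/null 2>&1 &');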
Instead of using cron to run your script every 5 minutes, how about using at to schedule your script to run again 5 minutes after it finishes? Near the end of your script, you can use shell_exec() to run an at command that schedules the script to run again in 5 minutes, like so:
echo /path/to/script | at now + 5 minutes
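Inside the PHP script itself, that final step might look like this:

// ... the script's real work happens above ...
// Schedule this same script to run again in 5 minutes via at(1).
shell_exec('echo /path/to/script | at now + 5 minutes');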
Or, perhaps even simpler than my previous answer (using at to schedule the script to run again in 5 minutes), make your script a daemon by using a non-terminating loop, like so:
while (1) {
    // whatever your script does here....
    sleep(300); // wait 5 minutes
}
Then you can do away with scheduling via cron or at altogether. Simply run your script in the background from the command line, like so:
/path/to/your/script &
Or add /path/to/your/script to /etc/rc.local to make your script start automatically when the machine boots.
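If you launch it from an SSH session, you may also want to prefix it with nohup so it keeps running after you log out:

nohup /path/to/your/script &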
I have 3 scripts that do some stuff.
I want to run them continuously and concurrently.
Let's say for example:
The first script takes 30 minutes to finish.
The second: 20 minutes.
The third: 5 minutes.
So I need each of them to restart immediately after it finishes.
The 3 scripts UPDATE the same DB, but they need to work separately.
They can all run at the same time, but no script should run in more than one instance at once (my English isn't great, sorry about that).
Let me explain what I mean with an example:
firstScript.php is running
secondScript.php is running
thirdScript.php is running
firstScript.php tries to start, but the earlier instance is still running, so it waits until that instance finishes.
Maybe some shell script will do the job, but how?
Make a bash script that takes one argument, and have it do something like this:
if [ -f /tmp/$1 ]
then
    echo "Script already running"
else
    touch /tmp/$1
    php $1
    rm /tmp/$1
fi
Set up a cron job to run this script and pass it the name of the PHP script you want to run.
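For example, if the wrapper above is saved as run_once.sh (a name chosen purely for illustration), the crontab entries could be:

* * * * * cd /path/to/scripts && /path/to/run_once.sh firstScript.php
* * * * * cd /path/to/scripts && /path/to/run_once.sh secondScript.php
* * * * * cd /path/to/scripts && /path/to/run_once.sh thirdScript.php

The cd matters because the wrapper invokes php $1 with a relative script name.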
You could execute a shell command just before the PHP script dies. Example:
$i = 0;
while ($i < 1000) {
    $i++;
}
shell_exec("bin/php.exe some_script.php");
If you are working on a shared hosting account this might not work due to security restrictions.
Note that "bin/php.exe" needs to be edited for your server: point it to wherever your PHP binary is installed.
I'm working in a WAMP development environment and testing how long it takes to get the site indexed. I'm doing this by running cron manually.
The problem is that when there are 700 jobs in the job_queue, each run of cron does only some of them, so I need to run cron several times. How can I keep calling cron in a loop until there are no jobs left in the job_queue?
I'm also open to any drush alternatives. I'm aware of drush cron, but this also does only some of the jobs on each run, so it needs to be run again manually.
If you want to run something all at once until it's done, cron is the wrong tool for the job; the Batch API is the tool for that. Unfortunately the search module is written to update only on cron, but there's not a lot of code in search_cron to copy into a batch function. So I'd suggest going the batch route rather than wrapping some sort of pseudo-batch around cron, as your end goal doesn't seem to involve cron at all.
Step 1: run cron every minute.
Step 2: in the script, check whether it is already running; if so, exit the "new" instance:
$lid = '';
if (is_array($argv) && isset($argv[1])) {
    $lid = $argv[1];
}
if ($lid !== '') {
    $xx = array();
    // exec() fills $xx with the matching lines of the process list
    exec('ps x | grep "thefile.php ' . $lid . '" | grep -v "grep"', $xx);
    if (count($xx) > 1) { // more than our own process: another instance runs
        die('Script ' . $lid . ' running already... exiting... ' . "\n");
    }
}
Put it in cron to run every minute:
php thefile.php 1
php thefile.php 2
php thefile.php 3
to run 3 scripts at the same time.
drush search-index will generate the remaining search index for you.