If I am running Beanstalk with Supervisor on a server with a Laravel 4 application, and I want it to process all queues asynchronously -- as many as it can at the same time -- can I have multiple listeners running at the same time? Will they be smart enough to not "take" the same to-do item from the queue, or will they all reach for the same one at the same time, and thus not work in the way I'm wanting? In short, I want to use Queues to process multiple tasks at a time -- can this be done?
php artisan queue:listen && php artisan queue:listen && php artisan queue:listen
In short, I want to use Queues to process multiple tasks at a time -- can this be done?
In short - yes, it can be done. Every job taken by a worker is locked until it's released, which means other workers will receive different jobs to process.
IMO it's better to configure Supervisor to run multiple queue:work commands. Each worker takes only one job, processes it, and stops execution.
It's not encouraged to run PHP scripts in an infinite loop (as queue:listen does), because after some time they can develop memory issues (leaks, etc.).
You can configure Supervisor to re-run finished workers.
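A minimal Supervisor entry for this setup might look as follows (a sketch; the program name and paths are placeholders). In this mode queue:work pops a single job, processes it, and exits, and Supervisor immediately respawns it:

```ini
[program:queue-worker]
command=php /var/www/app/artisan queue:work
process_name=%(program_name)s_%(process_num)02d
numprocs=4           ; four workers pulling jobs in parallel
autostart=true
autorestart=true     ; respawn each worker after it exits
```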
Related
My Laravel queue setup has multiple instances that are horizontally scalable and run with the following command (cron):
php artisan queue:work --queue=high,default,low
Now, I need to add one more queue type, superlow, that has to be single-threaded. The queue workers should pick up this type of job only one at a time and prevent other workers from running it until it is finished.
php artisan queue:work --queue=high,default,low,superlow
The queue driver is database.
I couldn't find this kind of setup option in the official documentation.
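One common way to approximate this with Supervisor is a dedicated single worker for the serialized queue (a sketch; program names and paths are placeholders — and since your instances scale horizontally, only one machine may run the superlow program, or the jobs will run in parallel again):

```ini
; Shared pool for the parallel queues
[program:worker-general]
command=php /var/www/app/artisan queue:work --queue=high,default,low
numprocs=8
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true

; Dedicated single worker: superlow jobs run strictly one at a time
[program:worker-superlow]
command=php /var/www/app/artisan queue:work --queue=superlow
numprocs=1
autostart=true
autorestart=true
```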
Is it possible to send a stop signal to the worker in such a way that it will stop only AFTER processing the current job?
Currently I have a job that takes some time AND can't be interrupted, because I have only one try/attempt.
Sometimes I need to stop workers to redeploy my code. Is there a way to stop Laravel's worker only after it finishes the current job and before it starts a new one?
I'm using supervisor for restarting the queue workers.
Because currently on each deploy I'm losing 1 job and my client loses money :(
P.S.
This is NOT a duplicate of Laravel Artisan CLI safely stop daemon queue workers, because that question was about the Artisan CLI and I'm using Supervisor.
autorestart=true in supervisor + php artisan queue:restart solves the issue.
There is a built-in feature for this:
php artisan queue:restart
This command will instruct all queue workers to gracefully "die" after they finish processing their current job so that no existing jobs are lost. Since the queue workers will die when the queue:restart command is executed, you should be running a process manager such as Supervisor to automatically restart the queue workers.
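Sketched as a Supervisor program entry (name and paths are placeholders), the key piece is autorestart=true, which brings the worker back up after queue:restart gracefully stops it:

```ini
[program:laravel-worker]
command=php /var/www/app/artisan queue:work
autostart=true
autorestart=true   ; Supervisor respawns the worker after queue:restart stops it
```

On deploy, pull the new code first and then run php artisan queue:restart; the worker finishes its current job, exits, and Supervisor restarts it on the fresh code.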
Supervisord has an XML-RPC API which you could use from your PHP code. I suggest you use Zend's XML-RPC Client.
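A rough sketch of that idea, assuming Supervisor's [inet_http_server] is enabled on port 9001 and the worker runs under a Supervisor program named laravel-worker (both are assumptions):

```php
use Zend\XmlRpc\Client;

// Assumption: Supervisor's [inet_http_server] listens on 127.0.0.1:9001
$client = new Client('http://127.0.0.1:9001/RPC2');

// Stop the (hypothetical) "laravel-worker" program via the XML-RPC API...
$client->call('supervisor.stopProcess', ['laravel-worker']);

// ...and start it again once the maintenance is done
$client->call('supervisor.startProcess', ['laravel-worker']);
```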
I need to know if there is a way to use the internal Laravel API to force the release of all queued jobs. The reason is that we have a queue implementation with a mechanism that releases a job back onto the queue with a 5-minute delay if there was a problem during its execution. The problem is that we need some sort of refresh feature that triggers all of those "delayed" jobs manually, since we need a bit of control over when to run those delayed jobs while keeping the fail-safe mechanism intact. Is there some way to implement this using Laravel?
You can run the php artisan queue:work command to start the queue worker. If you wish to start this from code, you can call the command programmatically.
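For example, from a controller or scheduled command you could do something like this (a sketch; note that on newer Laravel versions queue:work loops forever, so the --once flag is used here to process a single job and return):

```php
use Illuminate\Support\Facades\Artisan;

// Process a single job from the default queue, then return control
Artisan::call('queue:work', ['--once' => true]);
```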
In laravel you can start a queue listener with:
php artisan queue:listen
But how many workers (threads, processes) will be used to process the queue?
Is there any way to define the number of workers?
https://laravel.com/docs/queues#supervisor-configuration
You generate a config file where you define the number of workers.
numprocs=10
By running php artisan queue:listen, only one process runs and fetches jobs from the queue, so the jobs are fetched and processed one by one.
If you want more than one process handling the queue jobs, you need to run the listener multiple times in different consoles. But instead of running them manually, you can use Supervisor to manage the processes; you can then configure the number of workers by setting the numprocs parameter in the Supervisor configuration.
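Put together, a Supervisor entry along these lines (the program name and paths are placeholders) spawns numprocs parallel workers:

```ini
[program:laravel-queue]
command=php /var/www/app/artisan queue:work
process_name=%(program_name)s_%(process_num)02d
numprocs=10          ; ten worker processes consuming the queue concurrently
autostart=true
autorestart=true
```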
I'm having a bit of trouble grasping the concept of Queues in Laravel 5.0. From what I understand, queues store a list of Commands to be executed by either the php artisan queue:listen command or the php artisan queue:work --daemon command.
Correct me if I'm wrong, but php artisan queue:listen just waits until there is a command in the queue and then executes it, right? Then what does the php artisan queue:work --daemon command do in comparison? Does this command only work one at a time?
Anyway, a task I wish to complete is this... I want to periodically check if there are commands in the queue and, if there are, execute them. Since this is a periodic problem I assume cron would be used, but how do I check if there are outstanding commands to be executed? Would I run a query on my queue table? If I do, how would I dispatch the command? Or should I just cron the php artisan queue:listen command?
The main difference between queue:listen and queue:work is that the latter only checks whether there's a job waiting, takes it, and handles it. That's it.
The listener, though, runs as a background process, checking for available jobs all the time. If there's a new job, it is then handled (the process is slightly more involved, but that's the main idea).
So basically if you need to handle your commands (jobs) as soon as they appear you might wanna use the queues.
And if you need to do something periodically (for example every 2 hours or every Monday at 9 AM etc.) you should go with cron + Schedule.
I wouldn't recommend combining them as you described. Also keep in mind that jobs can be delayed if you don't want them to be handled ASAP.
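For the cron + Schedule route, the sketch below assumes a hypothetical emails:send artisan command; the only crontab entry needed is the standard schedule:run one:

```php
// app/Console/Kernel.php (sketch; emails:send is a hypothetical command)
// Crontab needs a single entry:
//   * * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1
protected function schedule(Schedule $schedule)
{
    // Run every 2 hours, on the hour
    $schedule->command('emails:send')->cron('0 */2 * * *');
}
```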