In Laravel you can start a queue listener with:
php artisan queue:listen
But how many workers (threads, processes) will be used to process the queue?
Is there any way to define the number of workers?
https://laravel.com/docs/queues#supervisor-configuration
You generate a config file where you define the number of workers.
numprocs=10
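For context, a minimal Supervisor program sketch (the paths, program name, and queue:work options are illustrative assumptions, not part of the original answer):

[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /path/to/your/app/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
numprocs=10 ; Supervisor starts and monitors ten worker processes
redirect_stderr=true
stdout_logfile=/path/to/your/app/storage/logs/worker.log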
By running php artisan queue:listen only one process is started to fetch jobs from the queue, so the jobs are fetched and processed one by one.
If you want more than one process handling queue jobs, you need to run the listener several times in separate consoles. Instead of starting them manually, you can use Supervisor to manage the processes; you can then configure the number of workers by setting the numprocs parameter in the Supervisor configuration.
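For the manual option, a rough sketch (the number of listeners is arbitrary): start several listeners from one shell, or open one terminal per listener:

php artisan queue:listen &
php artisan queue:listen &
php artisan queue:listen &

Each process reserves its own jobs, so two listeners will not pick up the same job.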
I am making an API that requires the job to be dispatched multiple times; however, each job takes 10 seconds, and it takes forever to process them one by one. Is there any way to run multiple jobs at once?
GetCaptcha::dispatch($task_id)->afterCommit()->onQueue('default');
You can achieve that by running multiple workers at the same time.
From the Laravel docs:
To assign multiple workers to a queue and process jobs concurrently, you should simply start multiple queue:work processes. This can either be done locally via multiple tabs in your terminal or in production using your process manager's configuration settings. When using Supervisor, you may use the numprocs configuration value.
Read more here:
https://laravel.com/docs/9.x/queues#running-multiple-queue-workers
https://laravel.com/docs/9.x/queues#supervisor-configuration
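As a sketch for this particular case (the $taskIds variable and the worker count are assumptions): dispatch one job per task, then run several workers on the default queue so that many GetCaptcha jobs execute concurrently.

// Illustrative only: dispatch one GetCaptcha job per task ID
foreach ($taskIds as $taskId) {
    GetCaptcha::dispatch($taskId)->afterCommit()->onQueue('default');
}

Then start several workers for that queue, either in separate terminal tabs locally or via Supervisor's numprocs in production:

php artisan queue:work --queue=default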
My Laravel queue setup has multiple horizontally scalable instances that run with the following command (via cron):
php artisan queue:work --queue=high,default,low
Now I need to add one more queue, superlow, which has to be single-threaded. The queue workers should pick up this type of job only one at a time and prevent other workers from running it until it is finished.
php artisan queue:work --queue=high,default,low,superlow
The queue driver is database.
I couldn't find this kind of setup option in the official documentation.
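One possible approach, sketched under the assumption that a single dedicated instance runs this Supervisor program and that the horizontally scaled workers drop superlow from their --queue list: give superlow its own program with numprocs=1 so only one process ever consumes it.

[program:superlow-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /path/to/your/app/artisan queue:work --queue=superlow
autostart=true
autorestart=true
numprocs=1 ; exactly one process consumes the superlow queue
redirect_stderr=true
stdout_logfile=/path/to/your/app/storage/logs/superlow-worker.log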
I have a lot of tasks, which should be processed in proper order. I would like to divide them by "type" to different queues. Queues should be created dynamically. Then I would like to start X workers, which will process the tasks from the queues. But there is one important rule – from each queue, only one task should be processed at once. The reason is – each task of the specified type can change the state of the application, so I can't start processing a new task of the same type if the last one hasn't finished.
I would like to use Laravel queue system with Redis driver, but I'm not sure it's able to do that. How to prevent the queue system from taking more than one job from each queue at once? Any ideas?
Thank you in advance for your help.
If you are using Supervisor, then this is what you can do in your .conf file:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/****/artisan queue:work
autostart=true
autorestart=true
user=root
numprocs=1 ; <-- this is what you are looking for
redirect_stderr=true
stdout_logfile=/var/www/html/****/storage/logs/supervisord.log
The numprocs directive instructs Supervisor to run 1 queue:work process and monitor it, automatically restarting it if it fails. (Laravel Queue Supervisor Doc)
Job chaining allows you to specify a list of queued jobs that should be run in sequence after the primary job has executed successfully. If one job in the sequence fails, the rest of the jobs will not be run. To execute a queued job chain, you may use the withChain method on any of your dispatchable jobs.
If you would like to specify the default connection and queue that should be used for the chained jobs, you may use the allOnConnection and allOnQueue methods.
ProcessPodcast::withChain([
    new OptimizePodcast,
    new ReleasePodcast
])->dispatch()->allOnConnection('redis')->allOnQueue('podcasts');
See Laravel docs for more info.
I currently have a multi-server Laravel setup. I have multiple servers that are load balanced and share a database.
Each instance is also running a queue listener. I want to be able to dispatch two types of jobs:
A job that is only run once (e.g. send an email, update a model, etc.)
A job that is run on ALL queue listeners (e.g. delete a file from the filesystem)
The first I think is quite simple to implement, but unsure how to go about implementing the second one. Any ideas?
You can make a queue listener or worker handle specific queues. For example, run a queue listener or worker to handle the emails queue:
php artisan queue:listen --queue=emails
php artisan queue:work --queue=emails
You can now dispatch jobs to this queue:
dispatch((new Job)->onQueue('emails'));
This can help you set up multiple listeners/workers to handle different queues based on your requirements.
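For the second job type from the question above (a job every server must run, such as deleting a local file), a hedged sketch: if each server runs a worker on its own server-specific queue, you can dispatch one copy of the job to each of those queues. The queue names and the DeleteFile job are hypothetical.

// Hypothetical: each server runs php artisan queue:work --queue=<its own queue name>
foreach (['node-web1', 'node-web2', 'node-web3'] as $queue) {
    dispatch((new DeleteFile($filePath))->onQueue($queue));
}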
If I am running Beanstalk with Supervisor on a server with a Laravel 4 application, and I want it to process all queues asynchronously -- as many as it can at the same time -- can I have multiple listeners running at the same time? Will they be smart enough not to "take" the same to-do item from the queue, or will they all reach for the same one at the same time, and thus not work the way I want? In short, I want to use queues to process multiple tasks at a time -- can this be done?
php artisan queue:listen && php artisan queue:listen && php artisan queue:listen
In short, I want to use Queues to process multiple tasks at a time -- can this be done?
In short - yes, it can be done. Every job taken by a worker is locked until it is released, which means other workers will get different jobs to process.
IMO it's better to configure Supervisor to run multiple queue:work commands. Each one takes only one job, processes it, and stops execution.
It's not encouraged to run PHP scripts in an infinite loop (as queue:listen does), because after some time they can have memory issues (leaks, etc.).
You can configure Supervisor to re-run finished workers.
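A minimal Supervisor sketch of that idea (paths and process count are assumptions): with autorestart enabled, Supervisor starts a fresh worker as soon as the previous one finishes its single job and exits.

[program:queue-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /path/to/your/app/artisan queue:work --sleep=3
autostart=true
autorestart=true ; re-run the worker each time it exits
numprocs=3 ; three single-job workers running in parallel
redirect_stderr=true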