Laravel 5.2 Redundant (Multiple) Queue Workers - php

I have multiple queue workers on my Laravel 5.2 project. I am running on AWS. I am using ECS. I am using Redis for my queue driver.
I would like to know:
If I have 2 servers working the same queue
php /var/www/laravel/artisan queue:listen --env=production --timeout=30 --tries=1 --queue=mail
Will they both process the same job, so that it gets processed twice? Or will each job only get processed once, helping with load/redundancy?
Many thanks in advance!

A job only exists on a queue once; as soon as a worker grabs a job, it is removed from the queue (the Redis pop is atomic, so two workers can never reserve the same job).
So as long as the different workers are accessing the same instance of the same queue, the jobs will only be executed once.
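For illustration, a minimal sketch of that setup (the SendMail class and its body are hypothetical, not from the question): the job is pushed once onto the shared Redis "mail" queue, and whichever server's worker reserves it first is the only one that runs it.

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// Hypothetical job worked by both servers' listeners.
class SendMail implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // Runs on exactly one of the two servers: the worker that
        // pops the job from Redis reserves it atomically, so the
        // other server's worker never sees it.
    }
}

Dispatched once with dispatch((new SendMail)->onQueue('mail'));, the job is handled a single time even though both servers run queue:listen --queue=mail.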

Related

Laravel 5.6. Stop a worker AFTER job execution with supervisor

Is it possible to send a stop signal to the worker in such a way that it will stop only AFTER processing the current job?
Currently I have a job that takes some time AND can't be interrupted, because I have only one try/attempt.
Sometimes I need to stop workers to redeploy my code. Is there a way to stop Laravel's worker only after finishing current job and before starting a new one?
I'm using supervisor for restarting the queue workers.
Because currently on each deploy I'm losing 1 job and my client loses money :(
P.S.
This is NOT a duplicate of Laravel Artisan CLI safely stop daemon queue workers, because that question was about the Artisan CLI and I'm using supervisor.
autorestart=true in supervisor + php artisan queue:restart solves the issue.
There is a built-in feature for this:
php artisan queue:restart
This command will instruct all queue workers to gracefully "die" after they finish processing their current job so that no existing jobs are lost. Since the queue workers will die when the queue:restart command is executed, you should be running a process manager such as Supervisor to automatically restart the queue workers.
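For reference, a minimal sketch of such a Supervisor program (the program name, paths, and options here are placeholders, not from the answer). autorestart=true is what brings the worker back up after queue:restart lets it die:

[program:laravel-worker]
command=php /var/www/laravel/artisan queue:work --tries=1
autostart=true
; relaunch the worker after queue:restart gracefully kills it
autorestart=true
redirect_stderr=true
stdout_logfile=/var/log/laravel-worker.log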
Alternatively, Supervisord has an XML-RPC API which you could use from your PHP code. I suggest you use Zend's XML-RPC client.
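A rough sketch of that XML-RPC route, assuming Supervisor's [inet_http_server] is enabled on port 9001 (the URL and program name are placeholders):

<?php

use Zend\XmlRpc\Client;

// Supervisor serves its XML-RPC API at /RPC2 when the
// HTTP server is enabled in supervisord.conf.
$client = new Client('http://127.0.0.1:9001/RPC2');

// Stop the worker; Supervisor sends SIGTERM, and with the pcntl
// extension loaded queue:work finishes its current job before exiting.
$client->call('supervisor.stopProcess', ['laravel-worker', true]);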

Multiple jobs are processing with one worker

I have an app built on laravel 5.1 which is using RabbitMQ for job processing. The worker command is as follows:
php artisan queue:work --queue=[queue-name] --daemon --tries=5
The issue I am experiencing is that multiple jobs are getting processed at the same time, even though I have only one worker for the queue. How and why is this happening? I have a table which tracks the status of the jobs, and as per the table, the number of jobs processing in parallel sometimes increases to 8 or 9. With one worker, this count is strange.
Note:
My application has 3500-4000 jobs for processing daily
The jobs are time-consuming, so I don't know if this has something to do with the process timeout

Laravel queue listeners on multiple machines on the same queue

I currently have a multi-server Laravel setup. I have multiple servers that are load balanced and share a database.
Each instance is also running a queue listener. I want to be able to dispatch two types of jobs:
A job that is only run once (e.g. send an email, update a model, etc.)
A job that is run on ALL queue listeners (e.g. delete a file from the filesystem)
The first, I think, is quite simple to implement, but I'm unsure how to go about implementing the second one. Any ideas?
You can make a queue listener or worker handle specific queues. For example, run a queue listener or worker to handle the emails queue:
php artisan queue:listen --queue=emails
php artisan queue:work --queue=emails
You can now dispatch jobs to this queue:
dispatch((new Job)->onQueue('emails'));
This way you can set up multiple listeners/workers to handle different queues based on your requirements.
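For the second job type (run on ALL listeners), one possible sketch building on the same idea, assuming each server also works its own uniquely named queue (the node-* queue names and the DeleteLocalFile job are hypothetical, not from the answer):

<?php

// Each server additionally works its own queue, e.g.:
//   server 1: php artisan queue:work --queue=default,node-1
//   server 2: php artisan queue:work --queue=default,node-2

// "Broadcast" by dispatching one copy of the job per server queue,
// so every machine deletes its own local copy of the file.
foreach (['node-1', 'node-2'] as $queue) {
    dispatch((new DeleteLocalFile($path))->onQueue($queue));
}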

Laravel how to set or define number of workers?

In Laravel you can start a queue listener with:
php artisan queue:listen
But how many workers (threads, processes) will be used to process the queue?
Is there any way to define the number of workers?
https://laravel.com/docs/queues#supervisor-configuration
You create a Supervisor config file where you define the number of worker processes:
numprocs=10
Running php artisan queue:listen starts only one process, which fetches jobs from the queue, so the jobs will be fetched and processed one by one.
If you want more than one process working through the queue jobs, you need to run the listener multiple times in different consoles. Instead of running them manually, you can use Supervisor to manage the processes; you can then configure the number of processes by setting the numprocs parameter in the Supervisor configuration.
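A sketch along the lines of the linked docs (paths and program name are placeholders); note that process_name must include process_num once numprocs is greater than 1:

[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/laravel/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
; run 10 worker processes in parallel
numprocs=10
redirect_stderr=true
stdout_logfile=/var/log/laravel-worker.log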

Laravel 4: Queues and Multiple Listeners

If I am running Beanstalk with Supervisor on a server with a Laravel 4 application, and I want it to process all queues asynchronously -- as many as it can at the same time -- can I have multiple listeners running at the same time? Will they be smart enough to not "take" the same to-do item from the queue, or will they all reach for the same one at the same time, and thus not work in the way I'm wanting? In short, I want to use Queues to process multiple tasks at a time -- can this be done?
php artisan queue:listen & php artisan queue:listen & php artisan queue:listen &
In short, I want to use Queues to process multiple tasks at a time -- can this be done?
In short - yes, it can be done. Every job taken by a worker is locked until it is released, which means the other workers will get different jobs to process.
IMO it's better to configure Supervisor to run multiple queue:work commands. Each one takes a single job, processes it, and stops execution.
It's not encouraged to run PHP scripts in an infinite loop (as queue:listen does), because after some time they can have memory issues (leaks, etc.).
You can configure Supervisor to re-run finished workers.
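A sketch of that setup (program name and paths are placeholders): queue:work without --daemon exits after one job, so startsecs=0 keeps Supervisor from treating the quick exits as failed starts, and autorestart=true relaunches a fresh PHP process for the next job:

[program:queue-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/laravel/artisan queue:work
autostart=true
autorestart=true
; a single-job run exits quickly; don't count that as a failed start
startsecs=0
numprocs=3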
