I'm trying to use Redis for my queues.
Currently I'm on Homestead and I run php artisan queue:work --daemon --tries=3 in my virtual machine.
To test the queues, my job writes a line to the log. With the sync driver the log entry appears, but with the redis driver it never does.
I also checked the running processes and redis-server is up. What's wrong?
Run redis-cli monitor and see if it shows anything being added when you push to the queue.
If nothing shows up, the queue isn't actually talking to Redis.
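As a quick sanity check (a sketch assuming a default localhost Redis and Laravel's default queue name), you can also ask Redis directly how many jobs are pending; the redis driver pushes jobs onto a list named queues:default:

```shell
# Check whether dispatched jobs are actually landing in Redis.
# Assumes redis-server on localhost:6379 and the default queue name.
if command -v redis-cli >/dev/null 2>&1; then
  pending=$(redis-cli llen queues:default)
else
  pending="redis-cli not installed"
fi
echo "pending jobs: $pending"
```

If the count never rises when you dispatch a job, the application isn't reaching Redis at all (check QUEUE_DRIVER and any cached config); if it rises but never falls, the worker isn't consuming.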
I'm stuck on a problem with a Laravel queue worker on Elastic Beanstalk.
I'm using GitHub + CircleCI + Beanstalk to deploy my app, and that pipeline works fine, but the queue worker running on the instance is not pulling the available jobs from the database (all the .env variables are set up correctly to use the database queue driver, and so on).
I have configured an .ebextensions hook to restart the queue worker and start it again, so this script runs when the deploy finishes:
...
...
"/opt/elasticbeanstalk/hooks/appdeploy/post/99_restart_workers.sh":
  mode: "000755"
  owner: root
  group: root
  content: |
    #!/usr/bin/env bash
    php /var/app/current/artisan queue:restart
    php /var/app/current/artisan queue:work --queue=default --tries=2 &
...
...
The very odd part is that if I SSH into the EB instance (I have only one), I can see the worker running, but it is not pulling the jobs from the database that are available to be processed:
BUT, if I kill the worker process and then manually run exactly the same command, it works perfectly and pulls all the jobs:
Can anyone figure out why this is happening?
Thanks.
Since I've upgraded my Laravel application from 5.4 to 5.5 and added Laravel Horizon, my queue isn't working anymore. Here is the old situation which worked:
The driver I used was Beanstalkd, with Supervisord monitoring the task and keeping it up and running. I started it with this command:
php artisan queue:work --tries=1 --queue=high,medium,low
New situation: I've updated the queue driver to Redis. When I take a look at mydomain.com/horizon, I see the tasks coming in but not being processed. Running the following command from the terminal doesn't work either:
php artisan queue:work --tries=1 --queue=high,medium,low
I have 2 queues that are filled, the Redis queue and the Beanstalkd queue. How can I finish the Beanstalkd queue and then process the Horizon queue?
I figured out that the artisan down command blocks the queue workers :) After running artisan up, everything works fine... I think it's time for the weekend :P
I'm using Beanstalkd as a work queue in my project.
Now my project is complete and I have to deploy it to a VPS (production server).
Something is confusing me: should I SSH into the production server and manually type php artisan queue:listen? (That seems wrong.)
Is there a way to run queue:listen as a service?
You should use something like Supervisor to run the queue in production. This will allow you to run the process in the background, specify the number of workers you want processing queued jobs and restart the queue should the process fail.
As for the queue you choose to use, that's up to you. In the past I've used Beanstalkd installed locally on an instance, and Amazon SQS. The local instance was fine for basic email sending and other async tasks; SQS was great when the message volume was massive and needed to scale. There are other SaaS products too, such as IronMQ, but the usual reason people run into issues in production is that they're not using Supervisor.
You can install Supervisor with apt-get. The following configuration is a good place to start:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /home/user/app.com/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
numprocs=8
stdout_logfile=/home/user/app.com/worker.log
This will do the following:
Give the queue worker a unique name
Run the php artisan queue:work command
Automatically start the queue worker on system restart and automatically restart the queue workers should they fail
Run the queue worker across eight processes (this can be increased or reduced depending on your needs)
Log any output to /home/user/app.com/worker.log
To start the workers, first have Supervisor re-read and apply the new configuration, then start the program group:
sudo supervisorctl reread
sudo supervisorctl update
sudo supervisorctl start laravel-worker:*
The documentation gives you some more in-depth information about using Supervisor to run Laravel's queue processes.
I'm running Laravel 5.3. I'm attempting to test a queue job, and I have my queue configured to use Amazon SQS. My app is able to push a job onto the queue, and I can see the job in SQS. But it stays there, never being processed. I've tried running php artisan queue:work, queue:listen, queue:work sqs... None of them are popping the job off the queue. I'm testing this locally with Homestead. Is there a trick to processing jobs from SQS?
I faced the same problem while using Supervisor. This worked for me:
Specify the queue connection (sqs) in the command:
command=php /var/www/html/artisan queue:work sqs --tries=3
Then ran these commands:
sudo supervisorctl reread
sudo supervisorctl update
sudo supervisorctl restart all
Posting this just in case it helps anyone.
The instructions in the following post worked for me: https://blog.wplauncher.com/sqs-queue-in-laravel-not-working/. In essence, make sure you do the following:
create your standard queue in SQS
update your config/queue.php file to use your SQS credentials (I would suggest adding more env vars to your .env file and referencing them in this file)
update the QUEUE_DRIVER in your .env, so it's set to QUEUE_DRIVER=sqs
update your supervisor configuration file (typically /etc/supervisor/conf.d/laravel-worker.conf)
update and restart supervisor (see the 3 commands mentioned by #Dijkstra)
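For reference, the sqs connection entry in config/queue.php usually looks like the following (the env variable names here are the typical defaults; treat the exact keys as assumptions to check against your Laravel version):

```php
// config/queue.php — 'connections' entry for the SQS driver,
// with credentials pulled from .env rather than hard-coded.
'sqs' => [
    'driver' => 'sqs',
    'key'    => env('SQS_KEY'),
    'secret' => env('SQS_SECRET'),
    'prefix' => env('SQS_PREFIX'), // the queue URL prefix from the SQS console
    'queue'  => env('SQS_QUEUE', 'default'),
    'region' => env('SQS_REGION', 'us-east-1'),
],
```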
I have installed Beanstalkd and it's working fine with Laravel. The point where I am puzzled is that we have to run
php artisan queue:listen
to start listening to the queue. Right now I am running it on an Amazon EC2 instance remotely through PuTTY, but what if I close the terminal? Will the jobs created through the code still be processed? Do I have to manually run php artisan queue:listen or php artisan queue:work all the time? That doesn't seem right.
Once php artisan queue:listen has been started, will it keep running even if the terminal is closed? I honestly don't know.
You need to install Supervisor as well. Here is a tutorial on using Beanstalkd with Laravel:
http://fideloper.com/ubuntu-beanstalkd-and-laravel4
Here are details on installing Supervisor:
http://supervisord.org/installing.html
I personally use a Redis instance and run my queue with Supervisor from there.
I find it a bit more memory-efficient than Beanstalkd, but to each their own.
Supervisor will execute artisan's queue:listen command, which runs the jobs; if you configure multiple Supervisor processes, you can run several workers in parallel.
Depending on what you are doing, I would also look into Python and multithreading, as I have used that for a few things I previously used a queue for, and it has given even better results.
Example config file for Supervisor:
[program:myqueue]
command=php artisan queue:listen --env=your_environment
directory=/path/to/laravel
stdout_logfile=/path/to/laravel/app/storage/logs/myqueue_supervisord.log
redirect_stderr=true
autostart=true
autorestart=true
You can also make use of Laravel's task scheduler, i.e. add the php artisan queue:listen command to the scheduler and set its frequency to whatever you want.
That way, the queue listen process is started automatically.
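As a sketch of that idea (using queue:work --once, which processes one job and exits, rather than the long-running queue:listen, since the scheduler re-invokes the command each minute; whether this fits depends on your Laravel version, and Supervisor remains the more robust option), the scheduler entry in app/Console/Kernel.php could look like:

```php
// app/Console/Kernel.php — start a short-lived worker every minute.
// queue:work --once processes a single job and exits, so the scheduler
// keeps the queue drained without a long-running daemon.
protected function schedule(Schedule $schedule)
{
    $schedule->command('queue:work --once')
             ->everyMinute()
             ->withoutOverlapping();
}
```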
Hope it makes sense.