I have the following servers configured:
App Server is running LAMP with Laravel 5.1
Queue Server is running Beanstalkd and Supervisord
I want to be able to send jobs from Laravel on the App Server to the Queue Server, which will simply run a DB insert. I was hoping to use Laravel's job queueing for this, but from my understanding it would require me to have the same Laravel project on both hosts? It seems like it sends the class to execute, not just the data. Perhaps I have a bad understanding of how Beanstalkd works? Should I be using something like RabbitMQ instead?
Any help would be greatly appreciated!
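For context, Laravel's queue payload is a serialized reference to the job class plus its data, so the worker host does need the class definition available. A minimal sketch of the alternative, pushing a plain JSON payload straight to Beanstalkd with the pda/pheanstalk library (v4 API assumed; the host, tube name, and fields are illustrative):

    // On the App Server: push raw data, no shared job class required
    use Pheanstalk\Pheanstalk;

    $pheanstalk = Pheanstalk::create('queue-server-host');
    $pheanstalk->useTube('db-inserts');
    $pheanstalk->put(json_encode(['table' => 'events', 'row' => $data]));

    // On the Queue Server: a plain PHP worker that only understands the JSON
    $pheanstalk->watch('db-inserts');
    while ($job = $pheanstalk->reserve()) {
        $payload = json_decode($job->getData(), true);
        // ... run the DB insert with $payload ...
        $pheanstalk->delete($job);
    }

With that approach the Queue Server only needs PHP, Pheanstalk, and a DB connection, not the whole Laravel project.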
I am working on a project that runs Laravel queues. We need to expose several metrics to the Prometheus server. Given that Prometheus uses a pull model, is there a way for the Laravel queue worker to also run a web server while processing jobs?
I know there is the Prometheus push gateway, but I'm looking for other alternatives.
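One alternative sketch: rather than embedding a web server in the worker process, expose the metrics from the Laravel app that already sits behind a web server and let Prometheus scrape that. The route, the metric name, and the use of queues:default (the Redis list Laravel's Redis driver uses for the default queue) are assumptions here:

    // routes/web.php
    use Illuminate\Support\Facades\Redis;
    use Illuminate\Support\Facades\Route;

    Route::get('/metrics', function () {
        // Depth of the default Redis queue; adjust the key to your queue names
        $depth = Redis::connection()->llen('queues:default');

        $body = "# TYPE laravel_queue_depth gauge\n"
              . "laravel_queue_depth {$depth}\n";

        return response($body, 200, ['Content-Type' => 'text/plain; version=0.0.4']);
    });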
At the moment I have a shared hosting server.
In my app I want to use Laravel's queue system, but I can't keep the command php artisan queue:work running because I can't install Supervisor.
With a little effort I could move my app to a VPS, but I don't have much experience with servers and I'm a little scared that my app will be offline for a long time.
Considering my lack of experience on the server side, I have these questions:
Is it OK to use Laravel queues with cron jobs? Can it break in any way?
For this problem alone, should I upgrade to a VPS, or should I stay on the shared hosting server (I have SSH access here)?
Quick answer: you should not use Laravel queues without a process monitor such as Supervisor.
It all depends on what you want to achieve, but an alternative to queues would be using the Laravel scheduler: you can trigger the scheduler with a cron task (every minute, for example) and dispatch jobs easily.
And if you really want to use the queues, a solution could be to add your jobs to the queue and process them with a cron task that runs php artisan queue:work every minute. But I would recommend the previous solution.
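A minimal sketch of that cron-driven setup, assuming a Laravel version recent enough to support --stop-when-empty (the job class and paths are illustrative). The single cron entry is:

    * * * * * cd /path/to/app && php artisan schedule:run >> /dev/null 2>&1

And in app/Console/Kernel.php:

    protected function schedule(Schedule $schedule)
    {
        // Option 1: dispatch jobs straight from the scheduler, no worker at all
        $schedule->job(new SendReminderEmails)->everyMinute();

        // Option 2: drain the queue once a minute instead of a long-running daemon
        $schedule->command('queue:work --stop-when-empty')
                 ->everyMinute()
                 ->withoutOverlapping();
    }

The --stop-when-empty flag makes the worker exit once the queue is drained, so each cron run is short-lived rather than a daemon the host might kill.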
I am using Laravel Forge with Redis as the queue driver.
I have updated my application's push notification code a few times over, but the notifications that go out still match the old code.
Changing the queue driver to database sends the notifications as per the latest updates. However, when I switch back to Redis, it still sends the old version of the notification.
I have run FLUSHALL via redis-cli, but it didn't fix it.
Also, I use Laravel Horizon to manage queues.
How can I fix this? Thanks in advance.
Edit: Another thing I noticed was that all code-driven dispatches were queued on Redis. I have listed the solution in the answer below in the hopes it will help someone else.
What I received from Forge support:
Hello,
There might be a worker that's stuck. You can try to run php artisan horizon:purge, which should kill all rogue worker processes, and then restart the daemon. It's advised to run the purge command in your deployment script to make sure all stale processes are killed.
-- Mohamed Said, forge@laravel.com
However, this is how I got it sorted:
php artisan horizon:terminate
php artisan queue:restart
And then the code was working properly.
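The underlying cause is that queue workers are long-lived processes that keep the old code in memory until they are restarted, which is why the notifications kept going out in the old format. For anyone scripting this, a hedged sketch of the relevant deploy step (paths and branch are illustrative):

    cd /home/forge/example.com
    git pull origin master
    composer install --no-dev --optimize-autoloader
    # horizon:terminate lets workers finish their current job and exit;
    # Supervisor then restarts them on the new code
    php artisan horizon:terminate
    php artisan queue:restart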
I had a similar problem, and in my case it was just a matter of restarting all the services: stop Redis and the Horizon workers, then start Redis and then the Horizon workers again. But before all of that, clear the cache.
Currently I have the Ratchet WebSocket chat integrated into my Laravel app locally. Normally I execute php artisan serve to use Laravel's built-in server.
After integrating Ratchet, I also need to manually run another PHP class that starts the Ratchet server in order to get real-time chat. So I wonder: is it possible to run both servers with the artisan serve command, so that I won't need to start the Ratchet server class by hand?
I already tried creating a chat service provider class to execute the Ratchet server class at runtime, but when I run artisan serve, it only starts the Ratchet server. It never continues on to start the Laravel server; it is stuck at that point...
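php artisan serve runs PHP's built-in web server in the foreground, and Ratchet's event loop also blocks, so a single process can't do both; starting Ratchet from a service provider blocks the boot, which matches what you're seeing. A common approach is to wrap the Ratchet server in its own Artisan command and run the two processes side by side. A sketch, where the Chat class (your MessageComponentInterface implementation) and the command name are assumptions:

    // app/Console/Commands/WebSocketServe.php (hypothetical name)
    // Register it in the $commands array of app/Console/Kernel.php (Laravel 5.x).
    namespace App\Console\Commands;

    use App\Chat; // your Ratchet MessageComponentInterface implementation (assumed)
    use Illuminate\Console\Command;
    use Ratchet\Http\HttpServer;
    use Ratchet\Server\IoServer;
    use Ratchet\WebSocket\WsServer;

    class WebSocketServe extends Command
    {
        protected $signature = 'websocket:serve {--port=8080}';
        protected $description = 'Start the Ratchet chat server';

        public function handle()
        {
            $server = IoServer::factory(
                new HttpServer(new WsServer(new Chat())),
                (int) $this->option('port')
            );

            $this->info('Ratchet listening on port '.$this->option('port'));
            $server->run(); // blocks until the process is stopped
        }
    }

Then php artisan serve in one terminal and php artisan websocket:serve in another give you both servers; in production you would put them behind a real web server and Supervisor instead.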
I have a PHP project (Symfony2) that uses RabbitMQ. I use it as a simple message queue to delay some jobs (sending mails, importing data from APIs). The consumers run on the webserver and their code is part of the webserver repo; they are deployed together with the web app.
The questions are:
How do I start the consumers as daemons and make sure they always run?
When deploying the app, how do I shut down consumers "gracefully" so that they stop consuming but finish processing the message they started?
In case it matters, I use Capifony for deployment.
Thank you!
It may be worth looking at something like supervisord, which is written in Python. I've used it before for running workers for Gearmand, a job queue that fulfils a similar role to the way you're using RabbitMQ.
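A hedged sketch of a supervisord program entry for one consumer, assuming the OldSound RabbitMqBundle's rabbitmq:consumer command (queue name, paths, and numbers are illustrative). The -m flag makes the consumer exit after a fixed number of messages so supervisord restarts it on fresh code, and stopwaitsecs addresses the graceful-shutdown question by giving the consumer time to finish its in-flight message after the TERM signal before it gets killed:

    [program:mail_consumer]
    command=php /var/www/app/app/console rabbitmq:consumer -m 250 mail
    process_name=%(program_name)s_%(process_num)02d
    numprocs=2
    autostart=true
    autorestart=true
    stopsignal=TERM
    stopwaitsecs=30

On deploy, a Capifony task can then run supervisorctl restart mail_consumer:* so the workers come back up on the new release.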