Should php artisan queue:work output all processed jobs? - php

I'm using the Redis queue driver in Laravel 6.14. Today I realized that when I run php artisan queue:work, it only outputs some of the processed jobs, seemingly at random. Has anybody run into the same problem? I thought it was supposed to output all jobs (both correctly processed and failed). Thanks for any ideas and help.

Related

Laravel Run 100 Tasks in a single queue simultaneously

There is a case where I need to run 100 jobs simultaneously. I did my R&D and came to the conclusion that I need to create multiple queues to run multiple jobs at a time. But the number of jobs I need to execute at a given time is dynamic.
Solution #1 (Create multiple job queues)
php artisan queue:work --queue=myJobQueue & php artisan queue:work --queue=myJobQueue1 & php artisan queue:work --queue=myJobQueue2
Can anyone please suggest the best approach to do this task?
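Not an authoritative answer, but a common alternative to creating a separate queue per job is to run several workers against the same queue; each worker pops jobs independently, so they run concurrently without hard-coding new queue names. A rough sketch, with the queue name and worker count purely illustrative:

# Spawn several workers on the same queue; each one processes jobs concurrently.
# Scale the worker count to the load instead of adding more queue names.
for i in $(seq 1 10); do
    php artisan queue:work redis --queue=myJobQueue --sleep=3 --tries=3 &
done
wait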

Laravel how to stop/restart workers of a specific queue?

I know you can stop all workers in Laravel using queue:restart. But I'm looking for a way to stop all workers working on a specific queue. Something like this:
php artisan queue:restart --queue=my_queue
As far as I can tell from the documentation,
php artisan queue:clear redis --queue=emails
is only available for the SQS, Redis, and database queue drivers.
But that command clears the queue; it doesn't stop the workers.
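There doesn't appear to be a built-in artisan command for this, but a rough workaround sketch (queue name illustrative) is to signal only the worker processes whose command line mentions that queue:

# Send SIGTERM only to workers that were started with --queue=my_queue.
# A worker receiving SIGTERM should finish its current job and then exit.
pkill -f "queue:work.*--queue=my_queue"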

How to run queue:work command in laravel task scheduler?

I have been searching a lot for a solution until I just gave up...
I want to run php artisan queue:work --stop-when-empty command every minute in Laravel task scheduler.
I have tried this
$schedule->command('queue:work --stop-when-empty')->everyMinute()->runInBackground();
but that doesn't seem to work at all...
You are not supposed to run the queue in the scheduler.
The queue worker should always be up and running (under a process manager like Supervisor) and pick up jobs as they are dispatched (whether they are dispatched from a scheduled task or somewhere else doesn't matter).
Here is the documentation on this topic: https://laravel.com/docs/8.x/queues#supervisor-configuration
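For reference, a minimal Supervisor program along the lines of the linked documentation might look like this (the paths, user, and process count here are illustrative assumptions):

[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/your-app/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
numprocs=2
user=www-data
redirect_stderr=true
stdout_logfile=/var/www/your-app/storage/logs/worker.log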

Running an artisan command forever with laravel forge?

Can someone possibly advise how I can keep my custom artisan command running forever with the daemon?
I have seen the many tutorials on queues, but they don't exactly fit my case. I am trying to accomplish a "subscribe" with PubNub's PHP library, and this seems like the best way, unless I missed something?
Thanks in advance!
If you run the artisan command from the command line, it can already run indefinitely/forever. You don't need to do anything.
I have an application that has been running one single artisan command for 97 days straight at the moment.
You then need to make sure it hasn't crashed for some reason, with something like Supervisor or a web monitoring service like Eyewitness.io.
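To illustrate why it can run forever: the command's handle() method simply never returns. A minimal sketch of such a long-running command, assuming a standard Laravel console command (the class name, signature, and loop body are hypothetical):

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class SubscribeFeed extends Command
{
    // Illustrative signature/description; register the class in the console kernel as usual.
    protected $signature = 'feed:subscribe';
    protected $description = 'Keep listening for incoming messages until the process is stopped';

    public function handle()
    {
        // Loops until the process is killed (or restarted by something like Supervisor).
        while (true) {
            // ... subscribe / poll work goes here ...
            sleep(1); // avoid a busy loop
        }
    }
}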
This will help you run an artisan command forever:
nohup php artisan yourcommand:abc > mylog.log 2>&1 & echo $! >> save_pid.txt
When you want to kill this process, get the PID from the save_pid.txt file and kill it:
kill $(cat save_pid.txt)

(Laravel 3.2) Artisan task in Cron Job not running?

I am trying to have this Artisan task run as a cron job (host is Bluehost):
php-cli /home3/***/***/artisan task
This works from the command line (over SSH), but not from cron.
I know it isn't executing, because it's supposed to add a DB entry and nothing shows up.
What is wrong there?
EDIT: As there are no errors reported anywhere (either to email or to an output.log file),
I would guess the command executes but fails to do anything.
Could that be because of a database connection issue in the artisan task?
There is a simple DB::table('...')->insert(...) call in there.
But if the task works from the command line, why not from cron?
I ended up making a PHP page in a subdirectory (something like /tasks_to_execute/task.php) with the PHP script on it and "echo 'task successful!';" at the end, and the cron job on Bluehost just uses "lynx" to hit it.
So no, I didn't figure out why it didn't work as an artisan task; I just found a way around it as a quick fix. The security issue of having the script as a "public" page doesn't really matter, as the script executes every half hour anyway and just updates a database from a Facebook feed.
I faced a similar problem. Try using the full path to the PHP binary, like this: /usr/local/bin/php /home/mysitename/laravel/artisan schedule:run
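For completeness, a cron entry with absolute paths and captured output (every half hour, matching the schedule mentioned above; the paths here are hypothetical placeholders for the real ones) might look like:

# Run the artisan task every 30 minutes and log output so failures become visible.
*/30 * * * * /usr/local/bin/php /home3/username/public_html/artisan task >> /home3/username/cron.log 2>&1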
