Laravel queue listener times out - php

On my Linux server I have the following cron:
* * * * * php /var/www/core/v1/general-api/artisan schedule:run >> /dev/null 2>&1
The CRON works correctly. I have a scheduled command defined in my Kernel.php as such:
protected function schedule(Schedule $schedule)
{
    $schedule->command('pickup:save')
             ->dailyAt('01:00');

    $schedule->command('queue:restart')->hourly();
}
The scheduled task at 1AM runs my custom command php artisan pickup:save. The only thing this command does is dispatch a Job I have defined:
public function handle()
{
    $job = (new SaveDailyPropertyPickup());
    dispatch($job);
}
So this job is dispatched and since I am using the database driver for my Queues, a new row is inserted into the jobs table.
Everything works perfectly up to here.
Since I need a queue listener to process the queue and since this queue listener has to run basically forever, I start the queue listener like this:
nohup php artisan queue:listen --tries=3 &
This writes all the output from nohup to a file called nohup.out in my /home directory.
What happens is this: The first time, queue is processed and the code defined in the handle function of my SaveDailyPropertyPickup job is executed.
AFTER it is executed once, my queue listener just exits. When I check the nohup.out log, I can see the following error:
In Process.php line 1335:
The process "'/usr/bin/php7.1' 'artisan' queue:work '' --once --queue='default'
--delay=0 --memory=128 --sleep=3 --tries=3" exceeded the timeout of 60 seconds.
I checked this answer and it says to specify the timeout as 0 when starting the queue listener, but there are also answers recommending against that approach. I haven't tried it, so I don't know if it will work in my situation.
Any recommendations for my current situation?
The Laravel version is 5.4.
Thanks

Call it with the --timeout parameter; figure out how long your job takes and scale from there.
nohup php artisan queue:listen --tries=3 --timeout=600
In your config you need to update retry_after; it has to be larger than the timeout, to avoid the same job being run twice at the same time. This assumes you use beanstalkd.
'beanstalkd' => [
    ...
    'retry_after' => 630,
    ...
],
In more professional settings, I often end up setting up one queue for short-running jobs and another for long-running operations, as sketched below.
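As an illustration of that split (a rough sketch only; the queue names "short" and "long" and the second timeout value are made up for this example), you would point long-running jobs at their own queue and run a separate listener for it with a higher timeout:

nohup php artisan queue:listen --queue=short --tries=3 --timeout=60 &
nohup php artisan queue:listen --queue=long --tries=3 --timeout=600 &

When dispatching, a long-running job is then sent to that queue, e.g. dispatch((new SaveDailyPropertyPickup())->onQueue('long'));, assuming the job class uses the Queueable trait.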

Related

Laravel jobs (database) do not execute handle

I have a problem with Laravel jobs.
I configured Laravel jobs to work with the database driver and it is working.
When I execute a job, the entry is created in database and the constructor is well executed.
However, the handle function is never executed ... and the jobs stay in the jobs table.
Has anyone already had this problem?
(I use Laravel 5.7).
I found the problem...
I'm using a different queue name than the default, and in config/queue.php, in the database array, the default queue name is set to "default".
So when I execute php artisan queue:work, it waits for the default queue.
When I execute the command php artisan queue:work --queue QUEUENAME, it works!
Thanks everybody.
You should listen to the default queue:
php artisan queue:work
or
php artisan queue:work --sleep=1 --tries=5 --timeout=60
If you are not using the default queue, then specify the custom queue:
php artisan queue:work --sleep=1 --tries=5 --timeout=60 --queue customQueue
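For reference, the database connection in config/queue.php of a stock install looks roughly like this (a sketch; your table name and retry_after value may differ). The 'queue' key here is the queue name that a plain php artisan queue:work listens to when no --queue option is passed:

'database' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => 'default',
    'retry_after' => 90,
],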

Laravel - Schedule queues returns a 503 error on shared hosting

I have an application in Laravel that sends several emails, and some of these emails have to wait some time before being sent.
So I'm using the database queue driver, and on localhost I run the command php artisan schedule:run, which runs this scheduled command:
$schedule->command('queue:work')->everyMinute();
and works perfectly.
Now I have moved the project to cPanel shared hosting, and to run the schedule command I created a cron job for it:
/usr/local/bin/php /path to project/artisan schedule:run
Since I always need to be checking whether an email has to be sent, I set the cron job to run every minute, and it works for the first 5 or 10 minutes.
Then I start receiving a 503 error from the server because I hit the process limit, probably because of the cron job executions. And right now the server will be down for 24 hours.
How can I solve this? What is the best solution here?
Thank you
I use shared hosting and had a similar issue. If your hosting service allows the PHP shell_exec() function, you could do this:
protected function schedule(Schedule $schedule)
{
    if (!strstr(shell_exec('ps xf'), 'php artisan queue:work'))
    {
        $schedule->command('queue:work --timeout=60 --tries=1')->everyMinute();
    }
}
Your cron job seems OK. By the way, if your hosting server goes down for 24 hours, you may want to consider another host, my friend.
queue:work is a long-running process. This check ensures it's running on your server. It listens to your queue and processes the jobs. It also means that if you make changes to your production files, the worker will not pick the changes up. Have a look at my top -ac output:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
2398733 user 20 0 466m 33m 12m S 0.0 0.1 0:03.15 /opt/alt/php72/usr/bin/php artisan queue:work --timeout=60 --tries=1
2397359 user 20 0 464m 33m 12m S 0.0 0.1 0:03.04 /usr/local/bin/php /home/user/booklet/artisan schedule:run
2398732 user 20 0 105m 1308 1136 S 0.0 0.0 0:00.00 sh -c '/opt/alt/php72/usr/bin/php' 'artisan' queue:work --timeout=60 --tries=1 >> '/home/user/booklet/storage/queue.log' 2>&1
As you can see, the worker is at the top; another process simply writes everything it does to a log file. You have to kill 2398733 after uploading new changes to your prod server. The process will restart by itself in less than 5 minutes, thanks to the schedule:run cron job.
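In practice that just means sending the default TERM signal to the worker's PID from the top output above, for example:

kill 2398733   # the queue:work process; the next schedule:run will start a fresh worker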
Update October 2019
protected function schedule(Schedule $schedule)
{
    if (!strstr(shell_exec('ps xf'), 'php artisan queue:work'))
    {
        $schedule->command('queue:work --timeout=60 --tries=1')->withoutOverlapping();
    }
}
The ->withoutOverlapping() method pushes the command process to the background. It ensures that the artisan schedule command exits properly.
You can prevent this from happening with withoutOverlapping on the cron task.
By default, scheduled tasks will be run even if the previous instance of the task is still running. To prevent this, you may use the withoutOverlapping method:
$schedule->command('emails:send')->withoutOverlapping();
https://laravel.com/docs/5.7/scheduling#preventing-task-overlaps
This way, your cron will restart the queue:work task if it fails for some reason, but it won't fire up multiple instances of it.
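Putting the two answers together, a minimal Kernel.php sketch would look like this (the worker options are the same illustrative ones used above):

protected function schedule(Schedule $schedule)
{
    // Try to start a worker every minute, but skip if the previous one is still running.
    $schedule->command('queue:work --timeout=60 --tries=1')
             ->everyMinute()
             ->withoutOverlapping();
}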

Multiple Queues in Laravel

I am creating a web application in Laravel in which bidding is done by users in multiple games. Bidding is performed by front-end users and by a cron job as well. The cron job bids on each game every second, so collisions were happening between bids when the same row was accessed at the same time. To resolve the concurrency issue I decided to use Laravel queues for bidding. But I have multiple games, and I want bids for different games to be processed simultaneously. I don't want bids of the same game to be processed at the same time, because then the concurrency issue could occur again. I want to know about the multiple-queue system in Laravel. After searching the internet I learned about multiple queues, like
php artisan queue:work --queue=myJobQueue, myJobQueue1, myJobQueue2,..myJobQueue7
But I am not sure how it works. Can someone please explain in detail whether all 7 queues work simultaneously or one by one?
php artisan queue:work --queue=myJobQueue, myJobQueue1, myJobQueue2,..myJobQueue7 sets the priority in which the queues will be processed. So with this, all jobs on myJobQueue will be executed before moving on to the jobs on myJobQueue1, then myJobQueue2, and so on in that order.
However, if you want jobs on these queues to be executed simultaneously, you could run a worker for each queue in the background:
php artisan queue:work --queue=myJobQueue & php artisan queue:work --queue=myJobQueue1 & php artisan queue:work --queue=myJobQueue2 &
This will run each queue as a single process in the background.
Are you looking for the queue:listen command?
queue:work will process all pending jobs that are stored by the queue driver, whereas queue:listen will be waiting for jobs to be thrown at it to execute them as they come.
If you run php artisan queue:listen --queue=myJobQueue, myJobQueue1, myJobQueue2,..myJobQueue7, 7 queues are created, each listening for new tasks on its own.
In your code, you can dispatch jobs like the following:
dispatch((new MyJob)->onQueue('myJobQueue'));
You might want to use a tool like Supervisor to make sure queue:listen is always running in the background.
Hope this helps!
Like Ben V said, it is highly recommended to use Supervisor to keep the workers active at all times, especially if you want to run one or more workers per queue, or if you want the queues to be processed simultaneously.
Here is an example Supervisor configuration file:
[program:laravel-worker-myJobQueue]
process_name=%(program_name)s_%(process_num)s
command=php artisan queue:work --queue=myJobQueue
numprocs=8
autostart=true
autorestart=true
[program:laravel-worker-myJobQueue1]
process_name=%(program_name)s_%(process_num)s
command=php artisan queue:work --queue=myJobQueue1
numprocs=1
autostart=true
autorestart=true
The above configuration creates 8 workers for myJobQueue and one worker for myJobQueue1. Multiple workers can help speed things up, but they can cause trouble for jobs that try to access the same row in the database, in which case you want to limit that queue to 1 worker only.
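After saving that configuration (typically under /etc/supervisor/conf.d/), Supervisor has to be told to pick it up and the worker groups started. These are the standard supervisorctl commands, though the exact paths and program names depend on your setup:

sudo supervisorctl reread
sudo supervisorctl update
sudo supervisorctl start laravel-worker-myJobQueue:*
sudo supervisorctl start laravel-worker-myJobQueue1:*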
You then simply dispatch jobs to the correct queue using
dispatch((new MyJob)->onQueue('myJobQueue'));
or
dispatch((new MyJob)->onQueue('myJobQueue1'));
This might be old, but just in case: all of the answers are on point, but you must set the .env variable QUEUE_CONNECTION to something other than sync. If your configuration is set to sync, it will take every job in order of entry to the queue (finishing one and then starting the next one); if it's set to database or redis, it will take jobs in parallel when needed (which is the idea of setting priorities). You should check this article (it helped me): https://medium.com/hexavara-tech/optimize-laravel-with-redis-caching-made-easy-bf486bf4c58. You will also need to configure your queues in config/queue.php, for example in the 'connections' array:
'database' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => ['default', 'another_queue'], // this is just 'default' by default
    'retry_after' => 90,
],
This applies to all the other connections in this file.
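For completeness, the connection itself is selected in .env; a minimal example, assuming the database driver from above:

QUEUE_CONNECTION=database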
If you are testing on a local machine, you can create a .bat file inside your project and enter these lines in it:
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
start "" php artisan queue:work --queue=low
Each line starts one worker process; 10 lines means 10 workers will run at once.
Also, I included --queue=low because I have a queue named low.
This is for local machines only; for online production, check out Supervisor.
In addition to the other answers here, you can also dispatch a job on a specified queue in Laravel like so:
MyJob::dispatch()->onQueue('myJobQueue');
Or, within the Job constructor:
public function __construct()
{
    $this->onQueue('myJobQueue');
}
Remember to start your queue from the terminal:
php artisan queue:listen --queue=myJobQueue

Task helpers not running in Laravel, but cron is running every minute

I have a task I'm trying to run every day. In my Kernel.php I have the following command:
$schedule->command('emailProgram')->daily()->timezone('US/Central');
In my crontab I have:
* * * * * php /var/www/html/appname/artisan schedule:run >> /dev/null 2>&1
So, when I run php artisan schedule:run, or run it directly with php artisan emailProgram, it runs as expected. But it's not running on its own using daily()/dailyAt() or otherwise. Lastly, if I remove daily() from the command in the Kernel.php file:
$schedule->command('emailProgram')->timezone('US/Central');
it runs every minute, so it's like there is some disconnect with the Laravel task helpers. This is my first time setting up cron and task management with Laravel, so maybe I'm overlooking something simple. Any help would be really appreciated, thanks.

Laravel Scheduler with Cron Job

I'm using Laravel 5.1 with PHP 5. I'm trying to create a cron job to remove unpaid invoices at the time I want, but for now I'm testing it by writing a user log entry to confirm that the job is working.
This is my app/Console/Kernel.php
protected $commands = [
    \App\Console\Commands\Inspire::class,
    \App\Console\Commands\RemoveUnpaidInvoice::class,
];

protected function schedule(Schedule $schedule)
{
    $schedule->command('removeUnpaidInvoice')->everyMinute();
    // $schedule->command('inspire')->hourly();
}
And this is my RemoveUnpaidInvoice class:
public function handle()
{
    UserLog::create([
        'user_id' => '3',
        'title' => 'Cron Testing',
        'log' => 'Time : ' . date('H:i:s')
    ]);
}
After my files were done, I ran this command in my terminal to run my cron:
php artisan schedule:run
After I ran the schedule artisan command, my terminal showed this message:
Running scheduled command: '/usr/bin/php5' 'artisan' removeUnpaidInvoice > '/dev/null' 2>&1 &
I thought it worked, so I checked my database to see whether the user log had been created or not, and it had: a new user log was added via the cron.
But the problem is, I waited one minute and no user log was added; I waited 2 minutes, 3 minutes and more, and no further user log was added to my database.
How can I fix this? Did I make a mistake?
Starting the scheduler
Here is the only Cron entry you need to add to your server:
* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1
This Cron will call the Laravel command scheduler every minute. Then, Laravel evaluates your scheduled tasks and runs the tasks that are due.
You need to set up the cron entry; don't just run php artisan schedule:run manually in the console.
