Laravel queue not honouring job specific timeout

According to the Laravel docs I should be able to specify a job specific timeout:
If the timeout is specified on the job, it will take precedence over any timeout specified on the command line [...]
So, when I run artisan queue:listen without the --timeout option and I define the timeout inside the job (like Laravel tells me to):
public $timeout = 600;
I expect the timeout of that specific job to be 600 seconds. Unfortunately, I still get a ProcessTimedOutException. A custom timeout only works when I run the queue with --timeout=600.
I'm using Laravel 6 with PHP 7.4. As recommended by Laravel I've also enabled the pcntl PHP extension. For the queue I use the database driver with the following config:
'database' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => 'default',
    'retry_after' => 90,
],

I opened a bug report because I couldn't get this to work. However, it seems the timeout specified inside the job class only takes precedence over the timeout specified on the command line when running the queue with queue:work.
I've tested this and can confirm that it works with queue:work. According to a commenter on my bug report it doesn't work with queue:listen because:
queue:listen runs several processes while queue:work is a single process. queue:listen sets a timeout for the process it runs so we don't leave ghost processes running on the machine in case the master process was killed for some reason.
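For reference, a minimal sketch of such a job (ExampleLongJob is a hypothetical name) and the behaviour you can expect under each command:

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class ExampleLongJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    // Honoured by queue:work, even with a smaller --timeout on the command line.
    // Ignored by queue:listen, which applies its own process-level timeout.
    public $timeout = 600;

    public function handle()
    {
        // long-running work here
    }
}

Run this under php artisan queue:work (with or without --timeout) and the job gets its 600 seconds; under queue:listen it does not.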

Related

Laravel 5.5 Job with delay fires instantly instead of waiting

In my application I am dispatching a job on the work queue with a delay. But it runs instantly instead of waiting for the delay. In my config and .env I am using the database driver.
No job has been inserted into my database jobs table so far.
My config:
'default' => env('QUEUE_DRIVER', 'database')
My controller code:
Log::info('Request Status Check with Queues Begins', ['method' => __METHOD__]); // the log context must be an array
MyGetInfo::dispatch($this->name, $this->password, $this->id, $trr->id)->onQueue('work')->delay(12);
return json_encode($data);
The value of QUEUE_DRIVER must be set to database in the .env file.
Make sure to run this afterwards:
php artisan config:clear
Also run:
php artisan queue:listen
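For what it's worth, a minimal sketch of the delayed dispatch once the database driver is active (names taken from the question; note the sync driver executes jobs immediately and ignores delays entirely, which is another reason QUEUE_DRIVER must be database):

use Carbon\Carbon;

MyGetInfo::dispatch($this->name, $this->password, $this->id, $trr->id)
    ->onQueue('work')
    ->delay(Carbon::now()->addSeconds(12)); // a plain integer number of seconds works too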

Laravel log max file is not working

I am using Laravel 5.2 and I just deployed my code to the server. I use Laravel's daily logs, so in config/app.php I added these two lines:
'log' => 'daily',
'log_max_files' => 15,
But it is not keeping 15 days of logs. It always keeps only the last 5 days, which is the default. Am I missing something?
First of all you should run
php artisan config:cache
to cache your configuration.
In addition you should run
php artisan queue:restart
to make sure your queue workers will see the changes.
Additionally, you should make sure the log files have valid permissions so that old ones can be deleted.
You have to run this command on your terminal
php artisan config:cache
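To confirm the cached values actually took effect, a quick sanity check with tinker (config() reads the cached configuration):

php artisan tinker
>>> config('app.log')           // expect "daily"
>>> config('app.log_max_files') // expect 15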

Laravel Queued Jobs Running for Many Minutes Longer Than Timeout

I have a Laravel queued job which extracts links from a webpage. The timeout for the Queue listener configured through Laravel Forge is 240 seconds (4 minutes). However, jobs are taking up to 45 minutes to run.
My queue settings are:
'redis' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => 'default',
    'retry_after' => 350,
],
And there are multiple job processes running - up to 35 processes. As you can imagine, this is eating up a lot of server memory. The processes just seem to be hanging around. The command for these processes, as shown in top, is:
php7.1 artisan queue:work redis --once --queue=linkqueue --delay=0 --memory=128 --sleep=10 --tries=1 --env=local
How can a job run for 45 minutes if the timeout is 240 seconds? Why are there so many processes - shouldn't there just be one?
Also, any ideas why a script for extracting links should take 45 minutes to run?!
The script does work, that is, in most cases it runs as expected - it just takes ages. There are no errors reported/logged as far as I can see.
Code in the job is:
// Suppressing libxml warnings from malformed real-world HTML is an assumption,
// but loadHTML() commonly floods the log without it.
libxml_use_internal_errors(true);
$dom = new DOMDocument;
$dom->loadHTML($html);

// Save every href found on the page
$links = $dom->getElementsByTagName('a');
foreach ($links as $a) {
    $link = $a->getAttribute('href');
    $newurl = new URL;
    $newurl->url = $link;
    $newurl->save();
}
Update: Another simple job runs just fine, in under a second. It is specifically the link job above that takes tens of minutes. Could it be a RAM issue or something? Is there anything else I can do to diagnose the problem? When run as part of a console command, the extract-links function itself completes in 1 or 2 seconds. It is only on the queue that it freaks out.
How can a job run for 45 minutes if the timeout is 240 seconds?
Because you have 'retry_after' => 350 on your queue connection. This means that if Laravel does not hear from the job after 350 seconds, it assumes the job has failed and retries it. That is what is producing multiple processes for the same job in your situation.
If you are happy to allow your jobs to run for up to 45 minutes, then you should set retry_after to a larger number, say 3600, which is 1 hour.
That way a job will only be retried if it actually takes longer than 1 hour to run.
You can also do the following to make timeouts unlimited.
php artisan queue:listen --timeout=0
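For illustration, the adjusted connection config might look like this (the 3600 figure is the suggestion above, not a universal default):

'redis' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => 'default',
    // retry_after should always be longer than the worker's --timeout,
    // otherwise Laravel may hand the same job to a second process while
    // the first one is still running.
    'retry_after' => 3600,
],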

Setup Remote beanstalkd Laravel 4.2

My stack set-up consists of the following:
www.main.com - Main Server (Main Application code & supervisord)
www.queue-server.com - Beanstalkd installed here (No code here only beanstalkd)
I'm using Laravel 4.2.
I have setup Supervisord on www.main.com and added the following queue listener:
php artisan queue:work --queue=test --env=test
My app/config/queue.php file settings are as below:
'beanstalkd' => array(
    'driver' => 'beanstalkd',
    'host'   => 'www.queue-server.com',
    'queue'  => 'test',
    'ttr'    => 60,
),
From my understanding, it should push & process jobs on the www.queue-server.com server, but that server shows no CPU spikes, while www.main.com shows high CPU usage.
So my questions are:
Is my setup correct? Or I have to change something?
I want to process my job on www.queue-server.com server. How can I achieve that?
The beanstalkd server is just the storage for the queue data itself; it does no processing. It's the php artisan queue:work command that processes the queue. This is why you are seeing the higher load on your www.main.com server: although your queue is stored on the other server, the main server is the one actually processing the queue.
If you wish for the www.queue-server.com server to process the queue you need to install your application there as well and run the artisan command from there.
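If it helps, a minimal supervisord program block for www.queue-server.com, assuming the application is deployed there under /var/www/app (a hypothetical path):

[program:queue-test-worker]
command=php /var/www/app/artisan queue:work --queue=test --env=test
autostart=true
autorestart=true
user=www-data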

Queue work on "sync" driver, but not on Beanstalkd

I have a quite simple job that runs on the Laravel 4 framework. When the queue driver is set to "sync", it works fine. But when I set it to 'beanstalkd', it simply DOESN'T RUN! I have already run the artisan commands php artisan queue:listen and php artisan queue:work, but neither seems to work.
When I type php artisan queue:work it gives me the following error:
[ErrorException]
Trying to get property of non-object
Here's my beanstalkd connection configuration:
'beanstalkd' => array(
    'driver' => 'beanstalkd',
    'host'   => 'localhost:11300',
    'queue'  => 'default',
),
I've already tried setting the 'host' to '0.0.0.0' and '127.0.0.1'.
Any ideas why it isn't working?
EDIT:
Here's the relevant code of the fire() method.
public static function fire($job, $data)
{
    ini_set('memory_limit', '512M');
    set_time_limit(300);

    $hotel_ids = $data['hotels'];
    self::$client = $data['client'];
    self::$currency = $data['currency'];

    // A list of paths to the generated PDFs
    $paths = array();
    foreach ($hotel_ids as $list) {
        $hotels = Hotel::whereIn('id', $list)->orderBy('name', 'asc')->get();
        $paths[] = self::makePDF($hotels);
    }
    #self::sentPDFs($paths);

    $job->delete();
}
EDIT 2:
The job itself runs on the sync driver, so my suspicion is on beanstalkd. I installed the beanstalkd console, a way of viewing the jobs and the queue graphically. Here's another interesting thing: the job is queued, it enters the 'ready' state, then goes back, and that keeps repeating. It enters the ready state, then (I believe) some sort of error occurs and it gets kicked out. I don't know what the error is, since it doesn't appear with the sync driver.
Another interesting thing: if I remove all code from the fire method and leave only, for example, Log::error('Error');, the same exact thing happens!
Have you installed Pheanstalk? It's required to use beanstalkd with the Laravel queue system.
Check your firewall configuration. I added port 11300 to the firewall tables and it works!
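For reference, both suggested fixes as commands (the Pheanstalk version constraint for Laravel 4 is an assumption; check the suggests section of your framework's composer.json):

composer require pda/pheanstalk
# open the beanstalkd port if a firewall sits in the way (iptables example)
sudo iptables -A INPUT -p tcp --dport 11300 -j ACCEPT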
