The problem
I'm dispatching a job that executes an action which needs a resource to be available in order to run correctly, so if it fails it needs to be retried after some time.
But what actually happens is that if it fails, it is never executed again.
I'm using Supervisor to manage the queue with the database driver, and I have changed nothing in my default queue.php config file.
Using Laravel 5.8.
What I've tried
I've already tried manually setting the number of tries inside the job class:
public $tries = 5;
and the same for the retry delay:
public $retryAfter = 60;
My code
I'm implementing this job based on the default job template generated by make:job; my constructor and handle methods look like this:
public function __construct($event, $data)
{
    $this->event = $event;
    $this->data = $data;
}

public function handle()
{
    Log::info('Job started | ' . $this->event . ' | Attempt: ' . $this->attempts());

    // Executes some logic and throws an Exception if it fails

    Log::info('Job succeeded | ' . $this->event);
}
It never reaches the "Job succeeded" log, and it never logs any attempt other than 1.
Is there some concept I'm missing or is this code wrong somehow?
That was a very silly problem, actually. But in case anyone else ever trips over it, I'll leave the solution here.
In my queue.php the default driver was set to sync, which makes the program run the job's handle method immediately instead of queueing it (as I said, I hadn't changed anything). I simply set it to database and that fixed it.
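For anyone applying the same fix: in a stock Laravel 5.8 install the default connection in config/queue.php reads from the environment, so the usual change is in .env rather than in the config file itself (a sketch, assuming the unmodified config):

// config/queue.php
'default' => env('QUEUE_CONNECTION', 'sync'),

# .env
QUEUE_CONNECTION=database

The database driver also needs a jobs table, which can be created with:

php artisan queue:table
php artisan migrate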
Related
I would like to modify a session variable when my queue job has finished. I found in the Laravel documentation that Queue::after was created for exactly this, but I cannot figure out how to use it.
I start the job from a controller: VideoController.php
$job = (new VideoConvertJob($newFileName))->delay(Carbon::now()->addSeconds(5));
dispatch($job);
There are some code in the job (VideoConvertJob.php) handle method:
public function handle() { ... }
But I do not know where and how I should implement the Queue::after method to know that the job has finished successfully and update my session.
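For reference, Queue::after is registered once, typically in a service provider's boot method, and the callback runs inside the queue worker after each job completes. A minimal sketch (note that the worker process has no access to the browser's session, so to surface the result you would typically write to something the next request can read, such as the cache or database):

use Illuminate\Support\Facades\Queue;
use Illuminate\Queue\Events\JobProcessed;

public function boot()
{
    Queue::after(function (JobProcessed $event) {
        // Runs in the worker after every processed job; inspect
        // $event->job->resolveName() or the payload to pick out
        // the VideoConvertJob and record its completion somewhere.
    });
}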
I need to modify the mail SMTP parameters such as MAIL_HOST and MAIL_USERNAME dynamically.
For this, I am using Config::set() to set these values at runtime.
// This code works
Config::set('mail.host', 'smtp.gmail.com');
Mail::to('user@example.com')->send(new myMailable());
The above code works if I do not queue the mail.
The moment I queue it, it appears that Config::set() fails to set the values.
A test to confirm that Config::set() does not work with queued jobs:
I created a simple job and put the code below in its handle method.
public function handle()
{
    // set the config
    Config::set('mail.host', 'smtp.gmail.com');

    // confirm the config has been set correctly
    logger('Setting host to = [' . config('mail.host') . ']');
}
The above code creates the below log entry.
Setting host to = []
Why can I not change the Config on-the-fly for queued jobs? And how to solve this?
This is because the queue worker doesn't run inside the current request. It is a stand-alone process, so config values set during the request don't carry over to it.
To make this work, you need to use a Job. The dispatch function takes your data and passes it through to the job itself. From your controller, call:
JobName::dispatch($user, $settings);
In the job you set the variables accordingly:
public function __construct($user, $settings)
{
    $this->user = $user;
    $this->settings = $settings;
}
Then in the handle method:
\Notification::sendNow($this->user, new Notification($this->settings));
You can use a normal notification for this.
Do not forget to add implements ShouldQueue to your job!
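Applied to the original mail question, a minimal sketch of this approach could look like the following (SendDynamicMail and the settings keys are hypothetical names of mine; myMailable is the mailable from the question):

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\Mail;

class SendDynamicMail implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $user;
    protected $settings;

    public function __construct($user, array $settings)
    {
        $this->user = $user;
        $this->settings = $settings;
    }

    public function handle()
    {
        // These Config::set() calls now run inside the worker process,
        // so they are visible to the mailer this same process uses.
        Config::set('mail.host', $this->settings['host']);
        Config::set('mail.username', $this->settings['username']);

        Mail::to($this->user->email)->send(new myMailable());
    }
}

One caveat: in a long-running worker the mailer singleton may already have been resolved by an earlier job, in which case setting the config alone may not take effect and you may need to rebuild it first (for example with app()->forgetInstance('mailer')).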
When I run php artisan queue:listen, the same job is processed over and over, even though my jobs table only has one job.
Here is my code:
public function handle(Xero $xero)
{
    $this->getAndCreateXeroSnapshotID();
    $this->importInvoices($xero);
    $this->importBankTransaction($xero);
    $this->importBankStatement($xero);
    $this->importBalanceSheet($xero);
    $this->importProfitAndLoss($xero);
}
In order for a job to leave the queue, it must reach the end of the handle method without errors or exceptions.
There must be something breaking inside one or more of your functions.
If an exception is thrown while the job is being processed, the job will automatically be released back onto the queue so it may be attempted again. https://laravel.com/docs/5.8/queues
The same behavior can be achieved with
$this->release()
If you can't figure out what is breaking, you can set your job to run only once. If an error is thrown, the job will be considered failed and will be put in the failed jobs table.
The maximum number of attempts is defined by the --tries switch used
on the queue:work Artisan command. https://laravel.com/docs/5.8/queues
php artisan queue:work --tries=1
If you are using the database queue (awesome for debugging), run these commands to create the failed jobs table:
php artisan queue:failed-table
php artisan migrate
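Once that table exists, failed jobs can be listed with php artisan queue:failed and pushed back onto the queue with php artisan queue:retry {id} (or php artisan queue:retry all).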
Finally, to find out what is wrong with your code, you can catch and log the error:
public function handle(Xero $xero)
{
    try {
        $this->getAndCreateXeroSnapshotID();
        $this->importInvoices($xero);
        $this->importBankTransaction($xero);
        $this->importBankStatement($xero);
        $this->importBalanceSheet($xero);
        $this->importProfitAndLoss($xero);
    } catch (\Exception $e) {
        Log::error($e->getMessage());
    }
}
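One caveat with this pattern: because the exception is caught, the job still reaches the end of handle and is considered successful, so it will not be retried. The try/catch here is purely for diagnosing what is breaking.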
You could also set your error log channel to Slack, Bugsnag, or whatever; just be sure to check it. Please don't be offended, it's normal to screw up when dealing with Laravel queues. How do you think I got here?
Laravel tries to run the job again and again.
php artisan queue:work --tries=3
The command above will only try to run the jobs 3 times.
Hope this helps.
In my case the problem was the payload property: I had declared it private, but it needs to be protected.
class EventJob implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    // payload
    protected $command;

    // Maximum tries of this job
    public $tries = 5;

    public function __construct(CommandInterface $command)
    {
        $this->command = $command;
    }

    public function handle()
    {
        $event = I_Event::create([
            'event_type_id' => $this->command->getEventTypeId(),
            'sender_url'    => $this->command->getSenderUrl(),
            'sender_ip'     => $this->command->getSenderIp()
        ]);

        return $event;
    }
}
The solution that worked for me was to delete the job once it has been processed.
Consider this example:
class SomeController extends Controller
{
    public function uploadProductCsv()
    {
        // process the file here and push the data onto the queue
        Queue::push('SomeController@processFile', $someDataArray);
    }

    public function processFile($job, $data)
    {
        // logic to process the data
        $job->delete(); // after completing the processing, delete the job
    }
}

Note: this is implemented for Laravel 4.2.
I have a PHP function which gets called when someone visits POST www.example.com/webhook. However, the external service, which I cannot control, sometimes calls this URL twice in rapid succession, messing with my logic since the webhook persists things in the database, which takes a few ms to complete.
In other words, when the second request comes in (and it cannot be ignored), the first request is likely not completed yet; however, I need the requests to be completed in the order they came in.
So I've created a little hack in Laravel which should "throttle" the execution with 5 seconds in between. It seems to work most of the time, but due to an error in my code or some other oversight, this solution does not work every time.
function myWebhook() {
    // Compare the cached timestamp (defaults to 0) with the current time.
    while (Cache::get('g2a_webhook_timestamp', 0) + 5 > Carbon::now()->timestamp) {
        // Postpone execution.
        sleep(1);
    }

    // Store the current timestamp in the cache (file storage).
    Cache::put('g2a_webhook_timestamp', Carbon::now()->timestamp, 1);

    // Execute rest of code ...
}
Has anyone perhaps got a watertight solution for this issue?
You have essentially designed your own simplified queue system, which is the right approach, but you can make use of the native Laravel queue to get a more robust solution to your problem.
Define a job, e.g.: ProcessWebhook
When a POST request is received to /webhook queue the job
The Laravel queue worker will process one job at a time[1] in the order they're received, ensuring that no matter how many requests are received, they'll be processed one by one and in order.
The implementation of this would look something like this:
Create a new Job, e.g.: php artisan make:job ProcessWebhook
Move your webhook processing code into the handle method of the job, e.g.:
public function __construct($data)
{
    $this->data = $data;
}

public function handle()
{
    Model::where('x', 'y')->update([
        'field' => $this->data->newValue
    ]);
}
Modify your webhook controller to dispatch a new job when a POST request is received, e.g.:
public function webhook(Request $request)
{
    $data = $request->getContent();
    ProcessWebhook::dispatch($data);
}
Start your queue worker, php artisan queue:work, which will run in the background processing jobs in the order they arrive, one at a time.
That's it, a maintainable solution to processing webhooks in order, one-by-one. You can read the Queue documentation to find out more about the functionality available, including retrying failed jobs which can be very useful.
[1] Laravel will process one job at a time per worker. You can add more workers to improve queue throughput for other use cases but in this situation you'd just want to use one worker.
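For completeness, a rough sketch of the assembled job class under the same assumptions (the Model query and newValue field are placeholders carried over from the fragments above, and the raw body is assumed to be JSON, hence the json_decode):

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessWebhook implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $data;

    public function __construct($data)
    {
        $this->data = $data;
    }

    public function handle()
    {
        // Decode the raw request body captured in the controller.
        $payload = json_decode($this->data);

        Model::where('x', 'y')->update([
            'field' => $payload->newValue,
        ]);
    }
}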
I want to do something like this in my fire method:
class MyClass {
    public function fire($job) {
        if (something) {
            $job->fail();
        } else {
            // processing
        }
        $job->delete();
    }
}
There is no such method as fail(); is it possible to do something like this?
There is no such thing as failing a job manually, but here is what you can do:
Release it back onto the queue with
$job->release();
After a defined number of attempts it will end up in the failed jobs table.
Throw an exception. The job will be released back onto the queue on its own.
If you're using beanstalkd as a queue driver, you can bury a job:
$job->bury();
If your condition is unrecoverable, you can log this fact and simply delete the job.
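Putting those options together, a sketch of how the fire method might look (using the Laravel 4-style ($job, $data) handler signature; the two condition variables and the attempt limit are placeholders):

public function fire($job, $data)
{
    if ($unrecoverable) {
        // Nothing a retry can fix: log the reason and drop the job.
        Log::error('Job abandoned', ['data' => $data]);
        $job->delete();
        return;
    }

    if ($transientFailure) {
        if ($job->attempts() > 3) {
            // Give up after a few tries so it doesn't loop forever.
            $job->delete();
            return;
        }
        // Put it back on the queue and retry after 10 seconds.
        $job->release(10);
        return;
    }

    // ... processing ...
    $job->delete();
}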