I would like to modify a session variable when my queue job has finished. I found in the Laravel documentation that Queue::after is intended for this, but I cannot figure out how to use it.
I start the job from a controller: VideoController.php
$job = (new VideoConvertJob($newFileName))->delay(Carbon::now()->addSeconds(5));
dispatch($job);
There is some code in the job's (VideoConvertJob.php) handle method:
public function handle() { ... }
But I do not know where and how I should implement the Queue::after method to know that the job has finished successfully and update my session.
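For reference, Queue::after is typically registered in the boot() method of a service provider such as AppServiceProvider. Here is a minimal sketch, assuming the job lives in App\Jobs; because the queue worker runs in a separate process with no access to the HTTP session, the sketch writes a flag to the cache (the cache key is only an illustration) for the next request to read:

use Carbon\Carbon;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Queue;
use Illuminate\Queue\Events\JobProcessed;

public function boot()
{
    Queue::after(function (JobProcessed $event) {
        // Only react to the video conversion job
        if ($event->job->resolveName() === \App\Jobs\VideoConvertJob::class) {
            // The worker cannot touch the HTTP session, so store a flag
            // in the cache and read it on the next web request instead
            Cache::put('video-convert-finished', true, Carbon::now()->addMinutes(10));
        }
    });
}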
Related
When I create a job, I need to run a function before it starts, so I updated the DatabaseJob class:
<?php

namespace App\Queue\Jobs;

use App\Models\Tenancy;
use App\Http\DatabaseHelper;
use App\Http\Helper;

class DatabaseJob extends \Illuminate\Queue\Jobs\DatabaseJob
{
    public function fire()
    {
        Helper::createJobLog($this->job->id);

        parent::fire();
    }
}
But it seems the createJobLog function is fired only when the job starts; I need it to run when the job is created, not when it is started.
In a service provider you can listen for the Illuminate\Queue\Events\JobQueued event. Something similar to:
Event::listen(JobQueued::class, function ($event) {
    // Of course, if you need to log only the database jobs then you can check the job type
    if (! $event->job instanceof DatabaseJob) {
        return;
    }

    Helper::createJobLog($event->job->getJobId());
});
You may call the createJobLog() function when the job is dispatched. Jobs can be given a delay if you don't want them to start immediately after being dispatched.
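A minimal sketch of that idea inside the controller, assuming the database queue driver (where later() returns the ID of the newly inserted job record); SomeJob and $payload are placeholders:

use App\Http\Helper;
use Carbon\Carbon;
use Illuminate\Support\Facades\Queue;

// Queue the job with a delay; with the database driver, later() returns the new job's ID
$jobId = Queue::later(Carbon::now()->addSeconds(30), new SomeJob($payload));

// Log the job as soon as it is created, before any worker picks it up
Helper::createJobLog($jobId);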
Below is what's happening when I run php artisan queue:listen, even though my jobs table has only one job. And this is my code:
public function handle(Xero $xero)
{
    $this->getAndCreateXeroSnapshotID();
    $this->importInvoices($xero);
    $this->importBankTransaction($xero);
    $this->importBankStatement($xero);
    $this->importBalanceSheet($xero);
    $this->importProfitAndLoss($xero);
}
In order for a job to leave the queue, it must reach the end of the handle function without any errors or exceptions.
There must be something breaking inside one or more of your functions.
If an exception is thrown while the job is being processed, the job will automatically be released back onto the queue so it may be attempted again. https://laravel.com/docs/5.8/queues
The same behavior can be achieved with
$this->release()
If you can't figure out what is breaking, you can set your job to run only once. If an error is thrown, the job will be considered failed and will be put in the failed jobs table.
The maximum number of attempts is defined by the --tries switch used
on the queue:work Artisan command. https://laravel.com/docs/5.8/queues
php artisan queue:work --tries=1
If you are using the database queue (awesome for debugging), run these commands to create the failed jobs table:
php artisan queue:failed-table
php artisan migrate
Finally, to find out what is wrong with your code, you can catch and log the error.
public function handle(Xero $xero)
{
    try {
        $this->getAndCreateXeroSnapshotID();
        $this->importInvoices($xero);
        $this->importBankTransaction($xero);
        $this->importBankStatement($xero);
        $this->importBalanceSheet($xero);
        $this->importProfitAndLoss($xero);
    } catch (\Exception $e) {
        // Requires "use Illuminate\Support\Facades\Log;" at the top of the class file
        Log::error($e->getMessage());
    }
}
You could also set your error log channel to Slack, Bugsnag or whatever, just be sure to check it. Please don't be offended, it's normal to screw up when dealing with Laravel queues. How do you think I got here?
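If you would rather let the job fail once its tries are exhausted and handle the error in one place, a failed() method on the job class is another option. A minimal sketch, mirroring the Xero job above; the worker calls it with the exception that caused the failure:

use Exception;
use Illuminate\Support\Facades\Log;

// Inside the same job class as handle()
public function failed(Exception $exception)
{
    // Called by the queue worker after the job has exhausted its attempts
    Log::error('Xero import failed: ' . $exception->getMessage());
}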
Laravel tries to run the job again and again.
php artisan queue:work --tries=3
The command above will only attempt each job 3 times.
Hope this helps.
In my case the problem was the payload property: I had declared it private, but it needs to be protected.
class EventJob implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    // Payload: must be protected, not private
    protected $command;

    // Maximum tries of this event
    public $tries = 5;

    public function __construct(CommandInterface $command)
    {
        $this->command = $command;
    }

    public function handle()
    {
        $event = I_Event::create([
            'event_type_id' => $this->command->getEventTypeId(),
            'sender_url'    => $this->command->getSenderUrl(),
            'sender_ip'     => $this->command->getSenderIp(),
        ]);

        return $event;
    }
}
The solution that worked for me was to delete the job once it has been processed.
Consider this example:
class SomeController extends Controller
{
    public function uploadProductCsv()
    {
        // Process the file here and push the work onto the queue
        Queue::push('SomeController@processFile', $someDataArray);
    }

    public function processFile($job, $data)
    {
        // Logic to process the data

        $job->delete(); // After the processing is complete, delete the job
    }
}
Note: this is implemented for Laravel 4.2.
I am wondering how to deal with this. I have a webhook endpoint which responds to a webhook call from GitHub.
It starts a long-running process in which it clones the repository from which the webhook call was made.
/**
 * The webhook endpoint.
 *
 * @param Request $request
 * @return mixed
 * @throws \Exception
 */
public function webhook(Request $request)
{
    // The type of GitHub event that we receive.
    $event = $request->header('X-GitHub-Event');

    $url = $this->createCloneUrl();

    $this->cloneGitRepo($url);

    return new Response('Webhook received successfully', 200);
}
The problem with this is that GitHub gives an error when the response is not provided soon enough.
We couldn’t deliver this payload: Service Timeout
Rightly so, because my cloneGitRepo method simply blocks the response from being returned and takes too long.
How can I still deliver a response to acknowledge to GitHub that the webhook call has been received successfully, and start my long-running process?
I am using Laravel for all of this, together with Redis; maybe something can be accomplished there? I am open to all suggestions.
What you're looking for is a queued job. Laravel makes this very easy with Laravel Queues.
With queues, you set up a queue driver (database, Redis, Amazon SQS, etc.), and then you have one or more queue workers that run continuously. When you put a job on the queue from your webhook method, it will be picked up by one of your queue workers and run in a separate process. Dispatching a job to the queue is very quick, though, so your webhook method will return quickly while the real work is done by a queue worker.
The linked documentation has all the details, but the general process will be:
Set up a queue connection. You mention you're already using Redis; I would start with that.
Use php artisan make:job CloneGitRepo to create a CloneGitRepo job class.
It should implement the Illuminate\Contracts\Queue\ShouldQueue interface so that Laravel knows to send this job to a queue when it is dispatched.
Make sure to define properties on the class for any data you pass into the constructor. This is necessary so the worker can rebuild the job correctly when it is pulled off the queue.
The queue worker will call the handle() method to process the job. Any dependencies can be type hinted here and they will be injected from the IoC container.
To dispatch the job to the queue, you can either use the global dispatch() helper function, or call the static dispatch() method on the job itself.
dispatch(new CloneGitRepo($url));
CloneGitRepo::dispatch($url);
So, your webhook would look like:
public function webhook(Request $request)
{
    // The type of GitHub event that we receive.
    $event = $request->header('X-GitHub-Event');

    $url = $this->createCloneUrl();

    CloneGitRepo::dispatch($url);

    return new Response('Webhook received successfully', 200);
}
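For completeness, here is a minimal sketch of what the CloneGitRepo job class itself might look like. GitClient is a hypothetical service used only to illustrate container injection into handle(); swap it for however you actually perform the clone:

namespace App\Jobs;

use App\Services\GitClient; // hypothetical service, for illustration only
use Illuminate\Bus\Queueable;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

class CloneGitRepo implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $url;

    public function __construct($url)
    {
        // Stored on a property so it is serialized along with the job
        $this->url = $url;
    }

    public function handle(GitClient $git)
    {
        // Dependencies type-hinted here are resolved from the IoC container
        $git->clone($this->url);
    }
}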
I would like to push another job onto the queue after my first one has successfully executed. From my ShopController.php I fire:
$this->dispatch(new ItemPurchased($myVariables));
In ItemPurchased.php:
public function handle()
{
    // some code that charges a user's credit card
}
How do I fire a subsequent job upon success of ItemPurchased?
From the Laravel Documentation, you would do something like this in a ServiceProvider:
public function boot()
{
    Queue::after(function (JobProcessed $event) {
        // $event->connectionName
        // $event->job
        // $event->data
    });
}
But how do I specify that another job should only be dispatched after ItemPurchased completed successfully? There is a failed() method for code to run when jobs fail. Is there a success() method that hasn't been mentioned before?
Jobs run offline through the queue driver. However, once the handle method is called, the code executes synchronously as normal.
So simply dispatch the new job at the end of the handle method:
public function handle()
{
    // Some code here

    dispatch(new JobClass());
}
I want to do something like this in my fire method:
class MyClass
{
    public function fire($job)
    {
        if ($something) {
            $job->fail();
        } else {
            // processing
        }

        $job->delete();
    }
}
There is no such method as fail(); is it possible to do something like this?
There is no such thing as failing a job directly, but here is what you can do:
Release it back onto the queue with
$job->release();
After the defined number of attempts it will end up in the failed jobs table.
Throw an exception. The job will be released back onto the queue on its own.
If you're using Beanstalkd as a queue driver, you can bury a job:
$job->bury();
If your condition is unrecoverable, you can log that fact and simply delete the job.
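Putting those options together, the fire() method from the question could look something like this sketch; isUnrecoverable() and process() are hypothetical helpers standing in for your own logic:

public function fire($job, $data)
{
    // Unrecoverable input: log it and drop the job for good
    if ($this->isUnrecoverable($data)) {
        \Log::error('Dropping unprocessable job', ['payload' => $data]);
        $job->delete();

        return;
    }

    try {
        $this->process($data);
    } catch (\Exception $e) {
        // Let the queue retry it; after the configured number of attempts
        // it will end up in the failed jobs table
        $job->release(30); // retry in 30 seconds

        return;
    }

    // Processed successfully, so remove it from the queue
    $job->delete();
}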