Releasing AWS SQS Message back to queue from Laravel/Lumen

I'm using Lumen (Laravel) to handle an AWS SQS queue with the sqs-plain driver for the connection. That driver works the same way as the default sqs driver but allows me to use custom JSON in the queue message body.
Based on that package I've created a job handler that works fine, but I'm having trouble understanding how to properly release a message back to the queue.
In the handler method I've injected an SqsJob and tried calling the release method, which isn't working as I'd expect.
It seems that SqsJob::release calls changeMessageVisibility on the SQS API and the parent release method, which simply sets a class variable $this->released = true; on the job.
So the message isn't really released so much as marked as released, with its visibility timeout in the queue changed. If the release isn't handled, the message ends up being treated as handled and deleted from the queue.
By trial and error I found out that throwing an exception in the handler method sends the message back to the queue.
Here's a simplified version of the job handler:
namespace App\Jobs;

use Illuminate\Contracts\Queue\Job as LaravelJob;
use Illuminate\Queue\Jobs\SqsJob;
use Illuminate\Support\Facades\Log;

class HandlerJob extends Job
{
    protected $data;

    /**
     * @param SqsJob $job
     * @param array $data
     */
    public function handle(SqsJob $job, array $data)
    {
        if ($this->validData($data)) {
            // handle the job
        } else {
            $this->release(60);
            // actually release the job back to queue
            throw new \Exception('Invalid Data');
        }
    }

    private function validData($data)
    {
        // not relevant
    }
}
This seems like a basic task for the queue worker to handle, but I can't figure it out.
What is the proper way of releasing a message back into the queue using the Laravel/Lumen framework?

Accessing SQS job data (receipt handle) from a Laravel queued console command

I'm adding functionality to a pre-existing app, using Laravel 5.8.38 and the SQS queue driver.
I'm looking for a way to log the receipt handle of queue messages as they're processed, so that we can manually delete messages from the queue for jobs that have gone horribly wrong (without the receipt ID, we'd have to wait for the visibility timeout to be reached).
I'm not super familiar with Laravel and am trying to figure things out as I go. We have two types of queued jobs:
a custom class implementing Illuminate\Contracts\Queue\ShouldQueue, which also uses the Illuminate\Queue\InteractsWithQueue, Illuminate\Foundation\Bus\Dispatchable and Illuminate\Bus\Queueable traits (our class gets queued directly)
a custom command, extending Illuminate\Console\Command, that runs via Illuminate\Foundation\Console\QueuedCommand
For the custom class, browsing through the source for InteractsWithQueue and Illuminate/Queue/Jobs/SqsJob I discovered I could access the receipt handle directly:
$sqsJob = $this->job->getSqsJob();
\Log::info("Processing SQS job {$sqsJob["MessageId"]} with handle {$sqsJob["ReceiptHandle"]}");
This works great! However, I can't figure out how to do a similar thing from the console command.
Laravel's QueuedCommand implements ShouldQueue as well as Illuminate\Bus\Queueable, so my current guess is that I'll need to extend this, use InteractsWithQueue, and retrieve and log the receipt handle from there. However if I do that, I can't figure out how I would modify Artisan::queue('app:command', $commandOptions); to queue my custom QueuedCommand class instead.
Am I almost there? If so, how can I queue my custom QueuedCommand class instead of the Laravel one? Or, is there a better way to do this?
OK, I had just posted this question when I realised that a suggestion a colleague had offered provided a way forward.
So, here's my solution in case it helps anyone else!
Laravel fires an Illuminate\Queue\Events\JobProcessing event when processing any new queue job. I just needed to register a listener in app/Providers/EventServiceProvider.php:
protected $listen = [
    'Illuminate\Queue\Events\JobProcessing' => [
        'App\Listeners\LogSQSJobDetails',
    ],
];
and then provide the listener to handle it:
namespace App\Listeners;

use Illuminate\Queue\Events\JobProcessing;

class LogSQSJobDetails
{
    public function __construct()
    {
    }

    public function handle(JobProcessing $event)
    {
        $sqsJob = $event->job->getSqsJob();
        \Log::info("Processing SQS job {$sqsJob["MessageId"]} with handle {$sqsJob["ReceiptHandle"]}");
    }
}
This works great - and means I can also now remove the addition to my custom class from earlier.

Error using SQS with multiple Laravel queue readers

I am using Laravel Jobs to read messages from an SQS queue (Laravel version 5.7).
Following Laravel's recommendations, I am using supervisor to run multiple queue:work processes at the same time.
All goes well until I get this SQS error related to message availability:
InvalidParameterValue (client): Value
... for parameter ReceiptHandle is invalid. Reason: Message does not exist or
is not available for visibility timeout change. - <?xml version="1.0"?>
<ErrorResponse xmlns="http://queue.amazonaws.com/doc/2012-11-05/"><Error>
<Type>Sender</Type><Code>InvalidParameterValue</Code><Message>Value ...
for parameter ReceiptHandle is invalid. Reason: Message does not exist or is
not available for visibility timeout change.</Message><Detail/></Error>
<RequestId>8c1d28b7-a02c-5059-8b65-7c6292a0e56e</RequestId></ErrorResponse>
{"exception":"[object] (Aws\\Sqs\\Exception\\SqsException(code: 0): Error
executing \"ChangeMessageVisibility\" on \"https://sqs.eu-central-
1.amazonaws.com/123123123123/myQueue\"; AWS HTTP error: Client error: `POST
https://sqs.eu-central-1.amazonaws.com/123123123123/myQueue` resulted in a
`400 Bad Request` response:
In particular, the strange thing is Message does not exist or is not available for visibility timeout change.
Each supervisor process runs command=php /home/application/artisan queue:work, without --sleep=3 (I'd like the process to be reactive rather than wait 3 seconds when the queue is empty) and without --tries=3 (I need every task to be completed, so I don't limit the number of tries).
If the message doesn't exist (and I can't rule that possibility out), why does the process fetch it from the queue? Is there anything I can do to prevent this?
I've seen this error intermittently in production too, where we run a good number of consumers for a single SQS queue. In our case, I'm pretty convinced that the error is due to SQS's at-least-once delivery semantics. Essentially, a message can be delivered twice or more on rare occasions.
Laravel's queue worker command isn't strictly idempotent because it will throw an exception when trying to release or delete an SQS message that is no longer available (i.e., because it has been deleted by another queue worker process, which received a duplicate of the message from SQS).
Our workaround is to try to detect when a duplicate message has been received, and then attempt to safely release the message back onto the queue. If the other queue worker that is currently working on the message succeeds, it will delete the message, and it won't be received again. If the other queue worker fails, then the message will be released and received again later. Something like this:
<?php

use Aws\Sqs\Exception\SqsException;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Str;

class ProcessPodcast implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $jobId;

    public function __construct($jobId)
    {
        $this->jobId = $jobId;
    }

    public function handle()
    {
        $acquired = Cache::lock("process-podcast-$this->jobId")->get(function () {
            // Process the podcast (NB: this should be idempotent)

            return true; // so get() reports that the lock was acquired
        });

        if (!$acquired) {
            $this->releaseDuplicateMessage($delay = 60);
        }
    }

    private function releaseDuplicateMessage($delay)
    {
        try {
            $this->release($delay);
        } catch (\Exception $ex) {
            if (!$this->causedByMessageNoLongerAvailable($ex)) {
                throw $ex;
            }
        }
    }

    private function causedByMessageNoLongerAvailable(\Exception $ex): bool
    {
        return $ex instanceof SqsException &&
            Str::contains(
                $ex->getAwsErrorMessage(),
                "Message does not exist or is not available for visibility timeout change"
            );
    }
}
Another potential cause of these duplicate messages is that SQS has a default Visibility Timeout of 30 seconds.
Visibility timeout sets the length of time that a message received from a queue (by one consumer) will not be visible to the other message consumers.
So if one worker reads a message from the queue and takes longer than 30 seconds to process it, the message becomes visible again and another worker starts processing it.
When the first worker finishes, it will delete the message from the queue. Then, when the second worker finishes processing the same message and tries to delete it, it can't, because the first worker already deleted it.
We're having the same issue at the moment and are implementing a fix/workaround similar to Louis's. We'll post our version once it's done and confirmed working.
Note: You can increase the Visibility Timeout on SQS.
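For example, the timeout can be raised on the queue itself with the AWS SDK for PHP (a sketch; the queue URL is the one from the error above, and the same change can be made from the SQS console or CLI):

use Aws\Sqs\SqsClient;

$sqs = new SqsClient([
    'region'  => 'eu-central-1',
    'version' => '2012-11-05',
]);

// Give a worker up to 5 minutes before its message becomes visible to other consumers again.
$sqs->setQueueAttributes([
    'QueueUrl'   => 'https://sqs.eu-central-1.amazonaws.com/123123123123/myQueue',
    'Attributes' => ['VisibilityTimeout' => '300'],
]);

Whatever value you pick, keep the worker's --timeout a few seconds below the visibility timeout so a slow job is killed before SQS hands its message to another consumer.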

Send response but keep long running script going to prevent timeout?

I am wondering how to deal with this. I have a webhook endpoint which responds to a webhook call from GitHub.
It starts a long-running process in which it clones the repository from which the webhook call was made.
/**
 * The webhook endpoint.
 *
 * @param Request $request
 * @return mixed
 * @throws \Exception
 */
public function webhook(Request $request)
{
    // The type of GitHub event that we receive.
    $event = $request->header('X-GitHub-Event');

    $url = $this->createCloneUrl();

    $this->cloneGitRepo($url);

    return new Response('Webhook received successfully', 200);
}
The problem with this is that GitHub gives an error when the response is not provided soon enough:
We couldn’t deliver this payload: Service Timeout
This is understandable, because my cloneGitRepo method is blocking execution of the response and taking too long.
How can I still deliver a response to acknowledge to Github that the webhook call has been made successfully and start my long running process?
I am using Laravel for all of this, along with Redis; maybe something can be accomplished there? I am open to all suggestions.
What you're looking for is a queued job. Laravel makes this very easy with Laravel Queues.
With queues, you set up a queue driver (database, redis, Amazon SQS, etc.), and then you have one or more queue workers continuously running. When you put a job on the queue from your webhook method, it will be picked up by one of the queue workers and run in a separate process. Dispatching a job to the queue is very quick, though, so your webhook method will return promptly while the real work is done by the queue worker.
The linked documentation has all the details, but the general process will be:
Set up a queue connection. You mention you're already using Redis; I would start with that.
Use php artisan make:job CloneGitRepo to create a CloneGitRepo job class.
It should implement the Illuminate\Contracts\Queue\ShouldQueue interface so that Laravel knows to send this job to a queue when it is dispatched.
Make sure to define properties on the class for any data you pass into the constructor. This is necessary so the worker can rebuild the job correctly when it is pulled off the queue.
The queue worker will call the handle() method to process the job. Any dependencies can be type hinted here and they will be injected from the IoC container.
To dispatch the job to the queue, you can either use the global dispatch() helper function, or call the static dispatch() method on the job itself.
dispatch(new CloneGitRepo($url));
CloneGitRepo::dispatch($url);
So, your webhook would look like:
public function webhook(Request $request)
{
    // The type of GitHub event that we receive.
    $event = $request->header('X-GitHub-Event');

    $url = $this->createCloneUrl();

    CloneGitRepo::dispatch($url);

    return new Response('Webhook received successfully', 200);
}

Laravel 4.2 Queue - force job fail

I want to do something like this in my fire method:
class MyClass {

    public function fire($job)
    {
        if (something) {
            $job->fail();
        } else {
            // processing
        }

        $job->delete();
    }
}

There is no such method as fail(); is it possible to do something like this?
There is no such thing as failing a job, but here is what you can do:
Release it back to the queue with
$job->release();
After a defined number of attempts it will end up in the failed jobs table.
Throw an exception. The job will be released back to the queue on its own.
If you're using beanstalkd as a queue driver, you can bury a job:
$job->bury();
If your condition is unrecoverable, you can log this fact and simply delete the job.
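Putting those options together, a fire() method without fail() could look something like this (a sketch for Laravel 4.2; shouldAbort() and the limit of 3 attempts are placeholders):

class MyClass {

    public function fire($job, $data)
    {
        if ($this->shouldAbort($data)) {
            // Unrecoverable: log the fact and simply remove the message.
            Log::warning('Dropping unprocessable job', $data);
            $job->delete();
            return;
        }

        try {
            // ...do the actual processing here...
        } catch (Exception $e) {
            if ($job->attempts() >= 3) {
                // Give up after a few tries so the message doesn't loop forever.
                Log::error('Job failed repeatedly: ' . $e->getMessage());
                $job->delete();
            } else {
                // Put the message back on the queue and retry after 60 seconds.
                $job->release(60);
            }
            return;
        }

        $job->delete();
    }

    private function shouldAbort($data)
    {
        // Placeholder for whatever unrecoverable-condition check you need.
        return false;
    }
}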

Laravel Pull Queue

I have been using the Laravel framework and have just recently gotten into implementing queues with Laravel's built-in support for IronMQ.
From the Laravel documentation it's easy enough to see how to push messages to a queue and then, on Iron.io, set up subscribers and have the queue push to those subscribers. However, I want to utilize IronMQ as a pull queue. I do not see any indication of how to pull a message from a specified queue using Laravel's built-in methods.
On the IronMQ site they list all the endpoints needed to implement a pull queue.
Ex: /projects/{Project ID}/queues/{Queue Name}/messages
In the IronMQ package for Laravel I see the methods that seem to work with these endpoints:
/**
 * Peek Messages on a Queue
 * Peeking at a queue returns the next messages on the queue, but it does not reserve them.
 *
 * @param string $queue_name
 * @return object|null message or null if queue is empty
 */
public function peekMessage($queue_name) {
    $messages = $this->peekMessages($queue_name, 1);
    if ($messages == null) {
        return null;
    } else {
        return $messages[0];
    }
}
However, I do not see any support for this through Laravel. I would expect to be able to do something along the lines of:
$message = Queue::peek();
Which would return the next message from a specified queue, etc.
Is there a way to do this with Laravel's built in support that is just not documented?
Thanks!
Edit:
I have seen the documentation on using Daemon Workers through Laravel, however I want to process the queue myself through a cron job.
You could try using the IronMQ class directly instead of Laravel's Queue class:
$ironmq = new \IronMQ(array(
    'token'      => Config::get('queue.connections.iron.token', 'xxx'),
    'project_id' => Config::get('queue.connections.iron.project', 'xxx')
));

$ironmq->getMessage($queue_name);
IronMQ PHP lib
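A cron-driven consumer along those lines might look like this (a sketch against the IronMQ PHP library; the queue name and the processing step are placeholders):

$ironmq = new \IronMQ(array(
    'token'      => Config::get('queue.connections.iron.token', 'xxx'),
    'project_id' => Config::get('queue.connections.iron.project', 'xxx')
));

// Drain whatever is currently on the queue, then exit until the next cron run.
while ($message = $ironmq->getMessage('my-queue')) {
    $payload = json_decode($message->body, true);

    // ...process $payload...

    // Remove the message once it has been handled successfully.
    $ironmq->deleteMessage('my-queue', $message->id);
}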