I have been using the Laravel framework and have recently started implementing queues with Laravel's built-in support for IronMQ.
From the Laravel documentation it's easy enough to see how to push messages to a queue, set subscribers on Iron.io, and have the queue push to those subscribers. However, I want to use IronMQ as a pull queue, and I do not see any indication of how to pull a message from a specified queue using Laravel's built-in methods.
The IronMQ site lists all the endpoints needed for a pull queue implementation.
Ex: /projects/{Project ID}/queues/{Queue Name}/messages
In the IronMQ package for Laravel I see the methods that seem to work with these endpoints:
/**
 * Peek Messages on a Queue
 * Peeking at a queue returns the next messages on the queue, but it does not reserve them.
 *
 * @param string $queue_name
 * @return object|null message or null if queue is empty
 */
public function peekMessage($queue_name) {
    $messages = $this->peekMessages($queue_name, 1);
    if ($messages == null) {
        return null;
    } else {
        return $messages[0];
    }
}
However, I do not see any support for this through Laravel. I would expect to be able to do something along the lines of:
$message = Queue::peek();
Which would return the next message from a specified queue, etc.
Is there a way to do this with Laravel's built in support that is just not documented?
Thanks!
Edit:
I have seen the documentation on using Daemon Workers through Laravel, however I want to process the queue myself through a cron job.
You could try using the IronMQ class instead of Laravel's Queue class:
$ironmq = new \IronMQ(array(
    'token' => Config::get('queue.connections.iron.token', 'xxx'),
    'project_id' => Config::get('queue.connections.iron.project', 'xxx')
));

$ironmq->getMessage($queue_name);
IronMQ PHP lib
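If you want to drive this from a cron job, a rough sketch along these lines could work; the queue name, payload handling, and the deleteMessage call are assumptions based on the IronMQ PHP library rather than anything Laravel-specific:

$ironmq = new \IronMQ(array(
    'token' => Config::get('queue.connections.iron.token', 'xxx'),
    'project_id' => Config::get('queue.connections.iron.project', 'xxx')
));

$queueName = 'my-pull-queue'; // assumed queue name

// Reserve the next message; null means the queue is currently empty.
$message = $ironmq->getMessage($queueName);

if ($message !== null) {
    $payload = json_decode($message->body, true);

    // ... process the payload here ...

    // Remove the message once it has been handled successfully.
    $ironmq->deleteMessage($queueName, $message->id);
}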
I am trying to debug some bizarre behaviour of my PHP application. It is running Laravel 6 + AWS SQS. The program downloads call recordings from a VoIP provider's API using a job. The API has a heavy rate limit of 10 req/minute, so I'm throttling the requests on my side. The job is configured to try to complete within 24 hours using the retryUntil method. However, the job disappears from the queue after 4 tries. It doesn't fail. The job's failed method never gets executed (I've put logging and Sentry::capture in there), and it's not in the failed_jobs table. The last log entry says "Cannot complete job, retrying in ... seconds", which is written right before the release call. After that, the job simply disappears from the queue and never gets executed again.
I am logging the number of attempts, max tries, timeoutAt, etc. Everything seems to be configured properly. Here's (the essence of) my code:
public function handle()
{
    /** @var Track $track */
    $track = Track::idOrUuId($this->trackId);

    $this->logger->info('Downloading track', [
        'trackId' => $track->getId(),
        'attempt' => $this->attempts(),
        'retryUntil' => $this->job->timeoutAt(),
        'maxTries' => $this->job->maxTries(),
    ]);

    $throttleKey = sprintf('track.download.%s', $track->getUser()->getTeamId());

    if (!$this->rateLimiter->tooManyAttempts($throttleKey, self::MAX_ALLOWED_JOBS)) {
        $this->downloadTrack($track);
        $this->rateLimiter->hit($throttleKey, 60);
    } else {
        $delay = random_int(10, 100) + $this->rateLimiter->availableIn($throttleKey);
        $this->logger->info('Throttling track download.', [
            'trackId' => $track->getId(),
            'delay' => $delay,
        ]);
        $this->release($delay);
    }
}
public function retryUntil(): DateTimeInterface
{
    return now()->addHours(24);
}
public function failed(Exception $exception)
{
    $this->logger->info('Job failed', ['exception' => $exception->getMessage()]);
    Sentry::captureException($exception);
}
I found the problem and I'm posting it here for anyone who might struggle with this in the future. It all came down to a simple piece of configuration. In AWS SQS, the queue I am working with has a DLQ (Dead-Letter Queue) configured and Maximum receives set to 4. According to the SQS docs:
The Maximum receives value determines when a message will be sent to the DLQ. If the ReceiveCount for a message exceeds the maximum receive count for the queue, Amazon SQS moves the message to the associated DLQ (with its original message ID).
Since this is an infrastructure-level configuration, it overrides any Laravel parameters you might pass to the job. And because the message is simply removed from the queue, the processing job does not actually fail, so the failed method is never executed.
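If you want to confirm this from code rather than the AWS console, a quick check along these lines should work; the region and queue URL are placeholders, and using the official aws/aws-sdk-php SqsClient is an assumption about the setup:

use Aws\Sqs\SqsClient;

// Placeholder region and queue URL; substitute your own.
$sqs = new SqsClient([
    'region'  => 'eu-west-1',
    'version' => '2012-11-05',
]);

$result = $sqs->getQueueAttributes([
    'QueueUrl'       => 'https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue',
    'AttributeNames' => ['RedrivePolicy'],
]);

// RedrivePolicy is a JSON string such as:
// {"deadLetterTargetArn":"arn:aws:sqs:...:my-queue-dlq","maxReceiveCount":"4"}
$policy = json_decode($result['Attributes']['RedrivePolicy'] ?? 'null', true);

If maxReceiveCount is lower than the number of attempts you expect, the message will land in the DLQ long before retryUntil ever comes into play.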
I'm adding functionality to a pre-existing app, using Laravel 5.8.38 and the SQS queue driver.
I'm looking for a way to log the receipt handle of queue messages as they're processed, so that we can manually delete messages from the queue for jobs that have gone horribly wrong (without the receipt handle, we'd have to wait for the visibility timeout to be reached).
I'm not super familiar with Laravel and am trying to figure things out as I go. We have two types of queued jobs:
a custom class implementing Illuminate\Contracts\Queue\ShouldQueue, that also uses the Illuminate\Queue\InteractsWithQueue, Illuminate\Foundation\Bus\Dispatchable and Illuminate\Bus\Queueable traits (our class gets queued directly)
a custom command, extending Illuminate\Console\Command, that runs via Illuminate\Foundation\Console\QueuedCommand
For the custom class, browsing through the source for InteractsWithQueue and Illuminate/Queue/Jobs/SqsJob I discovered I could access the receipt handle directly:
$sqsJob = $this->job->getSqsJob();
\Log::info("Processing SQS job {$sqsJob["MessageId"]} with handle {$sqsJob["ReceiptHandle"]}");
This works great! However, I can't figure out how to do a similar thing from the console command.
Laravel's QueuedCommand implements ShouldQueue as well as Illuminate\Bus\Queueable, so my current guess is that I'll need to extend this, use InteractsWithQueue, and retrieve and log the receipt handle from there. However, if I do that, I can't figure out how I would modify Artisan::queue('app:command', $commandOptions); to queue my custom QueuedCommand class instead.
Am I almost there? If so, how can I queue my custom QueuedCommand class instead of the Laravel one? Or, is there a better way to do this?
OK, I had just posted this question when I realised that a suggestion a colleague had offered provided a way forward.
So, here's my solution in case it helps anyone else!
Laravel fires an Illuminate\Queue\Events\JobProcessing event when processing any new queue job. I just needed to register a listener in app/Providers/EventServiceProvider.php:
protected $listen = [
    'Illuminate\Queue\Events\JobProcessing' => [
        'App\Listeners\LogSQSJobDetails',
    ],
];
and then provide the listener to handle it:
namespace App\Listeners;

use Illuminate\Queue\Events\JobProcessing;

class LogSQSJobDetails
{
    public function __construct()
    {
    }

    public function handle(JobProcessing $event)
    {
        // The event exposes the underlying queue job; for SQS this is an
        // Illuminate\Queue\Jobs\SqsJob, which provides access to the raw message.
        $sqsJob = $event->job->getSqsJob();

        \Log::info("Processing SQS job {$sqsJob["MessageId"]} with handle {$sqsJob["ReceiptHandle"]}");
    }
}
This works great - and means I can also now remove the addition to my custom class from earlier.
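As a follow-up, once the receipt handle is in the logs, a message for a job that has gone horribly wrong can be deleted by hand with the AWS SDK; the region, queue URL, and handle below are placeholders, and using aws/aws-sdk-php directly is an assumption about the setup:

use Aws\Sqs\SqsClient;

// Placeholder region and queue URL; the receipt handle is copied from the log entry.
$sqs = new SqsClient([
    'region'  => 'eu-west-1',
    'version' => '2012-11-05',
]);

$sqs->deleteMessage([
    'QueueUrl'      => 'https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue',
    'ReceiptHandle' => 'AQEB...copied-from-the-log...',
]);

This removes the message immediately instead of waiting for the visibility timeout to expire.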
I'm using Symfony 4.4 with Redis for the session.
I have some controllers, and I want to update the DB in the background after sending the reply to the user.
So I have written code like this:
class GetCatController extends AbstractController
{
    public function getCatController(LoggerInterface $logger, ManagerRegistry $doctrine, SessionInterface $session, ValidatorInterface $validator)
    {
        [...]

        $replyToSend = new JsonResponse($reply, 200);
        $replyToSend->send();

        // My background activity, e.g. some queries on the DB.
        [...]

        return null;
    }
}
But I have some problems with the sessions.
Is there a better way to create and run a background activity, sending the reply to the user first?
There are two decent ways to do this.
If you are running PHP under php-fpm (not mod_php), you can listen for the kernel.terminate event, which is dispatched after the response has been sent. In Symfony pre-4.4 the event class is called PostResponseEvent; from 4.4/5.0 it is TerminateEvent.
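As a rough sketch of that approach (the subscriber class name and the work done in the handler are placeholders, not part of the original answer):

namespace App\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\Event\TerminateEvent;
use Symfony\Component\HttpKernel\KernelEvents;

class BackgroundWorkSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        // kernel.terminate is dispatched after the response has been sent (under php-fpm).
        return [KernelEvents::TERMINATE => 'onTerminate'];
    }

    public function onTerminate(TerminateEvent $event): void
    {
        // Placeholder for the slow work, e.g. the DB queries from the controller.
    }
}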
The better choice may be Symfony Messenger. Here, you would create a message object with all the information needed to perform the task and send it to a background queue (Redis is supported as a transport). A worker then consumes that message and does the task.
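A rough sketch of the Messenger approach; the message and handler names are placeholders, and a transport (Redis, for example) still has to be configured in config/packages/messenger.yaml:

// src/Message/UpdateCatStats.php
namespace App\Message;

// Plain data object carrying everything needed to do the work later.
class UpdateCatStats
{
    private $catId;

    public function __construct(int $catId)
    {
        $this->catId = $catId;
    }

    public function getCatId(): int
    {
        return $this->catId;
    }
}

// src/MessageHandler/UpdateCatStatsHandler.php
namespace App\MessageHandler;

use App\Message\UpdateCatStats;
use Symfony\Component\Messenger\Handler\MessageHandlerInterface;

class UpdateCatStatsHandler implements MessageHandlerInterface
{
    public function __invoke(UpdateCatStats $message)
    {
        // Placeholder for the slow DB work, keyed by $message->getCatId().
    }
}

In the controller you would then inject Symfony\Component\Messenger\MessageBusInterface and call $bus->dispatch(new UpdateCatStats($catId)); before returning the response, and a worker started with bin/console messenger:consume picks up the work.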
I'm using Lumen (Laravel) to handle an AWS SQS queue with the sqs-plain driver for the connection. That driver works the same way as the default sqs driver but allows me to use custom JSON in the queue message body.
Based on that package I've created a job handler that works fine, but I'm having trouble understanding how to properly release a message back to the queue.
In the handler method I've injected an SqsJob and tried calling the release method, which isn't working as I'd expect.
It seems that the method SqsJob::release calls changeMessageVisibility on the SQS API and the parent release method, which simply sets a class variable $this->released = true; on the job.
So the "releasing" of the message isn't really done, as much as the message is just being marked as released and the visibility of the message in the queue is changed. Not handling the release would result in the message being "handled" and deleted from the queue.
By trial and error I found out that throwing an exception in the handler method sends the message back to the queue.
Here's a simplified version of the job handler:
namespace App\Jobs;

use Illuminate\Contracts\Queue\Job as LaravelJob;
use Illuminate\Queue\Jobs\SqsJob;
use Illuminate\Support\Facades\Log;

class HandlerJob extends Job
{
    protected $data;

    /**
     * @param SqsJob $job
     * @param array $data
     */
    public function handle(SqsJob $job, array $data)
    {
        if ($this->validData($data)) {
            // handle the job
        } else {
            $this->release(60);
            // actually release the job back to queue
            throw new \Exception('Invalid Data');
        }
    }

    private function validData($data)
    {
        // not relevant
    }
}
This seems like a basic task for the queue worker to handle, but I can't figure it out.
What is the proper way of releasing a message back into the queue using the Laravel/Lumen framework?
I am wondering how to deal with this. I have a webhook endpoint which responds to a webhook call from Github.
It starts a long-running process in which it clones the repository from which the webhook call was made.
/**
 * The webhook endpoint.
 *
 * @param Request $request
 * @return mixed
 * @throws \Exception
 */
public function webhook(Request $request)
{
    // The type of GitHub event that we receive.
    $event = $request->header('X-GitHub-Event');

    $url = $this->createCloneUrl();
    $this->cloneGitRepo($url);

    return new Response('Webhook received successfully', 200);
}
The problem with this is that Github gives an error when the response is not provided soon enough.
We couldn’t deliver this payload: Service Timeout
This is rightfully so because my cloneGitRepo method is simply blocking the execution of the response and it is taking too long.
How can I still deliver a response to acknowledge to Github that the webhook call has been made successfully and start my long running process?
I am using Laravel for all of this with Redis; maybe something can be accomplished there? I am open to all suggestions.
What you're looking for is a queued job. Laravel makes this very easy with Laravel Queues.
With queues, you set up a queue driver (database, Redis, Amazon SQS, etc.), and then you have one or more queue workers running continuously. When you put a job on the queue from your webhook method, it will be picked up by one of your queue workers and run in a separate process. Dispatching a job to a queue is very quick, so your webhook method will return promptly while the real work is done by the queue worker.
The linked documentation has all the details, but the general process will be:
Set up a queue connection. You mention you're already using Redis, so I would start with that.
Use php artisan make:job CloneGitRepo to create a CloneGitRepo job class.
It should implement the Illuminate\Contracts\Queue\ShouldQueue interface so that Laravel knows to send this job to a queue when it is dispatched.
Make sure to define properties on the class for any data you pass into the constructor. This is necessary so the worker can rebuild the job correctly when it is pulled off the queue (see the sketch after the dispatch examples below).
The queue worker will call the handle() method to process the job. Any dependencies can be type hinted here and they will be injected from the IoC container.
To dispatch the job to the queue, you can either use the global dispatch() helper function, or call the static dispatch() method on the job itself.
dispatch(new CloneGitRepo($url));
CloneGitRepo::dispatch($url);
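For reference, a minimal sketch of what the CloneGitRepo job class from the steps above might look like; the property name and the empty handle() body are placeholders, since the answer does not show the clone logic itself:

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class CloneGitRepo implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /** @var string Clone URL passed in from the webhook controller. */
    protected $url;

    public function __construct(string $url)
    {
        $this->url = $url;
    }

    public function handle()
    {
        // Placeholder for the actual clone logic; the original answer does not
        // specify how the repository is cloned (shelling out to git, a PHP git
        // library, etc.), so this part is an assumption.
    }
}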
So, your webhook would look like:
public function webhook(Request $request)
{
    // The type of GitHub event that we receive.
    $event = $request->header('X-GitHub-Event');

    $url = $this->createCloneUrl();

    CloneGitRepo::dispatch($url);

    return new Response('Webhook received successfully', 200);
}