Laravel - Serialization of Job to be processed on remote system - php

Introduction / System Architecture
Server A - Laravel Installation, Database A
Server B - RabbitMQ Server
Server C - Laravel Installation, Database B
Server A is an API endpoint, only receiving calls from remote sources. Depending on the call, it will add a job to the RabbitMQ server (Server B), which Server C in turn listens to and processes.
Server A contains a local copy of the exact same job that is processed by Server C. The job handler code and constructor are shown below (Job Code).
The Issue:
Server A cannot serialize the job and add it to the queue, because it attempts to access information about a monitor that only exists in Database B (Server C). Server A has a copy of the model, but does not contain the actual database tables or records, as it has no use for them - it is only meant to serialize the job and say "This is what you (Server C) should be doing."
However, upon dispatching the job, it also attempts to fetch database information (likely to serialize the exact data that will be required), which fails because the records don't exist there.
My understanding of Laravel's SerializesModels was specifically that it would only serialize the model reference itself, without actually doing anything database related. That does not appear to be the case, or I am misunderstanding/using it incorrectly - although very little documentation appears to be available.
Workarounds: One possible workaround would be to simply give Server A access to the database on Server C. In this case that is not an option, as it would break the design, which is intended for high availability (where the API endpoint and queue should never be unavailable, but where the queue processor might be).
The code
Relevant Job Code
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Log;

// Models
use App\UptimeMonitor;

// Notifications
use App\Notifications\StatusQueue as StatusNotification;

class StatusQueue implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $monitor_source;
    protected $monitor_data;
    protected $monitor_repository;

    /**
     * Create a new job instance.
     *
     * @param string $monitor_source (Eg: HetrixTools)
     * @param array  $monitor_data   (All data passed from the source)
     */
    public function __construct($monitor_source, array $monitor_data)
    {
        $this->monitor_source = $monitor_source;
        $this->monitor_data = $monitor_data;

        if ($this->monitor_source === 'centric') {
            return $this->centric();
        }
    }

    /**
     * Centric Uptime Monitoring
     */
    public function centric()
    {
        $result = ($this->monitor_data['monitor_status'] == 'online') ? 'online' : 'timeout';

        try {
            $monitor = UptimeMonitor::where('identifier', '=', $this->monitor_data['monitor_id'])->firstOrFail();
            $status = $monitor->status()->firstOrFail();
            $contacts = $monitor->contacts()->get();
        } catch (\Exception $e) {
            return Log::error('[JOBS::StatusQueue::centric] - ' . $e);
        }

        $status->state = $result;

        if ($contacts->isEmpty()) {
            return true;
        }

        foreach ($contacts as $contact) {
            $contact->notify(new StatusNotification($monitor, $status));
        }
    }
}
Other code
If you do require any other code, let me know! This should, however, cover the entire functionality of the job class itself. Other than that, all that's happening is dispatching that job - and how that's done is obvious from the constructor.
Question
The final question from all of this: why is this failing (that is, why can it not serialize the job without needing the database information), and do you see a way to work around this issue, so that I do not need access to Server C's database in order to queue the job from Server A, while still using Laravel's queue mechanics?
Much obliged, as always!

Turns out, the easiest solution is almost always the right one.
Server A does not need to have a replica of the job that Server C will process - it can have a completely empty job with the same class name, and Server C will still process it correctly.
As a result, this is now the Job on Server A:
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class AlertQueue implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $job;

    /**
     * Create a new job instance.
     */
    public function __construct()
    {
        //
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        //
    }
}
Server C, meanwhile, also has an AlertQueue job, which contains all of the logic that will actually be performed.
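For illustration, dispatching the stub from Server A could look something like this - a minimal sketch, where the connection and queue names are assumptions rather than values from the original setup:

// On Server A: push the empty AlertQueue stub onto the shared RabbitMQ broker.
// Server C resolves its own, fully implemented AlertQueue class by name when it
// picks the payload up, so only the class name/namespace has to match.
AlertQueue::dispatch()
    ->onConnection('rabbitmq') // assumed connection name in config/queue.php
    ->onQueue('alerts');       // assumed queue name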

Related

Laravel queue job with SerializesModels trait retrieves outdated model data

I'm using the Laravel queue through redis to send notifications. But whenever I pass models to the notifications, their properties are outdated when the notification is sent through the queue.
This is basically what I'm doing:
In a controller (or somewhere else):
$thing = new Thing(); // Thing is a model in this case.
$thing->name = 'Whatever';
$thing->save();

request()->user()->notify(new SomethingHappened($thing));
SomethingHappened.php:
<?php

namespace App\Notifications;

use App\Thing;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Notifications\Messages\MailMessage;
use Illuminate\Notifications\Notification;

class SomethingHappened extends Notification implements ShouldQueue
{
    use Queueable;

    public $thing;

    public function __construct(Thing $thing)
    {
        $this->thing = $thing;
    }

    public function via($notifiable)
    {
        return ['mail'];
    }

    public function toMail($notifiable)
    {
        dump($this->thing->name); // null

        return (new MailMessage())
            // ...
            // ...
            // ...
        ;
    }
}
Now when I add a delay of a few seconds (i.e. (new SomethingHappened($thing))->delay(now()->addSeconds(5))), the retrieved model is up to date. Also, before I deployed enough queue workers (when they were lagging behind on a filling queue), this issue didn't exist. So it appears that when a queued job is processed very quickly, it either doesn't retrieve the model from the database correctly or the model hasn't been saved yet. I have no idea why this could be happening, since the save call on the model is clearly executed (synchronously) before dispatching the notification, so there should be no way it isn't saved yet when the job is processed.
I'm running on Kubernetes and using a MySQL database. Everything is on GCP (if this is relevant).
Does anyone have an idea? Adding a delay of a few seconds is not a great solution and shouldn't be necessary.
It appears our database was using too much CPU, causing it to switch to the failover regularly. I think this was the cause of the issue. After implementing tweaks reducing the database load, the problem was resolved. A workaround for this issue would be to add a slight delay (as noted in my question).
I'm not sure why this only seemed to affect notifications, and not other queue jobs or requests. If anyone has a more detailed explanation, feel free to post a new answer.
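For reference, the delay workaround mentioned above is just the dispatch from the question with a short delay attached - a stopgap rather than a root-cause fix:

// Give the database (or its failover) a few seconds to catch up before the
// queued notification is processed and the model is re-fetched by the worker.
request()->user()->notify(
    (new SomethingHappened($thing))->delay(now()->addSeconds(5))
);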

Laravel 5.7 - Kill artisan after certain amount of time

I am developing a Laravel 5.7 application.
I have created a command that should set up my database:
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Artisan;

class TestSetupCommand extends Command
{
    protected $signature = 'test:data';

    protected $description = 'Basic Setup for Test Data';

    public function __construct()
    {
        parent::__construct();
    }

    /**
     * Execute the console command.
     *
     * @return mixed
     */
    public function handle()
    {
        Artisan::call('migrate:refresh', ['--seed' => '']);
        Artisan::call('basis:cc');
        Artisan::call('tick:exchange');

        $this->info("DB freshly setup: DONE");
        $this->info("Coin: DONE");
        $this->info("Exchange Scrapped: DONE");
    }
}
My problem is that each command takes several minutes to run. In total it takes 25 minutes to fill the whole database with data.
I would like to run the commands for only 1 minute each and kill them afterwards.
Any suggestions how to accomplish this within my laravel command?
I think the best way to do this is to extract these commands into background jobs. This artisan command then becomes the code to queue up that new job (or jobs).
Why? It's very easy to configure jobs to time out after a certain amount of time, by overriding a value like so:
<?php

namespace App\Jobs;

class ProcessPodcast implements ShouldQueue
{
    /**
     * The number of seconds the job can run before timing out.
     *
     * @var int
     */
    public $timeout = 120;
}
Also, why are you refreshing the database? That seems like a crazy idea unless this is purely an analytics platform (no user data at all). It's probably a bad thing if that refresh command times out - you may want to look into job chaining, so that the refresh command is guaranteed to succeed and the other commands (new jobs now) have set timeouts.
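A minimal sketch of that chaining idea, using Laravel 5.7's withChain; the three job class names here are hypothetical wrappers around the artisan calls from the original command:

// Each chained job can define its own public $timeout, and a job in the chain
// only runs if the previous one completed successfully.
RefreshDatabaseJob::withChain([   // hypothetical job wrapping "migrate:refresh --seed"
    new ImportCoinsJob(),         // hypothetical job wrapping "basis:cc"
    new ScrapeExchangesJob(),     // hypothetical job wrapping "tick:exchange"
])->dispatch();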

How do you pause a Laravel queue

I have a queue that sends requests to a remote service. Sometimes this service undergoes a maintenance. I want all queue tasks to pause and retry in 10 minutes when such situation is encountered. How do I implement that?
You can use the Queue::looping() event listener to pause an entire queue or connection (not just an individual job class). Unlike other methods, this will not put each job in a cycle of pop/requeue while the queue is paused, meaning the number of attempts will not increase.
Here's what the docs say:
Using the looping method on the Queue facade, you may specify callbacks that execute before the worker attempts to fetch a job from a queue.
https://laravel.com/docs/5.8/queues#job-events
What this doesn't document very well is that if the callback returns false then the worker will not fetch another job. For example, this will prevent the default queue from running:
Queue::looping(function (\Illuminate\Queue\Events\Looping $event) {
    // $event->connectionName (e.g. "database")
    // $event->queue (e.g. "default")
    if ($event->queue == 'default') {
        return false;
    }
});
Note: The queue property of the event will contain the value from the command line when the worker process was started, so if your worker was checking more than one queue (e.g. artisan queue:work --queue=high,default) then the value of queue in the event will be 'high,default'. As a precaution, you may instead want to explode the string by commas and check if default is in the list.
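A short sketch of that precaution, using the same event object as the listener above:

use Illuminate\Support\Facades\Queue;

Queue::looping(function (\Illuminate\Queue\Events\Looping $event) {
    // $event->queue can be a comma-separated list such as "high,default",
    // so check whether the paused queue is in the list instead of comparing
    // the whole string.
    if (in_array('default', explode(',', $event->queue))) {
        return false;
    }
});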
So for example, if you want to create a rudimentary circuit breaker to pause the mail queue when your mail service returns a maintenance error, then you can register a listener like this in your EventServiceProvider.php:
/**
 * Register any events for your application.
 *
 * @return void
 */
public function boot()
{
    parent::boot();

    Queue::looping(function (\Illuminate\Queue\Events\Looping $event) {
        if (($event->queue == 'mail') && (cache()->get('mail-queue-paused'))) {
            return false;
        }
    });
}
This assumes you have a mechanism somewhere else in your application to detect the appropriate situation and, in this example, that mechanism would need to assign a value to the mail-queue-paused key in the shared cache (because that's what my code is checking for). There are much more robust solutions, but setting a specific well-known key in the cache (and expiring it automatically) is simple and achieves the desired effect.
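For example, the detection code elsewhere in the application could set (and automatically expire) that key roughly like this - the 10-minute lifetime is an assumption on my part:

// Pause the mail queue for 10 minutes; the Queue::looping() callback above
// returns false while this key exists, and the queue resumes once it expires.
cache()->put('mail-queue-paused', true, now()->addMinutes(10));

// To resume early, simply forget the key:
// cache()->forget('mail-queue-paused');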
<?php

namespace App\Jobs;

use ...

class SendRequest implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    const REMOTE_SERVER_UNAVAILABLE = 'remote_server_unavailable';

    private $msg;
    private $retryAfter;

    public function __construct($msg)
    {
        $this->msg = $msg;
        $this->retryAfter = 10;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        try {
            // If we have already tried sending the request and got a RemoteServerException,
            // redispatch the job directly and return.
            if (Cache::get(self::REMOTE_SERVER_UNAVAILABLE)) {
                self::dispatch($this->msg)->delay(Carbon::now()->addMinutes($this->retryAfter));
                return;
            }

            // send request to remote server
            // ...
        } catch (RemoteServerException $e) {
            // Set a cache value that expires in 10 minutes, if it does not already exist.
            Cache::add(self::REMOTE_SERVER_UNAVAILABLE, '1', $this->retryAfter);

            // The remote service is undergoing maintenance, so redispatch a new delayed job.
            self::dispatch($this->msg)->delay(Carbon::now()->addMinutes($this->retryAfter));
        }
    }
}

Laravel Job deserialization wrong class

I've got a problem while using Laravel's job deserialization.
This is the Job class that is queued in the database:
class SendRatingEmail implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $order, $user;

    public function __construct(User $user, Order $order)
    {
        $this->order = $order;
        $this->user = $user;
    }

    public function handle()
    {
        if ($this->order->isRatedByUser($this->user->id)) {
            return;
        }

        Mail::to($this->user->email)->queue(new RatingEmail($this->order, $this->user));
    }
}
In the class Order.php, I dispatch this job like this:
class Order {
    function queueRating()
    {
        $when = Carbon::now()->addDays(env('ORDER_DAYS_RATING', 8));
        dispatch((new SendRatingEmail($this->buyer, $this))->delay($when));
    }
}
So the problem is in the job's handle() function, specifically the error is:
Call to undefined method Illuminate\Database\Query\Builder::isRatedByUser()
It seems as though Laravel gives me the wrong object; instead of App\Order it gives me the query builder. In the queueRating() function I have checked that the types given to the constructor are the expected types. I have even tested a workaround, which also didn't seem to work:
if ($this->order instanceof \Illuminate\Database\Query\Builder) {
    $this->order = $this->order->first();
}
I have also looked in the jobs table, and it seems the saved models are correct (App\Order).
Edit
Here is the code where the queueRating() function is called. The file is StripeController which handles credit card payments.
public function orderPaid($order) {
    $order->payment_done = 1;
    $order->save();

    $order->chat->open = 1;
    $order->chat->save();

    $order->queueRating();
}
I found the problem - as always, it was not where I was looking. As it turns out, the code is completely fine, but I forgot to restart the queue worker on the staging server, which meant that changes in the code base were not applied in the queue worker. So the application and the queue worker were running different versions of the code.
It is even stated in the Laravel Queue Documentation:
Remember, queue workers are long-lived processes and store the booted application state in memory. As a result, they will not notice changes in your code base after they have been started. So, during your deployment process, be sure to restart your queue workers.
I use the bug tracker Bugsnag, which shows the error and also the line numbers in the code. I had noticed that the line number where the error occurred didn't match the real line number, but couldn't figure out that this was what was causing the problem.
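For completeness, the restart the documentation refers to is done with the built-in artisan command, which signals all running workers to exit after their current job so they can be started again with the fresh code:
php artisan queue:restart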

Laravel job calling class from nowhere?

I have a job in my Laravel project (v4.2) which is used for inserting data into the database. The class is named "ProductUpdate". I use Amazon SQS as the queue service.
What confuses me now is that when I changed the code in the class "ProductUpdate",
the job seems to run using an old version of the class.
I even deleted all lines of code in the class, but the job is still able to run (it still inserts data).
Following is the job class.
The file of this class is at app/jobs/ProductUpdate.php
In my understanding, the job class is the only thing that gets called from the queue, so why is it still able to run when I have deleted all the code?
<?php

/**
 * Here is a class to run a queued item sent from SQS
 * Default method to use is fire()
 **/
class ProductUpdate
{
    public function fire($job, $data)
    {
        // Disable query log
        DB::connection()->disableQueryLog();

        // Set the job as a var so it will be used across functions
        $this->job = $job;

        $product = Product::find($productID);

        if ($product->product_type != 18) {
            // Call the updater from library
            $updater = App::make('Product\PriceUpdater');
            $updater->update($product);
        }

        // Done and delete
        $this->success();
    }

    private function success()
    {
        $this->job->delete();
    }

    private function fail($messages = array())
    {
        Log::error('Job processing fail', $messages);
        $this->job->delete();
    }
}
Your problem is related to caching.
Run this command in the terminal to remove all cached data:
php artisan cache:clear
Another way:
Illuminate\Cache\FileStore has the function flush, so you can also use it:
Cache::flush();
This link will also help you :)
