Laravel queue and Prometheus - PHP

I am working on a project which runs Laravel queues. We need to expose several metrics to the Prometheus server. Given that Prometheus uses a pull model, is there a way for a Laravel queue worker to also run a web server?
I know there is the Prometheus push gateway, but I am looking for other alternatives.
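One alternative to the push gateway: the queue worker does not need to serve HTTP itself if metrics are written to shared storage that a normal web route can read. A hedged sketch, assuming the promphp/prometheus_client_php package with its Redis adapter (hosts and metric names are illustrative):

use Prometheus\CollectorRegistry;
use Prometheus\RenderTextFormat;
use Prometheus\Storage\Redis;

// Inside the worker process (e.g. a queue event listener): write metrics
// to Redis, which both the worker and the web app can reach.
$registry = new CollectorRegistry(new Redis(['host' => '127.0.0.1']));
$registry
    ->getOrRegisterCounter('app', 'jobs_processed_total', 'Jobs processed', ['queue'])
    ->incBy(1, ['default']);

// In routes/web.php of the regular web app: expose the same registry for
// Prometheus to scrape at /metrics.
Route::get('/metrics', function () {
    $registry = new CollectorRegistry(new Redis(['host' => '127.0.0.1']));
    $renderer = new RenderTextFormat();

    return response($renderer->render($registry->getMetricFamilySamples()))
        ->header('Content-Type', RenderTextFormat::MIME_TYPE);
});

This way Prometheus scrapes the existing web server, and the worker never has to open a port of its own.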

Related

Creating scheduled jobs in a Multi-Tenant application

I am building a multi-tenant web application using Laravel/PHP that will eventually be hosted on AWS as SaaS. I have around 15-20 different background jobs that need scheduling for each tenant, and the jobs need to be fired every 5 minutes as well. Thus the number of jobs that need to be fired for 100 tenants would be around 2,000. I am left with 2 challenges in achieving this:
Is there a cloud solution that distributes and manages the load of the scheduled jobs automatically?
If one is out there, how can we create those 15+ scheduled jobs on the fly? Is there an API available?
Looking for your assistance
Finally, I have found a solution to my problem.
We cannot scale the background jobs in the way I want. It required me to look into the solution from a completely different angle.
The ideal solution to my problem is that I should generate SQS messages (with a payload describing the tenant ID, the job that needs to be executed, and any additional parameters) corresponding to the number of tenants on a set interval, and queue them.
For example, if I have 100 tenants and I want to run "Job 1" every hour, the main application will generate 100 SQS messages and queue them in a particular SQS queue every hour. It will do the same for all 15 different jobs I have per tenant.
On the other end, a scalable AWS Lambda function listening to the SQS queue will pick up the payload and execute the intended task based on the data being carried by the payload.
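The message-generation side might look like the following; a hedged sketch using the AWS SDK for PHP, where the Tenant model, region, and queue URL are assumptions for illustration:

use Aws\Sqs\SqsClient;

// Generate one SQS message per tenant on a set interval.
$sqs = new SqsClient(['region' => 'us-east-1', 'version' => '2012-11-05']);

foreach (Tenant::all() as $tenant) {
    $sqs->sendMessage([
        'QueueUrl'    => 'https://sqs.us-east-1.amazonaws.com/.../jobs', // hypothetical
        'MessageBody' => json_encode([
            'tenant_id' => $tenant->id,
            'job'       => 'Job1',
            'params'    => [],
        ]),
    ]);
}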
But unfortunately, my expertise lies in PHP/Laravel, which is not yet supported on the AWS Lambda stack. Hence I figured out the following workaround.
I built a Docker image with my PHP/Laravel application and placed it in Amazon ECS (EC2 Container Service). I still have the AWS Lambda function in place, but this time it acts as a trigger for my Docker containers: the Lambda picks up an SQS message, processes the payload, and spawns a container on ECS based on my Docker image. I got some of the ideas to arrive at this solution from the following article:
https://aws.amazon.com/blogs/compute/better-together-amazon-ecs-and-aws-lambda/
Laravel has an option to schedule tasks/jobs:
Refer: https://laravel.com/docs/6.x/scheduling
so you can keep your clients' jobs in your database and then do something like the examples below:
Scheduling Queued Jobs
The job method may be used to schedule a queued job. This method provides a convenient way to schedule jobs without using the call method to manually create Closures to queue the job:
$schedule->job(new ClientJob)->everyFiveMinutes();
// Dispatch the job to the "clientjob" queue...
$schedule->job(new ClientJob, 'clientjob')->everyFiveMinutes();
or
Scheduling Shell Commands
The exec method may be used to issue a command to the operating system:
$schedule->exec('node /home/forge/script.js')->everyFiveMinutes();
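For the multi-tenant case above, the scheduler can loop over the tenants stored in the database and dispatch one queued job per tenant. A hedged sketch for app/Console/Kernel.php, where the Tenant model and ClientJob class are illustrative names:

use App\Jobs\ClientJob;
use App\Tenant;
use Illuminate\Console\Scheduling\Schedule;

protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        // One queued job per tenant, pushed onto the "clientjob" queue.
        Tenant::query()->each(function (Tenant $tenant) {
            ClientJob::dispatch($tenant)->onQueue('clientjob');
        });
    })->everyFiveMinutes();
}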

Using laravel queue with cron jobs on shared hosting

At the moment I have a shared hosting server.
In my app I want to use Laravel's queue system, but I can't keep the command php artisan queue:work running because I can't install Supervisor.
With a bit of effort I could move my app to a VPS, but I don't have much experience with servers, and I'm a little scared that my app will be offline for a long time.
Considering my lack of experience on the server side, I have these questions:
Is it OK to use Laravel queues with cron jobs? Can it break in any way?
For this problem alone, should I upgrade to a VPS, or should I remain on this shared hosting server (I have SSH access here)?
Quick answer: you should not use Laravel queues without a process monitor such as Supervisor.
It all depends on what you want to achieve, but an alternative to queues would be using the Laravel scheduler: you can trigger the scheduler with a cron task (every minute, for example) and dispatch jobs easily.
And if you really want to use the queues, a solution could be to add your jobs to the queue and process them with a cron task that runs php artisan queue:work every minute. But I would recommend the previous solution.
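The cron entries for the two approaches might look like this; a hedged sketch, assuming the app lives in /home/user/app and that your Laravel version supports the --stop-when-empty flag (which makes the worker exit once the queue is drained instead of running forever):

# Scheduler approach: trigger Laravel's scheduler every minute.
* * * * * cd /home/user/app && php artisan schedule:run >> /dev/null 2>&1

# Queue approach: drain the queue once per minute, then exit.
* * * * * cd /home/user/app && php artisan queue:work --stop-when-empty >> /dev/null 2>&1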

Laravel Queue Worker, RabbitMQ and running jobs generated remotely

I'll preface this by admitting slight sleep-deprivation.
The setup is as follows:
API Endpoint (Server A) receives an incoming call, and adds this to a specific queue on the RabbitMQ Server (Server B).
RabbitMQ (Server B) is simply a RabbitMQ Queue Server. Nothing more, nothing less.
Laravel Installation (Server C) is our actual Laravel install, which is meant to look for jobs on specific queues and do things with them.
We have a RabbitMQ package in the Laravel install, which allows the use of the regular Laravel Queue mechanics over a RabbitMQ connection.
The issue I've come across is that we can spawn a worker for a queue, but since the jobs aren't generated by passing a $job class (the job content itself is most often a JSON array), the Laravel install has no idea what to do with the job.
So my question revolves mainly around how to approach a scenario like this. I'm thinking that using the queue functionality in Laravel won't do what I need it to do. Can you see an approach that I'm missing? Do I really need to spawn a daemon with a non-framework script to handle this?
Your input is much appreciated!
An alternative approach would be a listener on your Laravel application consuming the JSON messages and acting on those.
A queue listener can be created using a package such as https://github.com/bschmitt/laravel-amqp (a generic AMQP bridge for Laravel) or https://github.com/needle-project/laravel-rabbitmq (a bridge more specialised for RabbitMQ).
The queue consumer then reads the JSON payload, saves the payload as appropriate data, then decides what jobs to dispatch as a result within the Laravel application, as handled by the https://github.com/vyuldashev/laravel-queue-rabbitmq package.
The two applications then still communicate with plain JSON, not the Laravel-oriented JSON containing the serialised job class.
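A consumer along these lines could be run from an artisan command; a hedged sketch using php-amqplib directly, where the host, credentials, queue name, and the ProcessIncoming job class are all assumptions:

use App\Jobs\ProcessIncoming;
use PhpAmqpLib\Connection\AMQPStreamConnection;

// Consume plain-JSON messages and decide which Laravel job to dispatch.
$connection = new AMQPStreamConnection('rabbitmq.internal', 5672, 'guest', 'guest');
$channel = $connection->channel();

$channel->basic_consume('incoming', '', false, false, false, false, function ($message) {
    $payload = json_decode($message->body, true);

    // Map the plain payload onto a job within the Laravel application.
    ProcessIncoming::dispatch($payload);

    // Acknowledge via the delivering channel (works across php-amqplib versions).
    $message->delivery_info['channel']->basic_ack($message->delivery_info['delivery_tag']);
});

// Block until the channel has no more consumers.
while (count($channel->callbacks)) {
    $channel->wait();
}

$channel->close();
$connection->close();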
The solution is indeed to replicate the job class onto the application issuing the job. That code does not need every dependency the job requires to actually function, since the issuing side only serializes the job when pushing it.

Laravel queue external server

I have the following servers configured:
App Server is running LAMP with Laravel 5.1
Queue Server is running Beanstalkd and Supervisord
I want to be able to send jobs from Laravel on the App Server to the Queue Server, which will simply run a DB insert. I was hoping to use Laravel's job queueing to do this, but from my understanding it would then require me to have the same Laravel project on both hosts? It seems like it sends the class to execute, not the data itself. Perhaps I have a bad understanding of how Beanstalkd works? Should I be using something like RabbitMQ instead?
Any help would be greatly appreciated!
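One way to avoid duplicating the Laravel project is to push a raw JSON payload rather than a serialized job class, and consume it on the queue server with a plain Beanstalkd client. A hedged sketch, where the tube name, payload fields, and connection details are illustrative:

// On the App Server (Laravel): push plain JSON instead of a job class.
Queue::pushRaw(json_encode(['table' => 'events', 'row' => $data]), 'inserts');

// On the Queue Server: a standalone PHP worker using pda/pheanstalk (v4 API),
// with no Laravel installation required.
use Pheanstalk\Pheanstalk;

$pheanstalk = Pheanstalk::create('127.0.0.1');
$pheanstalk->watch('inserts');

while ($job = $pheanstalk->reserve()) {
    $payload = json_decode($job->getData(), true);
    // ... run the DB insert using $payload ...
    $pheanstalk->delete($job);
}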

Laravel 4 and Beanstalkd

I now have a stable Beanstalkd and Laravel 4 Queue setup running on one machine. My question is, how can I install the Laravel 4 workers on a second machine and make them listen to my Beanstalkd? Maybe a very obvious question to some but I can't figure it out. I noticed there was a connection field in the php artisan queue:listen command. Do I have to use that?
how can I install the Laravel 4 workers on a second machine and make them listen to my Beanstalkd?
You'll need to have a working instance of your Laravel application on the same server as the listener/workers.
This means deploying your application both to the web server and to the server that is listening for jobs.
Then, on the listening server, you can call php artisan queue:listen in order to listen for new jobs and create a worker to handle the job.
I noticed there was a connection field in the php artisan queue:listen command. Do I have to use that?
Beyond the question above, and as with most artisan commands, you will likely also need to define which environment the queue:listen command should use:
$ php artisan queue:listen --env=production
In this way, your Laravel app that handles the workers (the app on the listening server) will know which configuration to use, including which database credentials. This also likely means that both the web server and your job/listening server need access to your database.
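For reference, pointing the listening machine at the shared Beanstalkd instance is just queue configuration; a hedged sketch of app/config/queue.php in Laravel 4, where the host address is illustrative:

// app/config/queue.php on the listening server (Laravel 4).
return array(
    'default' => 'beanstalkd',

    'connections' => array(
        'beanstalkd' => array(
            'driver' => 'beanstalkd',
            'host'   => '10.0.0.5', // the shared Beanstalkd server
            'queue'  => 'default',
            'ttr'    => 60,
        ),
    ),
);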
Lastly, you could also create 2 separate Laravel applications: one for your web application and one purely for processing jobs. Then each could have its own configuration, but you'd have 2 (probably smaller) code bases instead of 1.
In that regard, do whatever works best for your situation.
