Run database-hitting code on every boot in Laravel 5.5 - php

I need to run some code every time Laravel receives some sort of request.
This may be an HTTP request, console command or queue event.
The code needs to access the database.
While Service Providers may be the appropriate place in similar situations, I'm experiencing a bunch of problems running DB-hitting code in them.
In earlier versions of Laravel it was possible to hook into App Lifecycle Events. This does not seem to be possible anymore.
Middleware also only applies to HTTP requests.
Any ideas on how to approach this are appreciated.
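For illustration, here is a minimal sketch of the service-provider route the question refers to, deferring the query until the application has fully booted. The provider name and the settings table are hypothetical, and this is only one possible approach, not a confirmed fix for the problems mentioned above:

    <?php

    namespace App\Providers;

    use Illuminate\Support\Facades\DB;
    use Illuminate\Support\ServiceProvider;

    class BootstrapDataProvider extends ServiceProvider
    {
        public function boot()
        {
            // Runs for HTTP requests, console commands and queue workers alike,
            // but only after every provider has booted, so the DB connection
            // is fully configured before the query fires.
            $this->app->booted(function () {
                $settings = DB::table('settings')->get();
                // ... make $settings available to the rest of the app
            });
        }
    }

Whether this avoids the problems alluded to above depends on what exactly fails in the existing providers.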

Related

How can I make an asynchronous request with Laravel?

In my Laravel 5.4 web app a user can request report generation that takes a couple of minutes due to a large amount of data. Because of this, the user cannot work with the application anymore until the report has been generated. To fix this problem I read about queues in Laravel and moved my report generation code into a job class, but my app still blocks until the report is generated. How can I fix that?
To be absolutely clear I will sum up my problem:
User makes a request for report generation (my app completely blocks at this moment)
My app receives the POST request in routes and calls a function on the controller class.
The controller's function dispatches a job that should generate the report and put it into the client's web folder (a rough sketch follows below).
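A rough sketch of that dispatching step, assuming a hypothetical GenerateReport job class, could look like this; with a non-sync queue driver the dispatch call returns immediately:

    // In the controller handling the POST request
    // (method and job names are illustrative, assumes use App\Jobs\GenerateReport).
    public function generate(Request $request)
    {
        // dispatch() only pushes the job onto the queue; the actual report
        // generation happens later in a worker process.
        dispatch(new GenerateReport($request->user(), $request->all()));

        return response()->json(['status' => 'Report generation has been queued.']);
    }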
It sounds like you have already pretty much solved the problem by introducing a queue. Put the job in the queue, but don't keep track of its progress - allow your code to continue and return to the user. It should be possible to "fire-and-forget", and then either ask the user to check if the report is ready in a couple of minutes, or offer the ability to email it to them when it is completed.
By default, Laravel uses the sync queue driver. This driver executes the queued jobs in the same request as the one they are created in. So this won't make any difference.
You should take a look at the other drivers and use the Laravel queue worker background process to execute jobs, so they don't block the web request from completing.
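As a sketch of that, switching to the database driver in Laravel 5.4/5.5 might look roughly like this (any non-sync driver such as redis or beanstalkd works the same way):

    # .env - stop using the default sync driver
    QUEUE_DRIVER=database

    # one-time setup for the database driver
    php artisan queue:table
    php artisan migrate

    # run the worker as a separate background process (e.g. under Supervisor)
    php artisan queue:work

The web request then only records the job and returns, while the worker process picks it up and generates the report.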

Laravel Queue Worker, RabbitMQ and running jobs generated remotely

I'll preface this by admitting slight sleep-deprivation.
The setup is as follows:
API Endpoint (Server A) receives an incoming call, and adds this to a specific queue on the RabbitMQ Server (Server B).
RabbitMQ (Server B) is simply a RabbitMQ Queue Server. Nothing more, nothing less.
Laravel Installation (Server C) is our actual Laravel install, which is meant to look for jobs on specific queues and do things with them.
We have a RabbitMQ package in the Laravel install, which allows the use of the regular Laravel Queue mechanics over a RabbitMQ connection.
The issue I've come across is that we can spawn a worker for a queue - but since the jobs aren't generated by passing a $job class (the job content itself is most often a JSON array), the Laravel install has no idea what to do with the job.
So my question revolves mainly around how to approach a scenario like this. I'm thinking that using the Queue-functionality in Laravel won't do what I need it to do. Can you see an approach that I'm missing? Do I really need to spawn a daemon on a non-framework script to handle this?
Your input is much appreciated!
An alternative approach would be a listener on your Laravel application consuming the JSON messages and acting on them.
A queue listener can be created using a package such as https://github.com/bschmitt/laravel-amqp (a generic AMQP bridge for Laravel) or https://github.com/needle-project/laravel-rabbitmq (a bridge more specialised for RabbitMQ).
The queue consumer then reads the JSON payload, saves the payload as appropriate data, then decides what jobs to dispatch as a result within the Laravel application, as handled by the https://github.com/vyuldashev/laravel-queue-rabbitmq package.
The two applications then still communicate with plain JSON, and not the Laravel-oriented JSON containing the serialised job class.
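As a rough sketch of such a consumer, using php-amqplib directly (the library the bridge packages above build on) and typically run from an Artisan command so the framework is bootstrapped; the connection details, queue name and job class are illustrative assumptions:

    <?php
    // Sketch only: consume plain-JSON messages from RabbitMQ and turn them
    // into proper Laravel jobs inside the application on Server C.

    use PhpAmqpLib\Connection\AMQPStreamConnection;
    use PhpAmqpLib\Message\AMQPMessage;

    $connection = new AMQPStreamConnection('server-b-host', 5672, 'guest', 'guest');
    $channel = $connection->channel();
    $channel->queue_declare('incoming-events', false, true, false, false);

    $callback = function (AMQPMessage $message) {
        // The payload is plain JSON, not a serialised Laravel job.
        $payload = json_decode($message->body, true);

        // Validate / persist the data, then dispatch a real job inside
        // the Laravel application to do the heavy lifting.
        dispatch(new \App\Jobs\HandleIncomingEvent($payload));
    };

    // no_ack = true keeps the sketch short; use manual acks in production.
    $channel->basic_consume('incoming-events', '', false, true, false, false, $callback);

    while (count($channel->callbacks)) {
        $channel->wait();
    }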
The solution is indeed to replicate the job class onto the application issuing the job. That copy does not need every dependency the job requires to actually run, since the publishing side only serialises the job.

Symfony/GuzzleHttp: Limiting API calls across multiple consumers/instances

I've been working on a project for a while that fetches data from an API and processes that data locally for various uses. Currently, a consumer picks up JSON objects from the message queue, which it uses to trigger a matching Symfony command. The rate limiting is built into this one consumer, is fairly simple and adjusts itself automatically to status responses from the API. The problem is that, the way it is set up, it cannot run in parallel, and if there is a major update to the versioned static data on the API, all processing halts while it caches the new static data.
I looked at using the rabbitmq-bundle Symfony bundle and converting the commands into separate consumers with their own channels so that they can run in parallel and no longer block each other; however, this comes with a couple of issues I'm stuck on how to handle.
The first is that I still need to manage limiting the API calls across all the consumers. I have a wrapper for Guzzle that could, in theory, use a simple file to manage the number of calls across all instances of it. I looked at an existing token bucket library, but setting it up to work in Symfony looks problematic, as each consumer could potentially reset the number of tokens if the consumer is restarted, so... not sure where to go with that.
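One hedged way around the restart problem is to keep the limiter state in a shared store such as Redis rather than in each consumer process, so a restarted consumer cannot reset it. A minimal fixed-window counter (not a full token bucket; all names and limits below are illustrative) could look like this:

    <?php
    // Sketch: shared, restart-safe rate limiter backed by Redis (phpredis).

    class SharedRateLimiter
    {
        private $redis;
        private $limit;
        private $window;

        public function __construct(\Redis $redis, $limit = 90, $window = 60)
        {
            $this->redis = $redis;
            $this->limit = $limit;   // max calls per window, across all consumers
            $this->window = $window; // window length in seconds
        }

        public function acquire()
        {
            // All consumers increment the same key for the current window,
            // so the count survives any single consumer being restarted.
            $key = 'api:calls:' . floor(time() / $this->window);

            $count = $this->redis->incr($key);
            $this->redis->expire($key, $this->window * 2);

            return $count <= $this->limit;
        }
    }

    // In the Guzzle wrapper, before each API call:
    // while (! $limiter->acquire()) { usleep(250000); } // wait for the next window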
The second is that some consumers may hit data from the main API for which we do not yet have the matching version of the static data. If this happens, it needs to trigger the related consumers, but only if there isn't already a trigger in each queue... A possible solution I can see for this is to record the latest requested version in a file at the time a message is published to update it, and have the consumer wait for the data to be available locally. Again, I'm kind of lost about how best to handle this.

Is it possible to run queues working synchronously with Laravel

I am trying to set up an API system that synchronously communicates with a number of workers in Laravel. I use Laravel 5.4 and would like to use its built-in functionality wherever possible, without too many plugins.
What I had in mind are two servers. The first one with a Laravel instance – let’s call it APP – receiving and answering requests from and to a user. The second one runs different workers, each a Laravel instance. This is how I see the workflow:
APP receives a request from user
APP puts request on a queue
Workers look for jobs on the queue and eventually find one.
Worker resolves job
Worker responds to APP, OR APP somehow finds out that the job is resolved
APP sends response to user
My first idea was to work with queues and beanstalkd. The problem is that this all seems to work asynchronously. Is there a way for the APP to wait for the result of one of the workers?
After some more research I stumbled upon Guzzle. Would this be a way to go?
EDIT: Some extra info on the project.
I am talking about a Restful API. E.g. a user sends a request in the form of "https://our.domain/article/1" and their API token in the header. What the user receives is a JSON formatted string like {"id":1,"name":"article_name",etc.}
The reason for using two sides is twofold. On the one hand there is the use of different workers. On the other hand we want to keep all the logic of the API as secure as possible: if a hack attack is made, only the APP side would be compromised.
Perhaps I am making things all too difficult with the queues and all that? If you have a better approach to meet the same ends, that would of course also help.
I know your question was how you could run this synchronously, but I think the problem you are facing is that you are not able to update the first server after the worker is done. The way you could achieve this is with broadcasting.
I have done something similar with uploads in our application. We use a Redis queue, but beanstalkd will do the same job. On top of that we use Pusher, which uses sockets that the user can subscribe to, and it looks great.
User loads the web app, connecting to the pusher server
User uploads file (at this point you could show something to tell the user that the file is processing)
Worker sees that there is a file
Worker processes file
Worker fires an event when done or on failure
This event is broadcasted to the pusher server
Since the user is listening to the pusher server the event is received via javascript
You can now show a popup or update the table with javascript (works even if the user has navigated away)
We used Pusher for this, but you could use Redis, beanstalkd and many other solutions to do this. Read about Event Broadcasting in the Laravel documentation.
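As a rough sketch of the "worker fires an event when done" step, a broadcastable event in Laravel 5.4 could look like the following; the event and channel names are illustrative assumptions:

    <?php

    namespace App\Events;

    use Illuminate\Broadcasting\Channel;
    use Illuminate\Contracts\Broadcasting\ShouldBroadcast;
    use Illuminate\Queue\SerializesModels;

    class JobProcessed implements ShouldBroadcast
    {
        use SerializesModels;

        public $userId;
        public $result;

        public function __construct($userId, $result)
        {
            $this->userId = $userId;
            $this->result = $result;
        }

        public function broadcastOn()
        {
            // The user's JavaScript subscribes to this channel via Pusher / Laravel Echo.
            return new Channel('user.' . $this->userId);
        }
    }

    // At the end of the worker job's handle() method:
    // event(new \App\Events\JobProcessed($this->userId, $resultPayload));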

How can I use & set cookies whilst inside a Laravel queued Job, and why is my current solution failing?

I have a need for part of my application to make calls to Reddit asynchronously from my core application's workflow. I have implemented a semi-workable solution by using a Reddit API library I have built here. For those that are unaware, Reddit manages authentication via OAuth and returns a bearer and a token for a particular user that expires in 60 minutes after generation.
I have opted to use cookies to store this authorization information for the mentioned time period, as seen in the requestRedditToken() method here. If a cookie is not found (i.e. it has expired) when another request to Reddit needs to be made, another reddit token is generated. This seems like it would work just fine.
What I am having trouble with is conceptualizing how cookies are handled when integrated with a daemonized queue worker; furthermore, I need to understand why these calls are failing periodically.
The application I'm working with, as mentioned, makes calls to Reddit. These calls are created by a job class being handled: UpdateRedditLiveThreadJob, which you can see here.
These jobs are processed by a daemonized Artisan queue worker using Laravel Forge, you can see the details of the worker here. The queue driver in this case is Redis, and the workers are monitored by Supervisor.
Here is the intended workflow of my app:
An UpdateRedditLiveThreadJob is created and thrown into the queue to be handled.
The handle() method of the job is called.
A Reddit client is instantiated, and a reddit token is requested if a cookie doesn't exist.
My Reddit client successfully communicates with Reddit.
The Job is considered complete.
What is actually happening:
The job is created.
Handle is called.
Reddit client is instantiated; something odd generally happens here.
Reddit client tries to communicate, but gets a 401 response which produces an Exception. This is indicative of a failed authorization.
The task is considered 'failed' and loops back to step 2.
Here are my questions:
Why does this flow work for the first hour and then collapse as described above, presumably after the cookie has expired?
I've tried my best to understand how Laravel Queues work, but I'm fundamentally having a hard time conceptualizing the different types of queue management options available: queue:listen, queue:work, a daemonized queue:work running under Supervisor, etc. Is my current queue infrastructure compatible with using cookies to manage tokens?
What adjustments do I need to make to my codebase to make the app function as intended?
How will my workflow handle multiple users, who each potentially have multiple cookies?
Why does the workflow magically start working again if I restart my queue worker?
Please let me know if I'm incorrectly describing anything here or need clarification, I've tried my best to explain the problem succinctly.
Your logic is incorrect. A queue job is in fact a PHP script running on the CLI. It has no interaction with a browser. Cookies are set in a browser; see this related thread for reference.
Since you're interacting with an API, it would make more sense to set the token as a simple variable in the job (or better yet, in that wrapper) and then re-use it within that job.
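As a hedged sketch of that suggestion, the token could be kept in Laravel's cache (which a CLI worker can read, unlike a browser cookie) for slightly less than Reddit's 60-minute lifetime; the cache key and method placement are illustrative:

    use Illuminate\Support\Facades\Cache;

    // Inside the Reddit client wrapper.
    protected function getToken()
    {
        // Cache for 55 minutes, comfortably inside the 60-minute token lifetime.
        return Cache::remember('reddit.token', 55, function () {
            // requestRedditToken() hits Reddit's OAuth endpoint and
            // returns the fresh bearer token.
            return $this->requestRedditToken();
        });
    }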
TL;DR: your wrapper is not an API client.
I know this is not a complete answer to all your questions, but it's a push in the right direction, because even if I had answered all of them it might not, in the end, have given you a solution to your issues ;)
