I am reading the Laravel documentation under the heading Architecture Concepts.
I am unable to understand the application and usage of the Console Kernel (not the HTTP Kernel).
However, I googled around and found these links:
https://laravel.com/api/5.2/Illuminate/Foundation/Console/Kernel.html
https://laravel.com/api/5.3/Illuminate/Contracts/Console/Kernel.html
But I can't understand anything from that API reference!
The HTTP Kernel is used to process requests that come in through the web (HTTP). Website requests, AJAX, that kind of stuff.
The Console Kernel is used when you interact with your application from the command line. If you use artisan, or when a scheduled job is processed, or when a queued job is processed, all of these actions go through the Console Kernel.
Basically, if you go through index.php, you'll be using the HTTP Kernel. Almost everything else will be using the Console Kernel.
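For context, the Console Kernel is the class you already have at app/Console/Kernel.php in a stock Laravel skeleton. A rough sketch (the emails:send command is a made-up example, not part of your app):

```php
<?php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    // Custom Artisan commands registered for the application.
    protected $commands = [
        // \App\Console\Commands\SomeCommand::class,
    ];

    // Scheduled tasks, run via `php artisan schedule:run` (usually from cron).
    protected function schedule(Schedule $schedule)
    {
        $schedule->command('emails:send')->daily();
    }
}
```

Every `php artisan ...` invocation, and every scheduled task, is bootstrapped through this class, the same way web requests are bootstrapped through app/Http/Kernel.php.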
Related
I am trying to implement an async job in Laravel, so I can send email (using a 3rd-party API) while letting the user continue in the frontend, so the request doesn't wait for the email to be sent.
I am using Laravel 6.18.
So I've created a generic job with php artisan make:job EmailJob.
I've set a 60-second sleep in the job as a test of a long email send.
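In case the setup matters, a minimal version of such a job might look like this (a sketch; the sleep stands in for the slow 3rd-party email call):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class EmailJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // Simulate a slow 3rd-party email API call.
        sleep(60);
    }
}
```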
then in my controller
EmailJob::dispatchAfterResponse();
return response()->json($obj,200);
In the Chrome console I can see there is a 200 response, but the request is still not resolved and no data is returned, so my ajax/axios request still waits for the full response; eventually it times out (60 seconds is too long) and produces an error in the frontend.
So the question is: how do I execute the job after the full response is sent?
You have to change the queue driver (it defaults to sync, which runs jobs inside the request) and run php artisan queue:work.
The following two resources will help you:
https://laravel-news.com/laravel-jobs-and-queues-101
https://laravel.com/docs/6.x/queues#connections-vs-queues
Just like terminable middleware, dispatchAfterResponse() will only work if the web server has FastCGI implemented.
You can go that way, or you can use a queue with the database driver, which is simpler to set up than installing Redis.
You would still need a running worker process to complete the jobs.
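For the database driver, the setup is roughly the following (a sketch against a standard Laravel 6 install; adjust the .env key to your config):

```shell
# Create and run the migration for the jobs table
# used by the database queue driver.
php artisan queue:table
php artisan migrate

# In .env, switch the driver from "sync" to "database":
#   QUEUE_CONNECTION=database

# Start a worker process that picks up dispatched jobs.
# In production this should be kept alive by Supervisor.
php artisan queue:work
```

With this in place, `EmailJob::dispatch()` returns immediately and the worker handles the 60-second job in the background.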
I need to run some code every time Laravel receives some sort of request.
This may be an HTTP request, console command or queue event.
The code needs to access the database.
While Service Providers may be appropriate in similar situations, I'm running into a number of problems executing DB-hitting code in them.
In earlier versions of Laravel it was possible to hook into App Lifecycle Events. This no longer seems to be possible.
Middleware also only applies to HTTP requests.
Any ideas on how to approach this are appreciated.
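One possibility (a sketch, untested against this specific setup): service providers do run for HTTP, console, and queue contexts alike, and the DB problems can often be sidestepped by deferring the DB-hitting code until the container has fully booted, via the booted() callback. The provider name and the app_events table below are made up for illustration:

```php
<?php

namespace App\Providers;

use Illuminate\Support\Facades\DB;
use Illuminate\Support\ServiceProvider;

class LifecycleServiceProvider extends ServiceProvider
{
    public function boot()
    {
        // Defer until the application has finished booting, so the
        // DB connection and configuration are fully available.
        $this->app->booted(function ($app) {
            // Hypothetical example of code that must run on every
            // entry point (HTTP request, artisan command, queue worker).
            DB::table('app_events')->insert([
                'type'       => 'boot',
                'created_at' => now(),
            ]);
        });
    }
}
```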
I am trying to set up an API system that synchronously communicates with a number of workers in Laravel. I use Laravel 5.4 and, if possible, would like to use its functionality whenever possible without too many plugins.
What I had in mind are two servers. The first one with a Laravel instance – let’s call it APP – receiving and answering requests from and to a user. The second one runs different workers, each a Laravel instance. This is how I see the workflow:
APP receives a request from user
APP puts request on a queue
Workers look for jobs on the queue and eventually find one.
Worker resolves job
Worker responds to APP, OR APP somehow finds out that the job is resolved
APP sends response to user
My first idea was to work with queues and beanstalkd. The problem is that this all seem to work asynchronously. Is there a way for the APP to wait for the result of one of the workers?
After some more research I stumbled upon Guzzle. Would this be a way to go?
EDIT: Some extra info on the project.
I am talking about a Restful API. E.g. a user sends a request in the form of "https://our.domain/article/1" and their API token in the header. What the user receives is a JSON formatted string like {"id":1,"name":"article_name",etc.}
The reason for using two sides is twofold. On one hand there is the use of different workers. On the other hand we want to keep the logic of the API as secure as possible: if an attack succeeds, only the APP side would be compromised.
Perhaps I am making things all too difficult with the queues and all that? If you have a better approach to meet the same ends, that would of course also help.
I know your question was how you could run this synchronously, but I think the problem you are facing is that you are not able to update the first server after the worker is done. You can achieve this with broadcasting.
I have done something similar with uploads in our application. We use a Redis queue, but beanstalkd will do the same job. On top of that we use Pusher, which uses sockets that the user can subscribe to, and it looks great.
User loads the web app, connecting to the pusher server
User uploads file (at this point you could show something to tell the user that the file is processing)
Worker sees that there is a file
Worker processes file
Worker triggers an event when done, or on failure
This event is broadcasted to the pusher server
Since the user is listening to the Pusher server, the event is received via JavaScript
You can now show a popup or update the table with javascript (works even if the user has navigated away)
We used pusher for this but you could use redis, beanstalk and many other solutions to do this. Read about Event Broadcasting in the Laravel documentation.
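A minimal broadcast event for the flow above might look like this (a sketch; the JobFinished class name and the jobs channel are assumptions, and broadcasting must be configured with a driver such as Pusher or Redis):

```php
<?php

namespace App\Events;

use Illuminate\Broadcasting\Channel;
use Illuminate\Broadcasting\InteractsWithSockets;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;
use Illuminate\Foundation\Events\Dispatchable;
use Illuminate\Queue\SerializesModels;

class JobFinished implements ShouldBroadcast
{
    use Dispatchable, InteractsWithSockets, SerializesModels;

    // Public properties are included in the broadcast payload.
    public $result;

    public function __construct($result)
    {
        $this->result = $result;
    }

    // The channel the frontend subscribes to (e.g. via Laravel Echo).
    public function broadcastOn()
    {
        return new Channel('jobs');
    }
}
```

The worker fires `event(new JobFinished($result));` when it finishes, and the frontend, subscribed to the jobs channel, receives the payload and updates the page.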
I have a need for part of my application to make calls to Reddit asynchronously from my core application's workflow. I have implemented a semi-workable solution by using a Reddit API library I have built here. For those that are unaware, Reddit manages authentication via OAuth and returns a bearer and a token for a particular user that expires in 60 minutes after generation.
I have opted to use cookies to store this authorization information for the mentioned time period, as seen in the requestRedditToken() method here. If a cookie is not found (i.e. it has expired) when another request to Reddit needs to be made, another reddit token is generated. This seems like it would work just fine.
What I am having trouble with is conceptualizing how cookies are handled when integrated with a daemonized queue worker; furthermore, I need to understand why these calls fail periodically.
The application I'm working with, as mentioned, makes calls to Reddit. These calls are created by a job class being handled: UpdateRedditLiveThreadJob, which you can see here.
These jobs are processed by a daemonized Artisan queue worker using Laravel Forge, you can see the details of the worker here. The queue driver in this case is Redis, and the workers are monitored by Supervisor.
Here is the intended workflow of my app:
An UpdateRedditLiveThreadJob is created and thrown into the queue to be handled.
The handle() method of the job is called.
A Reddit client is instantiated, and a reddit token is requested if a cookie doesn't exist.
My Reddit client successfully communicates with Reddit.
The Job is considered complete.
What is actually happening:
The job is created.
Handle is called.
Reddit client is instantiated; something odd generally happens here.
Reddit client tries to communicate, but gets a 401 response which produces an Exception. This is indicative of a failed authorization.
The task is considered 'failed' and loops back to step 2.
Here are my questions:
Why does this flow work for the first hour, and then collapse as described above, after presumably, the cookie has expired?
I've tried my best to understand how Laravel queues work, but I'm fundamentally having a hard time conceptualizing the different queue management options available: queue:listen, queue:work, a daemonized queue:work running under Supervisor, etc. Is my current queue infrastructure compatible with using cookies to manage tokens?
What adjustments do I need to make to my codebase to make the app function as intended?
How will my workflow handle multiple users, who each potentially have multiple cookies?
Why does the workflow magically start working again if I restart my queue worker?
Please let me know if I'm incorrectly describing anything here or need clarification, I've tried my best to explain the problem succinctly.
Your logic is incorrect. A queue job is in fact a PHP script running on the CLI. It has no interaction with a browser, and cookies are set in a browser; see this related thread for reference.
Since you're interacting with an API, it would make more sense to store the token as a simple variable in the job (or, better yet, in that wrapper) and re-use it within that job.
TL;DR: your wrapper is not an API client.
I know this is not a complete answer to all your questions, but it's a push in the right direction. Had I answered each question individually, it might not, in the end, have given any solution to your actual issue ;)
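For illustration, instead of a cookie the token could be cached server-side with a lifetime just under Reddit's 60-minute expiry. A sketch, assuming requestRedditToken() is the existing method from the wrapper (in older Laravel versions Cache::remember() takes the TTL in minutes, e.g. 59):

```php
use Illuminate\Support\Facades\Cache;

// Re-use a cached token for ~59 minutes, then fetch a fresh one.
// This works identically in HTTP requests and in daemonized queue
// workers, since the cache lives in Redis/files, not in a browser.
$token = Cache::remember('reddit_token', 59, function () {
    return $this->requestRedditToken();
});
```

This also answers the restart mystery: a cookie-based token simply never reaches the CLI worker, so whatever state made it work for the first hour was incidental to the process's lifetime.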
I have a Zend Application that is running fine.
I have created a Zend Queue script in my library to run some emailing process to members of the site.
The application has many Models that are working well, but when I try and initiate the application in my queue script, it doesn't run.
The only reason I can see for this is a custom helper that extends Zend_Controller_Action_Helper_Redirector. This redirector checks if https is required.
Without bootstrapping the application, the runaround I have to do to get my queue working is nigh impossible.
In the script I am calling from Supervisord, I set up my environment and call $application->bootstrap()->run();
I then call the script's class, but execution never gets past ->run().
The redirector helper calls exit(), which is why it's not working. I'd rewrite the endpoint to remove the call to redirect, or write a different endpoint for consumption by the CLI job handled by supervisord.
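If rewriting the endpoint isn't an option, one workaround (a sketch, untested; the helper class name is made up to stand in for the custom redirector described above) is to make the helper a no-op on the CLI:

```php
// Hypothetical custom redirector extending the ZF1 helper.
class My_Helper_SecureRedirector extends Zend_Controller_Action_Helper_Redirector
{
    public function gotoUrl($url, array $options = array())
    {
        // In a CLI context (e.g. the Supervisord-run queue script)
        // there is no browser to redirect, and the implicit exit()
        // would kill the worker before it processes anything.
        if (PHP_SAPI === 'cli') {
            return;
        }
        return parent::gotoUrl($url, $options);
    }
}
```

Guarding on PHP_SAPI keeps the https enforcement intact for real web requests while letting the bootstrapped application run to completion under Supervisord.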