Make a notification response when the job is completed - php

I am using Symfony 3 with the Pheanstalk bundle and Doctrine. I create an event which sends data to Beanstalk. The other Symfony app, on a different server, performs the job and updates the notification status on the first app to completed. How can I check when the status is updated and then show an alert like this:
http://byrobin.nl/store/wp-content/uploads/sites/4/2016/03/local.png
I could create a command with an infinite loop that checks for status updates, or maybe a listener on preUpdate? I also have the same problem with running the command that checks for and executes beanstalk jobs. In dev mode I run it by hand, but when I tried an infinite loop like while(true) it filled my buffer and crashed. I was thinking of a cron job that runs every minute or two? What is the best solution for these two problems? Any advice?

1) WebSockets would be a good fit, as they don't involve a while(true) loop. A WebSocket can be opened by the frontend after a task has been submitted for processing. Once the job has finished processing, it notifies the server side of the WebSocket, which relays the info back over the socket to the frontend.
2) Another option is to submit a message and, in its params, name an anonymous tube (make a unique name based on time and some prefix) where the worker should put the answer. Before submitting the job you subscribe to that anonymous tube on beanstalkd, then submit the job; when the job finishes, the worker posts the answer to the tube. Since there is already a subscriber, it reserves the answer job, deals with it and deletes it, and the tube is removed too.
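A minimal sketch of option 2 using Pheanstalk (tube and payload names are illustrative; method names differ slightly between Pheanstalk versions, e.g. reserveWithTimeout() is reserve($timeout) on older releases):

<?php
use Pheanstalk\Pheanstalk;

$pheanstalk = Pheanstalk::create('127.0.0.1'); // new Pheanstalk('127.0.0.1') on v3

// Unique "reply" tube for this request.
$replyTube = 'reply-' . uniqid('', true);

// Subscribe to the reply tube *before* submitting the job so the answer cannot be missed.
$pheanstalk->watch($replyTube);
$pheanstalk->ignore('default');

// Submit the job and tell the worker where to post the result.
$pheanstalk->useTube('jobs');
$pheanstalk->put(json_encode([
    'task'       => 'process-something',  // hypothetical payload
    'reply_tube' => $replyTube,
]));

// Block until the worker answers, or give up after 30 seconds.
$job = $pheanstalk->reserveWithTimeout(30);
if ($job !== null) {
    $result = json_decode($job->getData(), true);
    $pheanstalk->delete($job);  // answer handled; the empty tube disappears with it
    // ... mark the notification as completed / alert the user with $result
}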

Related

How to properly use AWS SQS

I was looking for a good way to manage a lot of background tasks and found AWS SQS.
My software is coded in PHP. To complete a background task, the worker must be a CLI PHP application.
Here is how I am thinking of accomplishing this with AWS SQS:
Client creates a message (message = task)
Message is added to a MySQL DB
A cron job checks the MySQL DB for messages and adds them to the SQS queue
The SQS queue daemon listens to the queue for messages and sends an HTTP POST request to the worker when a message is received
The worker receives the POST request and forks a PHP shell_exec with parameters to do the work
It's necessary to insert messages in MySQL because they are scheduled to be completed at a certain time
A little overcomplicated.
I need to know the best way to do this.
I would use AWS Lambda, with an SQS trigger to asynchronously process messages dropped in the queue.
First, your application can post messages directly to SQS; there is no need to first insert the message into MySQL and have a separate daemon feed the queue.
Second, you can write an AWS Lambda function in PHP; see https://aws.amazon.com/blogs/apn/aws-lambda-custom-runtime-for-php-a-practical-example/
Third, I would wire the Lambda function to the queue, following this documentation: https://aws.amazon.com/blogs/apn/aws-lambda-custom-runtime-for-php-a-practical-example/
This will simplify your architecture (fewer moving parts, less code) and make it more scalable.
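A minimal sketch of posting directly to SQS with the AWS SDK for PHP (queue URL, region and payload are placeholders); note that DelaySeconds tops out at 900 seconds, so schedules further in the future still need their own scheduling layer:

<?php
require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;

$sqs = new SqsClient([
    'region'  => 'eu-west-1',
    'version' => 'latest',
]);

$sqs->sendMessage([
    'QueueUrl'     => 'https://sqs.eu-west-1.amazonaws.com/123456789012/my-tasks', // placeholder
    'MessageBody'  => json_encode(['task' => 'send-report', 'id' => 42]),
    // Optional: SQS can postpone delivery, but only up to 900 seconds.
    'DelaySeconds' => 60,
]);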

How would I run an Amazon SQS queue? Is it a cron job?

I understand the basics of Amazon SQS, yet I'm still confused about how it runs. Is it an infinitely running function that polls for messages and deals with them? How would I achieve that in PHP?
What I have in mind is a cron job that triggers the polling and processes the messages. Is my understanding right?
There's more than one answer to this.
Yes, you could have cron poll regularly for new queue items. You could have a daemon running indefinitely (likely monitored by something like supervisor) that continues to poll in a loop.
There are also SQS triggers, where a new SQS item can automatically initiate something. Multiple options are available: new queue items can produce an SNS notification, which could trigger an HTTP POST to a URL. They can also trigger a Lambda function.
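A minimal sketch of the polling approach (run under cron or a Supervisor-managed daemon); the queue URL is a placeholder, and WaitTimeSeconds enables long polling so the loop is not busy-waiting:

<?php
require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;

$sqs = new SqsClient(['region' => 'eu-west-1', 'version' => 'latest']);
$queueUrl = 'https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue'; // placeholder

while (true) {
    $result = $sqs->receiveMessage([
        'QueueUrl'        => $queueUrl,
        'WaitTimeSeconds' => 20, // long polling: each call blocks up to 20s
    ]);

    foreach ($result->get('Messages') ?? [] as $message) {
        $payload = json_decode($message['Body'], true);
        // ... process $payload ...

        // Delete only after successful processing; otherwise the message
        // reappears once its visibility timeout expires.
        $sqs->deleteMessage([
            'QueueUrl'      => $queueUrl,
            'ReceiptHandle' => $message['ReceiptHandle'],
        ]);
    }
}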

How to batch-read SQS messages inside an Elastic Beanstalk worker

Our web application, running on Elastic Beanstalk, logs the activity of incoming requests to a database. We want to decouple the DB logging from the request processing path so that response time can be sped up. We decided to use SQS queues and an Elastic Beanstalk worker. The idea is to queue the logging event to SQS and have the worker receive the events and do the logging to the DB.
Now the need is to optimize the DB logging operation and avoid creating one connection per message in the queue. From my understanding, the SQS daemon calls the worker for each message. Is there a way to have the daemon send messages in a batch, so that there is only one message whose body has the contents of all messages?
Or do we need to use a secondary queue, or write a custom SQS message aggregator that processes n messages from the queue and then sends one batch message to another queue, which then gets written to the DB once?
We are using PHP and MySQL.
From my experience, by default you cannot. The daemon calls your application once per message.
What you could do is cache the messages locally in a file (assuming you are using a single instance rather than auto scaling), with a locking scheme for multi-processing, and then use the Elastic Beanstalk worker's cron scheduling to read the file and do your DB operations at a fixed interval. That way, you can do the DB operation in a batch.
If you want to use auto scaling with multiple instances, you might need another messaging layer, which is wasteful compared with the other option: write your own code, based on the AWS SDK, to receive/delete from SQS in a batch and then update your database.
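A minimal sketch of that second option (queue URL, table name and credentials are placeholders); it pulls up to 10 messages per call, which is the SQS per-request maximum, writes them in one transaction, and acknowledges the whole batch with a single delete call:

<?php
require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;

$sqs = new SqsClient(['region' => 'us-east-1', 'version' => 'latest']);
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/request-log'; // placeholder
$pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret');    // placeholder credentials

$result = $sqs->receiveMessage([
    'QueueUrl'            => $queueUrl,
    'MaxNumberOfMessages' => 10,  // SQS hard limit per receive call
    'WaitTimeSeconds'     => 20,
]);
$messages = $result->get('Messages') ?? [];

if ($messages) {
    // One connection, one transaction for the whole batch.
    $pdo->beginTransaction();
    $stmt = $pdo->prepare('INSERT INTO request_log (payload, created_at) VALUES (?, NOW())');
    foreach ($messages as $message) {
        $stmt->execute([$message['Body']]);
    }
    $pdo->commit();

    // Acknowledge the whole batch in a single call.
    $entries = [];
    foreach ($messages as $message) {
        $entries[] = [
            'Id'            => $message['MessageId'],
            'ReceiptHandle' => $message['ReceiptHandle'],
        ];
    }
    $sqs->deleteMessageBatch(['QueueUrl' => $queueUrl, 'Entries' => $entries]);
}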

Laravel background jobs best practice with Amazon MWS

I'm working with an external feed API, Amazon MWS, from which I get all products for a specific seller. Now, say I want to refresh these products by two methods: automatically and manually. The automatic approach would refresh the store every 12 hours, for example, and the manual approach would let the seller click a refresh link and then display a progress bar until the job is done.
So, how can I implement these two methods? I'm totally confused between jobs, queues and task scheduling, and whether to use beanstalkd or Redis.
I just want somebody to show me how to manage all of that and the best practice for this situation... Thanks, Artisans :)
I believe it is not possible to obtain inventory information for just one SKU from the MWS API. When we had a similar requirement, we created a PHP script that connects to the MWS Reports API, specifically using the _GET_MERCHANT_LISTINGS_ALL_DATA_ report, to download the report and insert/update into a MySQL database. We did not use Redis or message queuing because the MWS Reports API works in such a way that you request a report and poll the report processing status; when it succeeds, you download the report and process it into the database. We have been running this PHP script with cron every 30 minutes.
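To illustrate that flow, here is a rough sketch of the request/poll/download cycle. $reportsClient is a stand-in for whichever MWS Reports client library you use (its method names are hypothetical); the report type and the _DONE_ status come from the MWS Reports API, and the table/credentials are placeholders:

<?php
// Cron entry: */30 * * * * php /path/to/refresh_listings.php

// $reportsClient is a hypothetical wrapper around the MWS Reports API.
$requestId = $reportsClient->requestReport('_GET_MERCHANT_LISTINGS_ALL_DATA_');

// MWS generates reports asynchronously, so poll the processing status.
do {
    sleep(60);
    $status = $reportsClient->getReportRequestStatus($requestId);
} while ($status !== '_DONE_');

// Download the finished (tab-delimited) report and upsert into MySQL.
$rows = $reportsClient->getReportRows($requestId);

$pdo  = new PDO('mysql:host=localhost;dbname=catalog', 'catalog_user', 'secret'); // placeholders
$stmt = $pdo->prepare(
    'INSERT INTO products (sku, title, price) VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE title = VALUES(title), price = VALUES(price)'
);
foreach ($rows as $row) {
    $stmt->execute([$row['seller-sku'], $row['item-name'], $row['price']]);
}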
For the automatic refresh you can run the task scheduler (system to system); the user is not involved, so it is a perfect case for scheduling a task.
The refresh button, on the other hand, would be a job. Take into account that a job can be queued or not queued, by either implementing ShouldQueue or not. If you would like the work to be done in the background, queue the job so it runs asynchronously.
Then set up an event that fires when the job has completed, or when the database is updated, and broadcast a notification informing the user or their team that the update has been completed.
So let's take it step by step: you can make a job with an artisan command, and dispatch this job from your controller.
Write your business logic in the job and implement ShouldQueue; the job does not need a return statement. Then create the queue table with an artisan command and change the queue driver in .env to database (you can get quite a long way with the database queue, so you don't have to use beanstalkd, and it is a good way to practise queues). You should then run queue:listen. Just a note: when you use queue:listen, the listener keeps running until you close the terminal; when you open a terminal again, run queue:restart before running listen.
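A minimal sketch of such a queued job (class and property names are illustrative); the artisan commands are shown as comments, and the scheduler entry covers the automatic 12-hour refresh:

<?php
// php artisan make:job RefreshStoreProducts
// php artisan queue:table && php artisan migrate   (for the database queue driver)

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class RefreshStoreProducts implements ShouldQueue
{
    use Queueable, InteractsWithQueue, SerializesModels;

    protected $sellerId;

    public function __construct($sellerId)
    {
        $this->sellerId = $sellerId;
    }

    public function handle()
    {
        // Pull the listings from MWS and update the local products table here.
    }
}

// Manual refresh, dispatched from a controller when the seller clicks the link:
//     dispatch(new \App\Jobs\RefreshStoreProducts($sellerId));
//
// Automatic refresh, in app/Console/Kernel.php (requires the usual
// "* * * * * php artisan schedule:run" cron entry):
//     $schedule->call(function () {
//         dispatch(new \App\Jobs\RefreshStoreProducts($sellerId));
//     })->twiceDaily(0, 12);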
Create the event you want with artisan and, on your model, listen for the updated event; when the update is complete, the event will fire.
Create the notification with an artisan command and, in the event listener, send the notification. You can customize what the notification contains.
You will need to broadcast this notification, and for that you will need to create an account with Pusher and broadcast the event.
The Laravel documentation covers it all, but it is difficult to know where to start.
To broadcast with Pusher, install Pusher and Laravel Echo; then your event broadcasts to a channel, which gets registered with your web routes (channels). There are some other settings and configs. A tip for testing the broadcast and getting something back on your front end: broadcast to a public channel rather than a private channel first, since it is a bit easier to set up; once that works, do whatever you want from there.
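A minimal sketch of such a notification (names are illustrative), assuming Pusher is configured as the broadcast driver; the event listener would then call $user->notify(new ProductsRefreshed()):

<?php
// php artisan make:notification ProductsRefreshed

namespace App\Notifications;

use Illuminate\Notifications\Messages\BroadcastMessage;
use Illuminate\Notifications\Notification;

class ProductsRefreshed extends Notification
{
    public function via($notifiable)
    {
        // 'broadcast' pushes it through the broadcast driver (e.g. Pusher);
        // 'database' keeps a copy for the in-app notification list.
        return ['broadcast', 'database'];
    }

    public function toArray($notifiable)
    {
        return ['message' => 'Your product list has been refreshed.'];
    }

    public function toBroadcast($notifiable)
    {
        return new BroadcastMessage($this->toArray($notifiable));
    }
}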
Hope it helps.
#gustav1105 from laracasts

Is it possible to run queues synchronously with Laravel

I am trying to set up an API system that synchronously communicates with a number of workers in Laravel. I use Laravel 5.4 and would like to use its built-in functionality wherever possible, without too many plugins.
What I had in mind are two servers. The first one with a Laravel instance – let’s call it APP – receiving and answering requests from and to a user. The second one runs different workers, each a Laravel instance. This is how I see the workflow:
APP receives a request from user
APP puts request on a queue
Workers look for jobs on the queue and eventually finds one.
Worker resolves job
Worker responds to APP, OR APP somehow finds out that the job is resolved
APP sends response to user
My first idea was to work with queues and beanstalkd. The problem is that this all seems to work asynchronously. Is there a way for the APP to wait for the result of one of the workers?
After some more research I stumbled upon Guzzle. Would this be a way to go?
EDIT: Some extra info on the project.
I am talking about a Restful API. E.g. a user sends a request in the form of "https://our.domain/article/1" and their API token in the header. What the user receives is a JSON formatted string like {"id":1,"name":"article_name",etc.}
The reason for using two sides is twofold. On the one hand, there is the use of different workers. On the other hand, we want all the logic of the API to be as secure as possible: if a hack attack is made, only the APP side would be compromised.
Perhaps I am making things all too difficult with the queues and all that? If you have a better approach to meet the same ends, that would of course also help.
I know your question was about how you could run this synchronously, but I think the problem you are facing is that you are not able to update the first server after the worker is done. The way you can achieve this is with broadcasting.
I have done something similar with uploads in our application. We use a Redis queue, but beanstalkd will do the same job. On top of that we use Pusher, which uses sockets that the user can subscribe to, and it looks great.
User loads the web app, connecting to the pusher server
User uploads file (at this point you could show something to tell the user that the file is processing)
Worker sees that there is a file
Worker processes file
Worker triggers an event when done, or on failure
This event is broadcast to the Pusher server
Since the user is listening to the Pusher server, the event is received via JavaScript
You can now show a popup or update the table with JavaScript (this works even if the user has navigated away)
We used Pusher for this, but you could use Redis, beanstalkd and many other solutions to do it. Read about Event Broadcasting in the Laravel documentation.
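A minimal sketch of the event fired in step 5 (channel and property names are illustrative), assuming Pusher or another broadcast driver is configured; because it implements ShouldBroadcast, Laravel pushes it to the socket server, where the subscribed frontend picks it up:

<?php
namespace App\Events;

use Illuminate\Broadcasting\InteractsWithSockets;
use Illuminate\Broadcasting\PrivateChannel;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;
use Illuminate\Queue\SerializesModels;

class FileProcessed implements ShouldBroadcast
{
    use InteractsWithSockets, SerializesModels;

    public $userId;
    public $result;

    public function __construct($userId, array $result)
    {
        $this->userId = $userId;
        $this->result = $result;
    }

    public function broadcastOn()
    {
        // The frontend subscribes to this channel (e.g. via Laravel Echo)
        // and updates the page as soon as the event arrives.
        return new PrivateChannel('uploads.' . $this->userId);
    }
}

// In the worker, once processing has finished (or failed):
//     event(new \App\Events\FileProcessed($userId, ['status' => 'done']));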
