I need to run a specific script/application once a certain button is pressed in my view. Since this process may take some time (30 seconds and more), I'd like to inform the user of the progress and whether it was successful or not.
What I have in mind is a simple text/label and a progress bar which are
'somehow' fed with the progress data provided by my Laravel application (the controller, to be more precise).
However, I have no clue where to start or what the best practice is for a case like this.
I suggest you take a look at queues in Laravel: https://laravel.com/docs/5.5/queues
For the progress part: at different points in a job you can dispatch an event. If you want to display the progress in your frontend, you should use something with push notifications. Laravel also has something for that: https://laravel.com/docs/5.5/broadcasting
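For illustration, here is a rough sketch of the broadcasting side, assuming made-up names (a ScriptProgress event on a script.progress channel); the details depend on your broadcast driver setup:

<?php
// Hypothetical event class; fire it from the queued job at whatever
// milestones make sense, e.g. event(new ScriptProgress(40, 'Crunching…'));

namespace App\Events;

use Illuminate\Broadcasting\Channel;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;

class ScriptProgress implements ShouldBroadcast
{
    public $percent;
    public $message;

    public function __construct($percent, $message)
    {
        $this->percent = $percent;
        $this->message = $message;
    }

    public function broadcastOn()
    {
        // Public channel the frontend listens on
        return new Channel('script.progress');
    }
}

The frontend (Laravel Echo, for example) then listens on that channel and updates the label and progress bar.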
Related
I've been looking through Laravel's queue and scheduler and I'm not sure whether they're what I need to do what I want. I'll try to explain it simply.
First, I hit submit (a basic form), which creates a DB row with, say, a number plus a created_at and a finished_at. JS then creates a timer on the page that counts down (simple math) from the created_at to the finished time.
I can do all that fine; what I'm struggling to get my head around is how I then change that number value from 0 to 1, say, after 10 minutes or whatever time I want to specify. I'm not sure how to go about this. This sort of stuff is new to me.
Any help/pointing me in the right direction would be great! Explanations too. :)
Edit: To add, I looked at things like socket.io, but I'm not sure if that's what I want either, and whether I can even use it with Laravel, since it's a library built on Node.js.
Is the backend actually doing any work besides storage? If not I would skip the complexity of trying to open a web socket to tell the frontend the task is finished.
If all you're trying to accomplish is display something on the frontend to the user after an interval of time I would just use JS 100%.
However if the backend does need to do work and trigger the display change on the frontend you will need to open up a web socket.
The constraint with Laravel's scheduler is that scheduled tasks run at predefined times. So if these actions are triggered by the user and not at a set time, skip the scheduler. Instead, use Events to broadcast something to Node.
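For the "flip the value after 10 minutes" part specifically, a rough sketch of one option is a delayed queued job that broadcasts when it fires. FlipFlag, TimerFinished and the timers table below are all made-up names:

<?php

namespace App\Jobs;

use App\Events\TimerFinished;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class FlipFlag implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $rowId;

    public function __construct($rowId)
    {
        $this->rowId = $rowId;
    }

    public function handle()
    {
        // Flip the value once the delay has elapsed...
        DB::table('timers')->where('id', $this->rowId)->update(['number' => 1]);

        // ...and push the change to the browser
        // (TimerFinished is a made-up event implementing ShouldBroadcast).
        event(new TimerFinished($this->rowId));
    }
}

// Dispatch from the controller, with whatever delay you want:
// FlipFlag::dispatch($row->id)->delay(now()->addMinutes(10));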
Hi there, I'd like to make a recurring event on a calendar basis. The thing I want to ask is this: how do I trigger the event? If the client doesn't log in or open the web app to trigger the event in the constructor, how do I make the code run anyway?
I know cron jobs may be the solution, but is there any other solution? (I want to avoid cron jobs.)
You have to have something running in order for this to work; a cron is ideal for this, as it can run a script at regular intervals to check for things like events.
Without that, you are left by this being triggered by user action, but like you said, that user might not always log in to trigger it.
You could modify your code so that any user's login triggers the actions for all users. It's a little more complex, as you'd need to allow for multiple users logging in at the same time when you only want one of them to trigger things. I would set up a table in your database to keep track of the last time things were triggered. When a user logs in and the gap between then and now is large enough, add a new entry to the table with the current time (to prevent another user triggering it as well) and run the bits you need.
Like I said though, this won't be a perfect solution, and I really would recommend using cron. Is there any reason why you want to avoid using cron?
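A minimal sketch of that "last triggered" table check, assuming a trigger_log table with a ran_at DATETIME column and plain PDO; run_recurring_tasks() is a hypothetical stand-in for your actual work:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$interval = 24 * 3600; // run at most once a day

$pdo->beginTransaction();

// Lock the latest row so two users logging in at once can't both trigger the work.
$stmt = $pdo->query('SELECT ran_at FROM trigger_log ORDER BY ran_at DESC LIMIT 1 FOR UPDATE');
$lastRun = $stmt->fetchColumn();

if (!$lastRun || strtotime($lastRun) <= time() - $interval) {
    $pdo->prepare('INSERT INTO trigger_log (ran_at) VALUES (NOW())')->execute();
    $pdo->commit();           // release the lock before doing the slow work
    run_recurring_tasks();    // hypothetical: whatever needs doing
} else {
    $pdo->commit();
}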
If the job is meant to keep the server in a good state, it should be run via a cron job. If it's database maintenance, the database manager itself can usually schedule it (MySQL, for example, has an event scheduler). If it's only needed for the user, it's safe to simply defer the work until the user logs in. You can also trigger the event manually, perhaps from an admin page.
I suspect this question will seem too... silly, but I'm trying to get my head around a nice solution and I'm kinda stuck.
So, here's my situation:
I'm using Ajax to perform a series of tasks. Actually, I'm queuing (or sometimes running in parallel; it doesn't really matter to me) one or more requests.
Show the progress as a percentage (1 out of X tasks performed)
When finished, show the final result.
What I'm trying to do:
Instead of having 3-4 different tasks running (= 3-4 different PHP scripts called asynchronously via Ajax), I would like to have just 1 (= 1 script) - in other words, combine the X scripts into one. (That's easy).
Issues I'm facing:
How could I still report the percentage complete (1 out of X tasks)?
Any ideas?
I would update a key in Memcached each time a task completes, and then have your Ajax call simply read the value of that Memcached key.
http://php.net/manual/en/book.memcached.php
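Something like this, as a rough sketch (the key name and the placeholder task loop are made up):

<?php
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

// --- the combined long-running script ---
$tasks = ['import', 'resize', 'index', 'report'];    // placeholder task list
foreach ($tasks as $i => $task) {
    sleep(2);                                        // stand-in for the real work
    $percent = (int) round(($i + 1) / count($tasks) * 100);
    $memcached->set('job_1234_progress', $percent, 3600);
}

// --- the tiny endpoint your Ajax polls every second or two ---
header('Content-Type: application/json');
echo json_encode(['percent' => (int) $memcached->get('job_1234_progress')]);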
If your original Ajax call drops the request into a database and then kicks off the script, you will still have time (assuming the request takes a while to complete) for subsequent tasks to be dropped into the same database and picked up by the script that is still running.
As for reporting it, perhaps run a quick jQuery call to see how many items are in the queue. Alternatively, have the task.php file update the database (possibly even another table) with how many jobs it has completed and how many are still in the queue.
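A rough sketch of what that reporting endpoint could look like, assuming a tasks table with batch_id and completed columns (both made up):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

$stmt = $pdo->prepare(
    'SELECT COUNT(*) AS total,
            SUM(completed = 1) AS done
     FROM tasks WHERE batch_id = ?'
);
$stmt->execute([$_GET['batch_id']]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode([
    'total'   => (int) $row['total'],
    'done'    => (int) $row['done'],
    'percent' => $row['total'] ? (int) round($row['done'] / $row['total'] * 100) : 0,
]);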
If you don't need to send much data to the PHP script, you can use a "long poll" approach. With this, you don't use AJAX but insert a script tag like this:
<script src="my_php_script?task1=x&param_t1_1=42&task2=y"></script>
The PHP file can then send back a JavaScript command like
updatePercent(12);
after each task is done. The commands are executed by the browser as they come in. Be sure to call flush() after every task.
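A rough sketch of such a script; updatePercent() is from the example above, while showFinalResult() and the task list are hypothetical:

<?php
// Flush any PHP output buffers so each chunk reaches the browser immediately.
while (ob_get_level() > 0) {
    ob_end_flush();
}

$tasks = ['task1', 'task2', 'task3'];         // placeholder work units
foreach ($tasks as $i => $task) {
    sleep(3);                                  // stand-in for the real task
    $percent = (int) round(($i + 1) / count($tasks) * 100);
    echo '<script>updatePercent(' . $percent . ');</script>';
    echo str_repeat(' ', 1024);                // padding: some browsers buffer small chunks
    flush();
}

echo '<script>showFinalResult();</script>';    // hypothetical finishing call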
Looking into Comet may give you other ideas on how to handle the Client-Server connection.
You can manage a queue: before sending the AJAX request, put the task in a queue (it could be an object or an array). Run the AJAX asynchronously with a complete callback that removes the job from the queue when it's done.
You can update the progress together with removing the job from the queue, or handle it separately using setTimeout(), checking how many tasks are in the queue and how many were put in it in total: % = (submitted_tasks - items_in_queue) / submitted_tasks * 100
I am putting together an interface for our employees to upload a list of products for which they need industry stats (they currently do them manually, one at a time).
Each product will then be served up to our stats engine via a web service API.
My API will be doing the replying; the stats engine will be requesting the "next victim" from it.
Each list the users upload will have between 50 and 1000 products, and will be its own queue.
For now, queues/lists will likely be added (and removed on completion) approximately 10-20 times per day.
If successful, traffic will probably rev up after a few months to something like 700-900 lists per day.
We're just planning to go with a simple round-robin approach to direct the traffic evenly across queues.
The multiplexer would grab the top item off of List A, then List B, then List C and so on until looping back around to List A again ... keeping in mind that lists/queues can be added/removed at any time.
The issue I'm facing is just conceptualizing the management of this.
I thought about storing each queue as a flat file and managing the rotation via relational DB (MySQL). Thought about doing it the reverse. Thought about going either completely flat-file or completely relational DB ... bottom line, I'm flexible.
Regardless, my brain is just vapor locking when I try to statelessly meld a variable list of participants with a circular rotation (I just got back from a quick holiday, and I don't think my brain's made it home yet ;)
Has anyone done something like this?
How did you handle it?
What would you improve if you had to do it again?
Any & all tips/suggestions/advice are welcome.
NOTE: Since each request from our stats engine/tool will be separated by many seconds, if not a couple of minutes, I need to keep this stateless.
List data should be stored in a database, for sure. Your PHP side should have a view giving the status of the system, and the form to add lists.
Since each request becomes its own queue, and all the request-queues are equal in priority, the ideal number of tables is probably three: one listing the requests, their priority relative to one another (to determine who goes next in the round-robin), and their processing status; another listing the contents (list items) of each request that are yet to be processed; and a third listing the items from each queue that have already been processed.
You will also need a script that does the actual processing, that is not driven by a user request, but instead by a system-scheduled job that executes periodically (throttled to whatever you desire). This can of course also be in PHP. This is where you would set up your 10-at-a-time list checks and updates.
The processing would be something like:
Select the next set of at most 10 items from the highest-priority queue.
Process them, updating their DB status as they complete.
Update the priority of the above queue so that it is now the lowest priority.
And if new queues are added, they would be added with lowest priority.
Priority could be represented with an integer.
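As a rough sketch of one pass of that worker, with table and column names assumed and "lowest integer = next up" as one possible encoding of the priority:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'secret');

// 1. Pick the queue whose turn it is next (lowest integer = next up).
$queue = $pdo->query(
    'SELECT id, priority FROM queues WHERE status = "open"
     ORDER BY priority ASC LIMIT 1'
)->fetch(PDO::FETCH_ASSOC);

if ($queue) {
    // 2. Take at most 10 unprocessed items from that queue.
    $items = $pdo->prepare(
        'SELECT id, product FROM queue_items
         WHERE queue_id = ? AND processed = 0 LIMIT 10'
    );
    $items->execute([$queue['id']]);

    foreach ($items->fetchAll(PDO::FETCH_ASSOC) as $item) {
        // ... hand the product to the stats engine here ...
        $pdo->prepare('UPDATE queue_items SET processed = 1 WHERE id = ?')
            ->execute([$item['id']]);
    }

    // 3. Send this queue to the back of the rotation.
    $max = (int) $pdo->query('SELECT MAX(priority) FROM queues')->fetchColumn();
    $pdo->prepare('UPDATE queues SET priority = ? WHERE id = ?')
        ->execute([$max + 1, $queue['id']]);
}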
Your users would need to wait patiently for their list to be processed and then view or download the result. You might set up an auto-refresh script for this on your view page.
It sounds like you're trying to implement something that Gearman already does very well. For each upload / request, you can simply send off a job to the Gearman server to be queued.
Gearman can be configured to be persistent (just in case things go to hell), which should eliminate the need for you logging requests in a relational database.
Then, you can start as many workers as you'd like. I know you suggest running all jobs serially, which you can still do, but you can also parallelize the work, so that your user isn't sitting around quite as long as they would've been if all jobs had been processed in a serial fashion.
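A minimal sketch with the PHP Gearman extension; the function name process_list and the payload shape are made up:

<?php
// --- where the upload is handled: queue a background job ---
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('process_list', json_encode(['list_id' => 42]));

// --- worker.php: run as many of these as you like, in parallel ---
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('process_list', function (GearmanJob $job) {
    $payload = json_decode($job->workload(), true);
    // ... fetch the list's products and push them through the stats engine ...
});

while ($worker->work()) {
    // loop forever, picking up jobs as they arrive
}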
After a good night's sleep, I now have my wits about me (I hope :).
A simple solution is a flat file for the priorities.
Have a text file simply with one List/Queue ID on each line.
Feed from one end of the list, and add to the other ... simple.
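A rough sketch of that rotation; queues.txt is an assumed filename, and the lock is there because two stats-engine requests could arrive close together:

<?php
$file = __DIR__ . '/queues.txt';

$fp = fopen($file, 'c+');
flock($fp, LOCK_EX);                       // guard against concurrent requests

$ids = array_filter(array_map('trim', file($file)));
$next = array_shift($ids);                 // feed from one end...
if ($next !== null) {
    $ids[] = $next;                        // ...add to the other
}

ftruncate($fp, 0);
rewind($fp);
fwrite($fp, $ids ? implode("\n", $ids) . "\n" : '');

flock($fp, LOCK_UN);
fclose($fp);

echo $next;                                // the queue/list whose turn it is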
Criticisms are welcome ;o)
Thanks @Trylobot and @Chris_Henry for the feedback.
I am trying to create a PBBG (persistent browser based game) like that of OGame, Space4k, and others.
My problem is with the always-updating resource collection and with the building times, as in a time is set when the building, ship, research, etc. completes building and updates the user's profile even if the user is offline. What and/or where should I learn in order to make this? Should it be a constantly running script in the background?
Note that I wish to only use PHP, HTML, CSS, JavaScript, and MySQL, but I will learn something new if needed.
Cron jobs (or the Windows equivalent) seem to be the way, but that doesn't seem right or optimal to me.
Do you have to query your DB for many users' properties, like "show me all users who already have a ship of the galaxy class"?
If you do not need this, you could just check the build queue when someone requests the profile.
If that is not an option, you could add a "finished_at" column to your database and include "WHERE finished_at >= SYSDATE()" in your query. In that case all resources (future and present) are in the same table.
Always keep in mind: what use is there to having "live" data if no one is requesting it?
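For the resource side, a rough sketch of that lazy, on-request update; column names like metal, metal_rate and resources_updated_at are assumptions:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'secret');
$userId = 1; // whoever's profile is being viewed

$stmt = $pdo->prepare('SELECT metal, metal_rate, resources_updated_at FROM users WHERE id = ?');
$stmt->execute([$userId]);
$user = $stmt->fetch(PDO::FETCH_ASSOC);

// Credit everything produced since the last time anyone looked.
$elapsed = time() - strtotime($user['resources_updated_at']);
$metal   = $user['metal'] + $elapsed * $user['metal_rate'];

$pdo->prepare('UPDATE users SET metal = ?, resources_updated_at = NOW() WHERE id = ?')
    ->execute([$metal, $userId]);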
"My problem is with the always-updating resource collection and with the building times, as in a time is set when the building, ship, research, etc. completes building and updates the user's profile even if the user is offline"
I think the best way to do this is to install a message queue (you will need to install/compile it yourself), such as beanstalkd, to do the offline processing. Let's say it takes 30 seconds to build a ship. With a pheanstalk client (I like pheanstalk), you first put a message onto the queue using:
$pheanstalk->put($data, $pri, $delay, $ttr);
See the beanstalkd protocol for the meaning of all the arguments.
The key is $delay = 30: when a worker process does a reserve(), it can only pick up the message after 30 seconds have passed.
$job = $pheanstalk->reserve();
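Putting the two together, a rough sketch with a Pheanstalk 3-style client; the tube name and payload are made up, and Composer autoloading is assumed:

<?php
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

$pheanstalk = new Pheanstalk('127.0.0.1');

// Producer: queue the "ship finished" message with a 30 second delay.
$pheanstalk->useTube('buildings')->put(
    json_encode(['user_id' => 7, 'type' => 'ship']),
    1024,   // priority
    30,     // delay in seconds: not reservable until the build time is up
    60      // ttr: seconds a worker may hold the job before it is released
);

// Worker (separate long-running process): apply the finished build.
$job  = $pheanstalk->watch('buildings')->ignore('default')->reserve();
$data = json_decode($job->getData(), true);
// ... update the user's profile in MySQL here ...
$pheanstalk->delete($job);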
Streaming data to user in real-time
You could also look into XMPP over BOSH to stream the new data to all users in real time.
http://www.ibm.com/developerworks/xml/tutorials/x-realtimeXMPPtut/index.html
http://abhinavsingh.com/blog/2010/08/php-code-setup-and-demo-of-jaxl-boshchat-application/