I have a web app created in Laravel that takes credit card payments.
Every day a scheduled command that I created runs to take "today's" payments (basically, it submits one HTTP request per pending payment to the payment gateway).
Now I need to allow triggering this payment submission process via a button in a dashboard.
The command takes an unpredictably long time to run (depending on the number of payments to process), so calling it directly from the controller is, I think, not an option.
I'm thinking of just refactoring it: moving all the command code into a "middleman" class so I can call that class from both the command and the controller.
PaymentsSubmissionHelper::submit()
PaymentsSubmissionCommand: PaymentsSubmissionHelper::submit()
PaymentsSubmissionController: PaymentsSubmissionHelper::submit()
However, the command shows a progress bar with the estimated processing time, and I will need to show a progress bar in the HTML interface as well. In the web interface I will need to make AJAX requests to the server to get the current progress, but in the command this progress is tracked in a completely different way, using:
$bar = $this->output->createProgressBar($totalPayments);
$bar->setFormat(' %current%/%max% [%bar%] %percent:3s%% %elapsed:6s%/%estimated:-6s% %message%');
and for each processed payment:
$bar->advance();
How can I keep track of the progress in both the command and the controller?
Any help will be appreciated.
Thanks in advance!
As already pointed out in another answer, Laravel's queued event listeners are the way to handle long-running processes triggered from the front end. You shouldn't need to refactor your console command at all.
As for showing progress on the front end, one simple solution would be to set up some AJAX polling. Every few seconds, have AJAX fire off a request to a controller method which looks at today's payments, calculates how many have been processed (presumably you have some kind of status field which shows whether or not the running job has handled each one yet), and returns a number representing the percentage that are done. The AJAX success handler would then update your progress tracker on the page.
// Check status every 2s
var timer = setInterval(function() {
    pollStatus();
}, 2000);

var pollStatus = function() {
    $.ajax({
        url: 'somewhere/jobStatus',
        success: function(resp) {
            $('#progress').html(resp + '%');
            if (parseInt(resp, 10) === 100) {
                // We've reached 100%, no need to keep polling now
                clearInterval(timer);
            }
        }
    });
};
It might be wise to somehow make sure polls don't overrun, and maybe you'd want to tweak the frequency of polling.
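One way to keep polls from overrunning is to schedule the next poll only after the current one completes, rather than firing on a fixed setInterval regardless of whether the last request has finished. A minimal sketch of that idea (the `fetchStatus` function, callback shape and names are hypothetical, not from the answer above):

```javascript
// Overlap-safe polling: the next poll is scheduled only after the
// current request has completed, so slow responses can't pile up.
// fetchStatus is a placeholder: it should call back with the current
// percentage (e.g. from an AJAX request to 'somewhere/jobStatus').
function createPoller(fetchStatus, onProgress, delayMs) {
  var stopped = false;

  function poll() {
    if (stopped) return;
    fetchStatus(function (percent) {
      onProgress(percent);
      if (percent >= 100) {
        stopped = true;                // done, stop polling
      } else {
        setTimeout(poll, delayMs);     // reschedule only after completion
      }
    });
  }

  poll();
  return { stop: function () { stopped = true; } };
}
```

Wrapping the AJAX call inside `fetchStatus` keeps the scheduling logic separate, and the returned `stop()` handle lets you cancel polling if the user navigates away.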
I would suggest using queued event listeners in this use case. You would dispatch an event in your controller and have a listener which could trigger the command. By queueing the listener you avoid a long response time. No need to refactor the command itself!
Regarding a progress bar, you could have a static progress bar that updates on page load: read the status from your DB and display it similarly to how Amazon shows how far along your order is at any moment.
For a real-time progress bar, I suggest implementing web sockets. Socket.io seems great.
As you are using a progress bar and advancing it, you will do the same in AJAX, but the progress logic will be different, of course.
The common part in both cases is handling each card payment. So I would say create a separate class or service, e.g. PaymentProcess, which takes a payment instance, processes it, and reports whether it succeeded or failed.
Then in the command you can do (pseudocode):
public function handle()
{
    $pendingPayments = Payment::where('status', 'pending');

    $bar = $this->output->createProgressBar($pendingPayments->count());

    $pendingPayments->chunk(10, function ($payments) use ($bar) {
        $payments->each(function ($payment) use ($bar) {
            $process = (new PaymentProcess($payment))->process();
            $bar->advance();
        });
    });

    $bar->finish();
}
Now, if you trigger this from the front end, the AJAX response should give you an id for the current process, stored somewhere. Then you keep sending further AJAX requests at an interval of, let's say, one second, fetching the current progress until it reaches 100%. (If you are using XMLHttpRequest2, the logic will differ.)
For that you can create another table to store progress entries and keep updating them.
Now, similarly, you can use PaymentProcess inside the controller:
public function processPendingPayments(Request $request)
{
    // Authorize request
    $this->authorize('processPendingPayments', Payment::class);

    $pendingPayments = Payment::where('status', 'pending');

    // Create a progress entry
    $progress = PaymentProgress::create([
        'reference' => str_random(6),
        'total' => $pendingPayments->count(),
        'completed' => 0
    ]);

    $pendingPayments->chunk(10, function ($payments) use ($progress) {
        $payments->each(function ($payment) use ($progress) {
            $process = (new PaymentProcess($payment))->process();

            // Update the progress entry
            $progress->update([
                'completed' => $progress->completed + 1
            ]);
        });
    });

    return response()->json([
        'progress_reference' => $progress->reference
    ], 200);
}
Now, another endpoint to fetch the progress:
public function getProgress(Request $request)
{
    // Authorize request
    $this->authorize('getProgress', Payment::class);

    $request->validate([
        'reference' => 'required|exists:payment_progresses,reference'
    ]);

    $progress = PaymentProgress::where('reference', $request->reference)->first();

    $percentage = $progress->completed / $progress->total * 100;

    return response()->json(compact('percentage'), 200);
}
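On the front end, the `percentage` this endpoint returns can drive the progress bar. A small helper like the following (a sketch; element ids and field names are up to you) keeps the rendering logic in one place:

```javascript
// Normalises the raw percentage from the progress endpoint so it is
// safe to render: clamp to 0-100, round, and flag completion so the
// caller knows when to stop polling.
function renderProgress(percentage) {
  var pct = Math.min(100, Math.max(0, Math.round(percentage)));
  return {
    label: pct + '%',   // e.g. text for a #progress element
    width: pct + '%',   // e.g. CSS width for a progress bar fill
    done: pct >= 100    // when true, stop the polling timer
  };
}
```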
What is the best way to keep checking a condition until it is true? I know there is a while statement in PHP, but I was not sure if there is a better packaged way in Laravel.
Basically I am transcoding a video through AWS. I want the frontend to keep saying "Uploading" until I know the video is transcoded and saved in AWS and all info is in the database. The videos will be short, but still transcoding is not instant so if I do:
if ($job['Status'] == 'complete') {
$submission = new Submission();
$submission->email = $request->input('email');
$submission->original = config('filesystems.disks.s3.url') . $original_key;
$submission->save();
return response()->json([
'submission' => $submission,
'message' => 'Upload Successful. Good luck!!!!',
'job' => $job
]);
}
This if statement will be false right away, but if I checked again every few seconds it would pass after a little while. Is there a pulse-type function to keep running that if statement every X amount of time until it passes? Using a while loop seems to hit the max_execution_time limit.
Does this have to be on the PHP side?
If JS works, I suppose one way would be to have a boolean state in your JS that shows "Uploading", with setTimeout() checking whether the video is uploaded every x seconds. When the check passes you can set the state to false, which would make "Uploading" disappear.
Also, don't forget to stop the timer once it passes.
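One refinement of the setTimeout approach is to back off between checks, so a slow transcode doesn't get hammered with requests. A sketch (the `checkFn` callback and the delay values are illustrative assumptions, not from the answer):

```javascript
// Exponential backoff: 2s, 4s, 8s, ... capped at 30s between checks.
function nextDelay(attempt, baseMs, maxMs) {
  return Math.min(baseMs * Math.pow(2, attempt), maxMs);
}

// Polls checkFn until it reports completion; checkFn is a placeholder
// for an AJAX call that calls back with true once the job is done.
function pollUntilDone(checkFn, onDone, attempt) {
  attempt = attempt || 0;
  checkFn(function (isComplete) {
    if (isComplete) {
      onDone();
    } else {
      setTimeout(function () {
        pollUntilDone(checkFn, onDone, attempt + 1);
      }, nextDelay(attempt, 2000, 30000));
    }
  });
}
```

Because each check schedules the next one itself, there is no lingering timer to clean up once the upload completes.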
In my opinion, there are two possible solutions:
1) If I were you, I'd prefer to handle it on the frontend side with AJAX. This is pseudocode to explain my idea:
function uploadMyVideo() { // to be called once we confirm the upload
    var jqxhr = $.ajax("/upload/url")
        .done(function() {
            // call another ajax to set the status = SUCCESS
        })
        .fail(function() {
            // call another ajax to set the status = FAIL
        })
        .always(function() {
            // in case you need it
        });
}
2) You can use the Laravel scheduler to periodically (e.g. every minute) call a job which checks for the completed status and runs the logic inside it. (It's a cron job, the Laravel way.)
I have this ajax request to update my db.
function ajax_submit() {
    var submit_val = $("#stato").serialize();
    var dest = "plan/new_bp1.php";
    $.ajax({
        type: "POST",
        url: dest,
        data: submit_val,
        success: function(data) {
            var data1 = data.split("|");
            if (data1[0] == "Successo") {
                $("#spnmsg").fadeTo(200, 0.1, function() {
                    $(this).removeClass().addClass("spn_success").html(data1[1]).fadeTo(900, 1);
                });
            } else if (data1[0] == "Errore") {
                $("#spnmsg").fadeTo(200, 0.1, function() {
                    $(this).removeClass().addClass("spn_error").html(data1[1]).fadeTo(900, 1);
                });
            }
        },
        complete: function() {
            setTimeout(function() { $('.container').load('plan/home.php'); }, 2000);
        }
    });
}
The called script will take a long time to run since it has to select, elaborate and insert around 4,000 records each time. What I do now is show a spinner on the screen to give users feedback that the request is working (triggered by ajaxStart and ajaxStop).
At the end of the call, the complete function will display what the PHP script echoes.
What I'd like to do is have a counter that updates during the script's execution, saying something like "X records out of 4,000 processed".
On the script side I have no problem to calculate the number of processed records and the number of record overall.
How can I update my script to show the progress?
You have a couple of options.
1) Right when the request starts, you can start polling a different endpoint that just serves the number of records that have been processed, as @adeneo suggested. You don't have to write to a file every time a record is processed; you can store the count in memory and ensure that the route handler for the progress requests has access to that same memory.
2) Implement a websocket endpoint on your server that pushes out the number of processed records. On the server you call the websocket library code to push out the progress; on the JavaScript side you create a websocket connection, which is trivial, and listen for those messages.
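For the websocket option, the browser side can stay very small. A sketch, assuming the server pushes JSON like {"processed": 1200, "total": 4000} (that message shape, the URL and the element id are assumptions, not from the answer):

```javascript
// Formats a progress push message for display.
function formatProgressMessage(raw) {
  var msg = JSON.parse(raw);
  return msg.processed + ' records out of ' + msg.total + ' processed';
}

// Browser-only wiring, guarded so the helper above is usable anywhere.
if (typeof document !== 'undefined' && typeof WebSocket !== 'undefined') {
  var ws = new WebSocket('wss://example.test/progress'); // hypothetical endpoint
  ws.onmessage = function (event) {
    document.getElementById('progress').textContent =
      formatProgressMessage(event.data);
  };
}
```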
I want to set up a long-polling AJAX call to check for orders in my e-commerce web app. One specificity of this application is that customers are able to place orders in the future. As such, in the admin panel we have past orders and future orders (which can be 2 months or 20 minutes in the future).
Basically, I want the admin user in the back end to be warned as soon as a future order comes due (its future date reaches the current time). To do so, as soon as an admin connects to the admin panel, they make an AJAX call to the server to check for future orders coming due. This is a long-polling request: the call waits for the server to deliver a result, and if the server has nothing to offer, the request stays pending until there is an order to show.
Ajax request
(function poll() {
    setTimeout(function() {
        $.ajax({
            url: '{{ path('commande_check') }}',
            method: 'post',
            success: function(r) {
                if (r.ids) alert('New order!'); // I've simplified this part of the code to keep it clean; admins are actually warned through a Node.JS server
            },
            error: function() {},
            complete: poll
        });
    }, 5000);
})();
{{ path('commande_check') }} (edited from Edit2)
public function checkAction(Request $request)
{
    if ($request->isXmlHttpRequest()) {
        $response = new Response();
        $em = $this->getDoctrine()->getManager();
        $ids = array();

        while (!$ids) {
            $ids = $em->getRepository('PaymentBundle:Commande')->findNewestOrders(new \DateTime());
            if ($ids) {
                break;
            } else {
                time_sleep_until(time() + self::SECONDS_TO_SLEEP);
            }
        }

        if ($ids) {
            return new JsonResponse(array(
                'ids' => $ids
            ));
        }

        $response->setStatusCode(404);
        return $response;
    }

    $response = new Response();
    $response->setStatusCode(405);
    return $response;
}
findNewestOrders() method
public function findNewestOrders(\DateTime $datetime)
{
    // Note: in Doctrine's QueryBuilder, where() replaces any previously
    // added conditions, so it must come before the andWhere() calls.
    $query = $this->createQueryBuilder('c')
        ->select('c.id')
        ->leftJoin('Kt\PaymentBundle\Entity\Paiement', 'p', \Doctrine\ORM\Query\Expr\Join::WITH, 'p.id = c.paiement')
        ->where("DATE_FORMAT(c.date, '%Y-%m-%d %H:%i') = :date")
        ->setParameter('date', $datetime->format('Y-m-d H:i'))
        ->andWhere('p.etat = 0')
        ->andWhere('c.kbis IS NULL')
        ->andWhere('c.notif = 0')
        ->getQuery();

    return $query->getArrayResult();
}
My problem is that the alert sometimes never gets shown even though the record in the DB gets updated. The weirdest thing is that this sometimes happens even when I've left the page making the AJAX call, as if it keeps running in the background. I think the problem comes from the time_sleep_until() function. I tried sleep(self::SECONDS_TO_SLEEP) but the problem was the same.
Any help would by gladly appreciated. Thanks!
Edit 1
I sense it has something to do with the connection_status() function, as the while loop appears to continue even after the user has switched pages, causing the notif field to be updated in the background.
Edit 2
As per my answer, I've managed to work around this situation, but the underlying problem remains. The admin does get the notification properly. However, I know the AJAX call keeps going once the request has been made.
My problem is now: could this result in a server resource overload?
I'm willing to start a bounty on this one as I'm eager to know the best way to achieve what I want.
I think I got it all wrong.
The intent of long-polling AJAX is not that there is only one connection that stays open, as with websockets (as I thought it was). One still has to make several requests, but far fewer than with regular polling.
Regular polling
The idea of AJAX regular polling is that you make a request to the server every 2 or 3 seconds to get a semblance of real-time notification. This results in many AJAX calls per minute.
Long polling
As the server waits for new data before responding to the browser, one needs to make only a minimal number of requests. Since I'm checking the database for new orders every minute, long polling would let me lower the number of requests to one per minute.
In my case
Consequently, the specifics of the application make AJAX long polling unnecessary. Once a MySQL query has run for a given minute, there is no need for it to run again within the same minute. That means I can do regular polling with an interval of 60000 ms. There is also no need for sleep() or time_sleep_until().
Here's how I ended up doing it:
JS polling function
(function poll() {
    $.ajax({
        url: '{{ path('commande_check') }}',
        method: 'post',
        success: function(r) {
            if (r.ids)
                alert('New orders');
        },
        error: function() {},
        complete: function() {
            setTimeout(poll, 60000);
        }
    });
})();
{{ path('commande_check') }}
public function checkAction(Request $request)
{
    if ($request->isXmlHttpRequest()) {
        $em = $this->getDoctrine()->getManager();
        $ids = $em->getRepository('PaymentBundle:Commande')->findNewestOrders(new \DateTime());

        if ($ids) {
            return new JsonResponse(array(
                'ids' => $ids
            ));
        }

        $response = new Response();
        $response->setStatusCode(404);
        return $response;
    }

    $response = new Response();
    $response->setStatusCode(405);
    return $response;
}
As such, I end up with one request per minute that checks for new orders.
Thanks to @SteveChilds, @CayceK and @KevinB for their kind advice.
In general this problem is kind of rough to answer, as we don't have a lot of information about what your other functions, like findNewestOrders, actually do.
We can assume it pulls all new orders that have yet to be fulfilled by the admin and therefore should be displayed. However, if it only looks for orders whose date matches exactly, some may never be found.
Theoretically this loop will run forever if no new order is ever filed. You have no time limit on it, so it's possible the server ends up in a case where the while condition is never false and throws a maximum execution time exceeded error.
As per your comment
time_sleep_until
Returns TRUE on success or FALSE on failure.
The only way it would ever fail is if the function itself failed or some server-side issue caused a failure return. Since you never officially revisit the page, and leaving your AJAX'd page doesn't submit a failure response, it should never really fail.
I think it might be wiser to look into a CRON job for this, with a database table of incomplete orders that you query instead. The CRON can run every minute and populate the table; the load on the server would not be that great, as it would most likely take no more than 30 seconds anyway.
Long polling may be a great idea for many uses, but I'm not wholly confident it is here. I would seriously recommend setInterval, as the load on the server and client from a 30-second call every minute or two is not that great. That is your call in the end.
I personally would check frequently rather than have one request which runs for a long time. It's not really ideal to have long-running processes like this, as they tie up server connections; really, it's just bad practice. Plus the browser may well time the connection out, which is why you may not be seeing the responses you expect.
Have you tried changing the AJAX call so it fires, say, every 60 seconds (or however often you want), checks for new orders since the last time it polled (simply keep track of this in the page, or in HTML5 local storage so it persists across pages, and pass it in the AJAX request as a parameter), and then simply returns an indication of whether there have been new orders?
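The "orders since the last poll" idea can be sketched like this, using HTML5 local storage so the timestamp survives page changes (the URL, parameter name and storage key are made-up names for illustration):

```javascript
// Builds the polling URL with a ?since= parameter (Unix seconds).
function buildCheckUrl(baseUrl, lastCheckedMs) {
  return baseUrl + '?since=' + encodeURIComponent(Math.floor(lastCheckedMs / 1000));
}

// localStorage persists across pages; guarded for non-browser contexts.
function getLastChecked() {
  if (typeof localStorage === 'undefined') return 0;
  return parseInt(localStorage.getItem('ordersLastChecked'), 10) || 0;
}

function setLastChecked(ms) {
  if (typeof localStorage !== 'undefined') {
    localStorage.setItem('ordersLastChecked', String(ms));
  }
}
```

Each poll would call buildCheckUrl(endpoint, getLastChecked()) and then setLastChecked(Date.now()) once the response arrives.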
You can then display a message if there have been new orders.
I have finally managed to overcome this bug, but without digging deeply into the problem.
I have separated the code that updates the notif field from the code that fetches new orders. That way, the while loop still goes on but cannot update the field.
The field is therefore updated on success of the first AJAX call, by making a new AJAX request to update it. Therefore, the admin always receives the notification.
I still have to check, at a memory/thread level, what resources this loop consumes.
As no solution has been found despite my workaround for the initial bug, I won't accept my own answer, as the problem still remains.
Many thanks for all the help on that question.
I want to notify a person of the count of jobs available in a table in my database. The table has a list of 8 jobs and their availability. I've done the count with a PHP query using SELECT COUNT(*) AS jobs... and created this AJAX script which shows the count at an interval.
$(document).ready(function() {
    $.ajaxSetup({ cache: false });
    setInterval(function() {
        $('#divToRefresh').load('notification.php'); //this contains the query
    }, 30000);
});
However, I am not sure how I can make it so that when the user sees the notification alert, they can close it and it doesn't appear again until there's a new available job.
I can't find anything on good ol' Google either.
Your jQuery function is polling data from the server at 30-second intervals. notification.php returns pre-rendered HTML containing (I assume, among other things) the number of jobs available.
Check the number of available jobs and show notification based on that:
var oldNumberOfJobs = 0;

setInterval(function() {
    // load() is asynchronous, so read the jobs count in its callback
    $('#divToRefresh').load('notification.php', function() { //this contains the query
        var newNumberOfJobs = parseInt($('#divWithJobsCount').text(), 10) || 0;
        if (newNumberOfJobs > oldNumberOfJobs) {
            // show notification to the user
        }
        oldNumberOfJobs = newNumberOfJobs;
    });
}, 30000);
To be more specific (showing/hiding notifications), I need to see your HTML.
This solution feels bulky. The whole pre-rendered HTML is being reloaded again and again. A nicer approach would be to only return the number of jobs available and only update that number using jQuery.
Also, it might be a good idea to return the latest job_id as well as the total number of jobs available. This way, you could check if the latest job_id that is already stored in the front end matches the newly received job_id. And only if they don't match you would update the counter and show new notification.
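That job_id comparison might look like this; a small stateful watcher avoids re-alerting on every poll (the names and callback shape are illustrative, not from the question):

```javascript
// Remembers the last job id seen and only notifies when it changes.
function createJobWatcher(notify) {
  var lastSeenJobId = null;
  return function onPoll(latestJobId, jobCount) {
    if (latestJobId !== null && latestJobId !== lastSeenJobId) {
      lastSeenJobId = latestJobId;
      notify(jobCount);   // show the notification
      return true;
    }
    return false;         // nothing new, stay quiet
  };
}
```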
update
Here is a jsfiddle which covers your case. The jobs counter is simulated with the current number of minutes. The function checks every 15 seconds whether the number of minutes has changed. Once it does, an alert notification is shown (but only if the old one was closed).
setInterval in the jsfiddle is written using a mock object so that it is testable in the browser without AJAX requests. In your code, use the following form:
setInterval(function() {
    $.get('notification.php', checkJobsCounter); // returns jobs count in plain text
}, 30000);
Well, you need to make an AJAX call whose response is the number of new notifications. Then check: if num > 0, just do $("#notification").fadeIn(); this targets a by-default hidden div containing the text You have new notifications (or whatever) and a close button.
$(document).ready(function() {
    setInterval(function() {
        $.get("notification.php", function(data) {
            if (data > 0) {
                $("#div").fadeIn();
            }
        });
    }, 1000);
});
Once the user closes this, you can create a cookie in the user's browser and then, if the cookie is set, skip the AJAX request. This way you don't have to update the DB, and every user will still be able to see the notification.
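A rough sketch of that cookie idea (the cookie name and value format are arbitrary choices, not something the site requires):

```javascript
// Builds a cookie string remembering which job count was dismissed.
function buildDismissCookie(name, jobCount, days) {
  var expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000);
  return name + '=' + jobCount + '; expires=' + expires.toUTCString() + '; path=/';
}

// Reads a cookie value back out of a document.cookie-style string.
function readCookie(name, cookieString) {
  var match = cookieString.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? match[1] : null;
}
```

In the browser you would assign the built string to document.cookie when the user dismisses the alert, and pass document.cookie to readCookie before deciding whether to poll again.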
I have a search textbox where, upon a keypress, an AJAX call is made to return search results for the entered text. This results in an AJAX call for every single keypress.
For example if I type in airport:
I get 7 AJAX requests, searching for a, ai, air, airp, airpo, airpor and airport respectively. The problem is that while they start one after the other, they don't necessarily finish in the same order, so more often than not I receive results out of order: I might have typed airport and received the result for airport, only to receive the result for airpo later on.
How do I handle this in jQuery here?
Update:
There is a timer delay of 3 seconds, but the issue is ensuring that when one AJAX request is made, any subsequently made request cancels out the previous one, and so forth.
How could I do this in code?
Fire the ajax call on a delay - use this in conjunction with the abort() code above:
var typeDelay = function() {
    var timer = 0;
    return function(callback, ms) {
        clearTimeout(timer);
        timer = setTimeout(callback, ms);
    };
}();

$("#searchField").keypress(function() {
    typeDelay(function() {
        // your ajax search here, with a 300 ms delay...
    }, 300);
});
Sending a lookup request on every keystroke is generally a bad idea; I'd suggest instead that you send at short intervals (i.e. send the textbox value every 3 seconds or so).
Further, to prevent out-of-order asynchronous returns, you could keep a flag tracking a request's state, so that only one can be active at any time (e.g. set on send, cleared on return). This would effectively make your lookups synchronous.
EDIT: Added sample code
var req = null;

function sendRequest(text)
{
    // Check for pending request & cancel it
    if (req) req.abort();

    req = $.ajax({
        // various options..
        success: function (data)
        {
            // Process data
            ...

            // reset request
            req = null;
        }
    });
}
You can cancel AJAX requests:
var x = $.ajax({
...
});
x.abort()
So if you want to send a new request before the previous one has returned, abort() the first one.
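abort() covers the request side; for the response side, a "latest request wins" guard makes sure a stale response can never overwrite a newer one, even if the server has already replied before the abort. A sketch (the callback shape is an assumption, not jQuery's API):

```javascript
// Tags each request with a token; responses whose token is no longer
// the current one are silently dropped.
function createLatestOnly(apply) {
  var current = 0;
  return function send(doRequest) {
    var token = ++current;
    doRequest(function (result) {
      if (token === current) apply(result); // ignore stale responses
    });
  };
}
```

With jQuery, doRequest would wrap the $.ajax call and invoke the supplied callback from its success handler.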
Build an "ajax queue" in jquery... use it to ensure your requests go in order. The below address has a great example near the bottom:
AJAX Streamlining techniques?
Extend the functionality of these queue, to handle Delay and Abort mechanics like the answers above.
If another request is submitted to the queue before the delay is over, abort the queued request.
Suggestion: add the functionality of delay and abort mechanics properly to the queue so that the values can be requested when you post a request to the queue, like params, that way your queue plugin remains reusable outside this need.