Getting XMLHttpRequest Progress from PHP Script - php

I am using JavaScript to run an XMLHttpRequest to a PHP script which returns data. Basically I want to be able to provide the user with a progress bar (instead of a spinning circle or something) that shows the progress of getting and receiving the data. I know if I was getting a file, I could just check the Content-Length header and use that, but in the case of a script, you don't know how much data it's retrieving.
The answer to this might be as easy as "it's not possible," because right now it seems that way. So, in short: how do you monitor the progress of a running PHP script over an XMLHttpRequest?

If you're using Firefox (and, I'm fairly sure, most browsers other than IE), then there is indeed a way to report how much data has been transferred during an XHR operation. If the operation in question sends the correct header, it's fairly easy to use this information to calculate the percentage of the data downloaded.
I wrote this code for determining the percentage of data transferred in an XHR operation years ago, so I apologize for it not reflecting the years of coding experience I've gained since. I almost certainly wouldn't write it this way now! Still, I managed to fish it out, and hope it's of use for you.
At the time this was written, IE7 was the latest version of Explorer available, and I remember the code didn't work in that, hence it contains code to prevent it initializing under IE. I've never tried this code out under version 8 or the beta of version 9, and it may indeed work in those versions as well, but I can't vouch for it. If you can get it working in a new version of IE, please let me know!
It works by running code in beforeSend (a callback jQuery provides for code you want to run before starting an ajax request) to set up a JavaScript interval (in the code I've used 50 milliseconds, which is probably far too often; 200 milliseconds should still be plenty and put less strain on the system). Every time the interval timer fires, it runs a function that looks at the responseText attribute of the XHR request. The responseText attribute holds the raw text of the data received thus far. By counting how many characters are in there with the string's length property, we can work out how many bytes have been collected so far.
As far as working out the percentage of the total data to be sent, this requires that your server-side code sends a Content-Length header with an accurate count of how many bytes it is going to send. This will require a little cleverness on your part, but shouldn't prove too difficult (there's a rough sketch of one way to do it after the code below). If you send an accurate Content-Length header, then it is used to calculate the percentage of data received so far. If you don't set one, then the amount of data received so far is displayed instead.
<script type="text/javascript">
var myTrigger = null;   // interval timer handle
var totalBytes = -1;    // total size, read once from the Content-Length header when available

$.ajax ({
    // url, success, etc. go here as usual
    beforeSend : function (thisXHR)
    {
        // IE doesn't support responseText access in the interactive (readyState 3) state
        if (!$.browser.msie)
        {
            myTrigger = setInterval (function ()
            {
                // When there is partial data available, use it to work out how much has been downloaded
                if (thisXHR.readyState > 2)
                {
                    var dlBytes = thisXHR.responseText.length;
                    if (totalBytes == -1)
                        totalBytes = thisXHR.getResponseHeader ('Content-Length');
                    (totalBytes > 0) ?
                        $('#progress').html (Math.round ((dlBytes / totalBytes) * 100) + "%") :
                        $('#progress').html (Math.round (dlBytes / 1024) + "K");
                }
            }, 50); // Check the status every 50 milliseconds
        }
    },
    complete : function ()
    {
        // Kill the download progress polling timer
        if (myTrigger)
            clearInterval (myTrigger);
    }
});
</script>
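On the PHP side, one way to send that accurate Content-Length header is to buffer the whole response and measure it before sending anything. This is a sketch of my own, not part of the original answer; buildReportData() is just a stand-in for whatever generates your data.
<?php
// Sketch: buffer the full response so its exact size is known up front.
// Note: output compression (e.g. zlib.output_compression) must be off,
// or the byte count the browser sees won't match this header.
ob_start();
echo json_encode(buildReportData());   // buildReportData() is a hypothetical function
$body = ob_get_clean();                // grab the buffered output and discard the buffer

header('Content-Length: ' . strlen($body));
echo $body;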

Off the top of my head, you could use two ajax requests: one to start the job and wait for it to complete, and another to check on the job's progress. I'm pretty sure most browsers can run at least two ajax requests at a time.
The PHP script that's actually doing the job (let's call it job.php) can update the session variable $_SESSION['job_progress'] with the percentage of the job that is complete.
You have another PHP script (let's call it progress.php) that echoes that value, i.e.
<?php session_start(); echo $_SESSION['job_progress'];
Client-side, you fire off your ajax request to job.php. You have another ajax request to progress.php that runs every 3 seconds, and you update your progress bar with the value returned.
You could also do this with one ajax request if the request to job.php returns before the job is finished. Then you can keep using a single ajax request to ping the progress.php script.
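A rough sketch of the two scripts (my own illustration, not tested; I've added session_write_close() so the progress request isn't blocked by the session lock while the job runs, and do_work_step() and $steps are made-up placeholders):
<?php
// job.php - sketch only
session_start();
$_SESSION['job_progress'] = 0;
session_write_close();                       // release the session lock so progress.php can read

$steps = 20;                                 // hypothetical number of work units
for ($i = 1; $i <= $steps; $i++) {
    do_work_step($i);                        // hypothetical function doing one chunk of the job

    session_start();                         // reopen the session, update progress, close again
    $_SESSION['job_progress'] = round($i / $steps * 100);
    session_write_close();
}
echo 'done';

<?php
// progress.php - sketch only
session_start();
echo isset($_SESSION['job_progress']) ? $_SESSION['job_progress'] : 0;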

In some browsers (Firefox for one), onreadystatechange with readyState 3 (i.e. loading) is invoked multiple times, so download progress can be monitored.
Also, in some browsers the responseText property contains the result returned so far, and it can be examined to get some idea of the progress.
However, Internet Explorer (at least IE7; not sure about later versions) does not support this, and it is an error to query responseText or responseBody at readyState 3. I have also heard that IE only calls onreadystatechange once with readyState 3, which would make it pretty useless for your purpose, but I would suggest testing it out.

Create a session. (Probably, you already have one.)
Create a UID for the query when you request it to start processing.
Store the progress somewhere on the server (in a database, in a file, etc.) together with the SID+UID as the query progresses.
Use a second ajax request with a timer to poll the progress by SID+UID.
*You can probably get by with only the UID, but I've found it more manageable when you can also monitor tasks by user/session.
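For example, the worker could drop its progress into a small file named after SID+UID, and the polling script could read it back (a sketch of mine with made-up helper names; a database row works just as well):
<?php
// Sketch: store/read progress keyed by session id + task uid.
function progress_file($sid, $uid) {
    // basename() keeps the ids from escaping the temp directory
    return sys_get_temp_dir() . '/progress_' . basename($sid) . '_' . basename($uid);
}

function write_progress($sid, $uid, $percent) {        // called by the worker as it goes
    file_put_contents(progress_file($sid, $uid), (string) $percent);
}

function read_progress($sid, $uid) {                   // called by the polling ajax handler
    $file = progress_file($sid, $uid);
    return is_file($file) ? (int) file_get_contents($file) : 0;
}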

Related

Getting real time feedback from a server process [in PHP]

Requirement:
I need to run a background process (per a user request) that takes about 30 to 60 seconds to complete. I'd like to give the user some status feedback. Note: Toly is right, 'Background' is not required.
What's working:
The process prints about 20 status messages during this time, and I retrieve them with proc_open, listening on a read pipe using fgets. I save those messages into a session var, and using timestamps (to help debug) I can see that the session array is being written to with these messages as the process progresses.
The Trouble:
My plan was to poll the server with ajax calls (every second) to retrieve these session vars for display in the DOM. The bottleneck seems to be that the server cannot service the ajax request while it's still running the background process; everything dumps out at once when the background process completes. From what I can tell, the issue is not with output buffering, because the (debugging) timestamps saved with each process message show the server is writing to the session var sequentially, so I know the proc_open and pipe reads are working as I expect. The issue appears to be that the server cannot give the AJAX request its JSON object until it is done with the process, or, probably more accurately, done with the loop that is reading the pipe.
Obvious Misconception:
I thought sending a process to the background (using &) might give me a solution here. Apparently I do not know the difference between a background process and a forked process. What benefit is gained - if any - by running a process in the background when doing so appears to make no difference to me in this scenario?
Possible Solutions:
- I do not expect the user-initiated process that runs this process/scenario to be that heavy, but if there's something I can build into this solution that would help under heavy load, then I would like to do that now.
- Is this a multi-threading (pthreads) or a multi-process (fork) solution?
- Or should I save a process id, let go of polling it with a while( .. fgets ..) statement, and then come back to the process after the server has serviced the ajax request?
- I suppose I could run fake status messages and then respond accurately when the results come back after completion. The time to process the request is not dependent upon the user, so my fake timing could be pretty accurate. However, I would like to know what the solution would be to provide real-time feedback.
After googling for a day for a technique to get the same behavior you are describing here, I came up with an easy solution for my project.
A bit of important theory:
- session_start () and a set like $_SESSION["x"] = "y" will always lock the session file.
Case scenario:
- A - process.php - running through an ajax call
- B - get_session.php - a second ajax call;
The main problem is/was that even if you set a $_SESSION value inside a process that is being run through an AJAX call, the second request always has to wait for the session file to be unlocked, which ends up synchronizing the two processes (A + B) - both finishing at the same time!
So, the easiest way to fix this matter and get a good result is by using session_write_close() after each set. E.g.:
$_SESSION["A"] = "B";
$_SESSION["x"] = "y";
session_write_close();
PS: The best approach is to have a custom set of functions to handle the sessions.
Sorry for the mark-up; I just created a Stack account.
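Building on that PS, a minimal sketch of such a helper (my illustration, not the original poster's code; it assumes PHP 5.4+ for session_status() and that no output has been sent yet):
<?php
// Sketch: set a session value and release the session lock immediately,
// so a parallel request to get_session.php isn't blocked.
function session_set($key, $value) {
    if (session_status() !== PHP_SESSION_ACTIVE) {
        session_start();            // reopen the session just for this write
    }
    $_SESSION[$key] = $value;
    session_write_close();          // release the lock right away
}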
Why would you think that you need a background process? Also, where did you get the idea that you needed one?
A normal PHP script, with a sufficient timeout set and the flush() function used at every step, will give you the output you need for your AJAX.
What's even easier, since you use sessions: an AJAX request to a separate handler that just checks what's in the session and, if there is something new, returns the new part.
inside process.php
session_start();
$_SESSION['progress'] = array();
$_SESSION['progress'][] = 'Done 5%';
// complete some commands
$_SESSION['progress'][] = 'Done 10%';
inside ajax.php
session_start();
if (count($_SESSION['progress']) > $_GET['laststep']) {
    // echo the new messages
    echo implode("\n", array_slice($_SESSION['progress'], (int) $_GET['laststep']));
}
inside your normal page
$.ajax({ url: 'ajax.php', type: 'GET', data: { laststep: 1 }, success: function (data) { show(data); } });
Something like that should work.
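For completeness, the flush()-every-step idea from the start of this answer could look roughly like this on the PHP side (a sketch only; output buffering, gzip, or a proxy in between can still delay what the browser sees):
<?php
// process.php - sketch: stream progress lines as they happen
header('Content-Type: text/plain');
while (ob_get_level() > 0) {        // drop any output buffers so each line is sent immediately
    ob_end_flush();
}

echo "Done 5%\n";
flush();
// ... complete some commands ...
echo "Done 10%\n";
flush();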

Determinate update time

I'm currently in the process of making a forum system which loads new posts/edits without having to refresh the page. Now, for the older browsers which don't have an implementation of EventSource/WebSocket, I'd like to offer another option:
Every X seconds I GET a PHP page which echoes the five latest news items. Afterwards, I simply check which of those items the client hasn't seen yet and apply the changes to the page.
Now, my problem is: how would you determine the X interval at which the client retrieves new updates? I'd like to base it on the user's connection so that it doesn't kill off their connection completely.
What would be your attempt at accomplishing this?
I would use the long polling technique through AJAX in your case:
1) The client sends an AJAX HTTP request to the server.
2) If there is data available, the server sends a response to the client; otherwise, instead of sending an empty response immediately, the server holds the request and waits for information to become available (or for a suitable timeout event - for example, every 25 seconds), after which a complete response is finally sent to the client.
3) After receiving the HTTP response, the client immediately sends another HTTP request to the server.
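A bare-bones PHP handler for step 2 might look like this (a sketch only; has_new_posts() and get_new_posts() are hypothetical functions you'd implement against your own data):
<?php
// longpoll.php - sketch: hold the request until there is news or ~25 seconds pass
set_time_limit(40);                          // a bit longer than the poll timeout

$since   = isset($_GET['since']) ? (int) $_GET['since'] : 0;
$started = time();

while (time() - $started < 25) {             // the 25-second timeout described above
    if (has_new_posts($since)) {             // hypothetical check
        echo json_encode(get_new_posts($since));   // hypothetical fetch
        exit;
    }
    usleep(500000);                          // wait half a second before checking again
}
echo json_encode(array());                   // timeout: send an empty response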
I would do the following (code not tested, but you should get the idea). Use jQuery for simpler code.
function refreshNews() {
    $.ajax({
        url: "ajax-url"
    }).done(function (data) {
        /** add code here */
        setTimeout(function () { refreshNews(); }, 30000); // 30 secs should be enough to read some headlines
    });
}
refreshNews();
This way the refreshNews() function is only called after the data is received and shown to the user.
Just an idea: make an HTTP request, measure how long it takes, and use that as the base interval! I'd repeat the measurement, say, every 10 minutes to show how much I'm thinking about my clients!
I think it will be more resource-friendly on the server side compared to long polling, especially for scripts like forums where people won't leave the page for hours. :)

Percentage of process

I do a post with jQuery to a PHP script that does some tasks.
$.post("script.php", params, function(data) { /* task done */ });
In script.php I know at each moment what percentage of the complete task has been done. At the end, PHP writes a json_encode()'d string with the data, which the page receives when the $.post call completes.
What I'd like is for script.php to tell the page (for example, every second) the percentage of the task that has been done, so that it can be printed on screen.
Is that possible?
You can store the status of your PHP script server-side with memcached or even a single text file.
Create a PHP script that reads out this status and returns its value.
Now start your script, and also start the status checking, which you can run every 1000 ms.
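For the single-text-file variant, the two sides can be as simple as this (a sketch; the file path, the status.php name, and $percentDone are made up, and memcached would just replace the file read/write):
<?php
// In script.php, after each chunk of work (sketch):
file_put_contents('/tmp/script-status.txt', $percentDone);   // $percentDone is whatever you track

// In status.php, the script your 1000 ms poll requests (sketch):
echo is_file('/tmp/script-status.txt')
    ? (int) file_get_contents('/tmp/script-status.txt')
    : 0;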
It's technically possible, with a keep-alive connection, but it's complicated to implement.

How to break out of a hanging ajax call and load another page

I'm developing an invoice app. Currently I'm using OO PHP to build invoice objects.
The objects themselves contain objects for customer, products, invoice details, and firm.
Now I'm working on a page to make an overview. The problem that occurred was that when I had too many invoices (tested with only 1500 dummy invoices, which in time could be a lot more), building the PHP objects took around 7 seconds. I feel this is way too long, since this is only for one request. Also, since PHP runs server-side, the page didn't load anything before the objects were all built.
I was staring at an empty screen for 7 seconds and then got everything in an instant (all on localhost, so online it would be worse).
Since the page needs more functionality than just being an overview (i.e. creating new invoices, using filters to narrow the invoices shown), and I don't want the user to have to wait for the invoices to be built before they can use the other functionality, I changed the way the page works.
Now I first load my basic HTML structure and only then start getting my invoice data using an $.ajax() call. I built an ajax loader to notify the user that something is happening. When the call is done the data is shown, but I still have the issue that a user can't do anything. They can click a link/button or whatever, but it doesn't 'act' until my ajax calls are complete. Once the calls are complete, everything works. Clicking on a link while there is an active call does get the 'click' event registered, but the trigger only happens when the ajax is done.
The problem has nothing to do with my ajax calls being synced or not. If anyone has any suggestions on how to overcome this problem, I would much appreciate them.
My first thought was cancelling the ajax calls, but from what I've read up until now I suspect the abort() function won't get the job done.
edit:
Doing some more tryouts, I've noticed that everything works while the ajax calls are still running, except for loading a page from my own website (domain, server, or however I should call it) or doing any other ajax call that involves the same server, i.e.:
$("#dummyButton").click(function(){
window.location='http://www.google.com' //works
window.location='index.php' //doesn't work
alert("test"); //works
console.log("test"); //works
})
a href='http://www.google.com' //works
a href='index.php' //doesnt work
So my guess is the server is busy building my invoice objects, hence it won't accept a new request.
The following adds to this conclusion:
console.log("start slow call");
slow = $.ajax({
a very slow/heavy call
success:function(){
console.log('end slow call');
}
});
console.log('start fast call');
quick = $.ajax({
a very quick/lightweight call
success: function(){
console.log('end fast call');
}
});
When I do these 2 calls at the same time, the quick call won't finish until the slow one is complete:
console prints:
start slow call
start fast call
end slow call
end fast call
Doing both at the same time makes the quick call take 5 seconds (according to Firebug); doing only the quick call, it completes in 150 ms.
I'd have guessed, before all this, that multiple ajax calls would complete in parallel rather than serially.
abort() function (having my two ajax calls as globals so I can abort them):
$("a[href]").click(function () {
    slow.abort();
    quick.abort();
    window.location = "index.php";
    return false;
});
This makes the ajax calls abort, but the server still keeps executing the code from both calls, hence it will not accept my request to go to index.php until the server-side code from the ajax calls is complete.
After about 5 seconds (counting in my head) my index.php starts loading.
Therefore the following question comes to mind:
Is there any manageable way to cancel all processes the server is running for that specific client?
Other thoughts that didn't end up as the root cause:
I've already adjusted my invoice constructor, passing it a boolean to determine whether the object needs all the info or only the basics. This made my ajax call (or rather, the server-side process behind that specific ajax call) about 3 seconds shorter (on the 1500 dummy invoices). I could adjust the database, but that would mean reworking a lot of already-developed code. Or, because building all the invoice objects is the time-consuming part, I could just do it the non-OO way?
Which makes me kind of sad: this was my first real OOP project. It's easy to work with, but apparently there's a real performance cost when dealing with a decent number of objects.
Hey, I just had a really quick read of your issue, so I'm hoping I'm not way off here with my answer.
When you make two PHP requests at once and one won't finish before the other, there is a chance that the fast request cannot start the session (session_start()) until the slow request closes it.
Try closing the session, if it's not needed any more, before starting any long-running process.
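In this case that would mean something like the following at the top of the slow invoice script (a sketch; $_SESSION['user_id'] is just an example of a value the script might need):
<?php
// Sketch: read what you need from the session, then release the lock
// before the slow invoice-building work starts.
session_start();
$userId = $_SESSION['user_id'];     // hypothetical value needed later
session_write_close();              // other requests from this browser are no longer blocked

// ... the slow part: building the 1500 invoice objects ...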

Retrieve Database Value Without Page Load AJAX/PHP/Facebook Style

I am wondering if anyone could show, or explain with examples, how Facebook checks its database for new messages? It seems to do it periodically, without reloading the page. What would be the best way to achieve this in a PHP/MySQL/jQuery script?
Any help is always appreciated!
Cheers, Lea
You can do this using the jQuery Periodical Updater plugin:
<span id="inbox-title"></span>
<script>
$.PeriodicalUpdater('/path/to/service', {
method: 'get', // method; get or post
minTimeout: 1000, // starting value for the timeout in milliseconds
maxTimeout: 8000, // maximum length of time between requests
}, function(data) {
$('#inbox-title').html('you have ' + data + 'new messages');
});
</script>
Another option is to bind the onmousemove event and make the ajax call when that happens.
There is actually a "page load", but it's a hidden request that doesn't reload the displayed page. Take a look at the jQuery Ajax command documentation for more details on one of the simplest ways to accomplish this (especially since you already mentioned using jQuery).
Have a look into reverse ajax with the Comet technique; this is a perfect use for it.
The idea behind it is to start an ajax request and let it time out (after, say, 60 seconds); when it times out, start it again. This way the browser has a (nearly) persistent connection to the server. If (for a simple example) a message gets created for a user, the server can reply to one of the hanging ajax requests (in this case, the one made by the recipient of the message).
No data is transferred while the XMLHttpRequest and the server are waiting, but closing and reopening connections might be a burden on your server.
