I want to mess around with realtime information, and there is a pretty standard functionality that I want to duplicate:
It occurs here on SO when you're on a single-question view typing your answer, and an alert pops up at the top saying "there are 3 new answers, click to show"
It also occurs on Twitter "There are 5 new tweets in this search: click to update"
I'm pretty well versed in server- and client-side code, and what I'm looking for is the basic outline (not even pseudocode, but perhaps plain English) of how something like this happens.
Is there a CRON job on the server running every minute, that shoots a signal to a long-polled AJAX bit on the page?
Does the page itself poll a server?
Any and all solutions are welcome. Thanks!
You can implement that using an AJAX call that runs on the client side at a regular interval, using the JavaScript setTimeout method. You'll have a JavaScript function that calls your server-side method to check whether an update has occurred, displays any update, then calls setTimeout to schedule itself again.
For example:
function updateCheck()
{
    // make the AJAX call
    // do something if an update has occurred
    setTimeout(updateCheck, 10000); // second parameter is in milliseconds
}
Note that setTimeout is passed the function reference rather than the string "updateCheck()".
Off the top of my head, I'd do it via JavaScript - setting timeouts to poll the server. That's only an educated guess, though.
Looks like SO uses a periodical updater to make an ajax request to a url like:
https://stackoverflow.com/posts/2307584/answer-activity-heartbeat
This returns a JSON result:
{"Result":false,"Count":0}
Here's an example of the result when a new answer exists:
{"Result":true,"Count":1}
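A client-side sketch of that heartbeat pattern. The response shape matches the JSON above; the endpoint URL, banner element id, and 15-second interval are assumptions:

```javascript
// Turn a heartbeat response like {"Result":true,"Count":3} into banner
// text, or null when there is nothing new.
function heartbeatMessage(data) {
    if (!data.Result || data.Count < 1) {
        return null;
    }
    return data.Count + ' new answer' + (data.Count === 1 ? '' : 's') +
        ' - click to show';
}

// Poll the heartbeat URL and show the banner when something changed.
// Requires jQuery in a browser; '#new-answer-banner' is hypothetical.
function pollHeartbeat(url) {
    if (typeof $ === 'undefined') {
        return; // not running in a browser with jQuery
    }
    $.getJSON(url, function (data) {
        var msg = heartbeatMessage(data);
        if (msg !== null) {
            $('#new-answer-banner').text(msg).show();
        }
        // schedule the next check; the interval is an arbitrary choice
        setTimeout(function () { pollHeartbeat(url); }, 15000);
    });
}
```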
Related
I have a myAction function in some controller. It contains one class instance:
public function myAction() {
...
$myAnalyzer = new Analysis();
$myAnalyzer->analyze();
...
}
Let's say this analyze() function takes 10 minutes. That means it blocks my.phtml for 10 minutes, which is unacceptable. What I want is to render my.phtml first and then show intermediate results from analyze() on my.phtml.
function analyze() {
...
foreach($items as $rv) {
...
...
// new result should be stored in db here
}
}
As far as I know that's impossible, since there is just one thread in PHP. So I decided to make an AJAX call from my.phtml to run the myAnalyzer instance.
First question: is that right? Can I do it in myAction() without blocking?
OK, now I run myAnalyzer using some script, say worker.php, from my.phtml with the help of JavaScript or jQuery.
Second question: how can I know when each iteration of the foreach loop ends? In other words, how can I let worker.php send some signal (or event) to my.phtml or Zend Framework? I do NOT want to update my.phtml on a time basis using a JavaScript timer. That's all I need to know, since the intermediate data is supposed to be stored in the DB.
Third question: myAnalyzer must stop when the user leaves the page. For that I have this code.
window.onbeforeunload = function(e) {
// killer.php kills myAnalyzer
};
But how can JavaScript communicate with myAnalyzer? Is there something like a process ID? I mean, when worker.php runs myAnalyzer, it registers its process ID with Zend Framework, and when the user leaves the page, killer.php stops myAnalyzer using this process ID.
I appreciate the help in advance.
First Q.: Yeah, I'm afraid that is correct.
Second Q.: I do not understand what you mean here. See the code example below:
foreach ($data as $item) {
    ...
}
// code here will be executed only after the foreach loop is done
Third Q.: Take a look at this page. You can leave that setting as false (I suppose it already is) and send something to the client from time to time, or you can set it to true and check whether the user is still connected with the connection_aborted function. What I mean here is that you can run your worker.php with AJAX and configure the request so the browser will not disconnect it because of a timeout (so the connection is kept open while the user stays on the page), but it will be closed if the user leaves the page.
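A sketch of that approach: start worker.php with an XMLHttpRequest that stays open while the user is on the page, and abort it on unload so a server-side connection_aborted() check fires. Only the worker.php name comes from the question; everything else is an assumption.

```javascript
var workerXhr = null; // handle to the long-running worker request

// Kick off worker.php; the request stays open for as long as the
// server keeps the script running.
function startWorker(url) {
    if (typeof XMLHttpRequest === 'undefined') {
        return null; // browser only
    }
    workerXhr = new XMLHttpRequest();
    workerXhr.open('GET', url, true);
    workerXhr.send();
    return workerXhr;
}

// Abort the open request; with ignore_user_abort left at false, the
// PHP script stops the next time it writes output or checks
// connection_aborted().
function stopWorker() {
    if (workerXhr !== null) {
        workerXhr.abort();
        workerXhr = null;
    }
}

if (typeof window !== 'undefined') {
    window.onbeforeunload = stopWorker;
}
```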
EDIT:
About the second question. There are a few options:
1) You may use some shared memory (memcached, for instance) and call the server with another AJAX request from time to time. After each loop iteration ends, you put some value into memcached, and during each request you check that value and build the response/update your page based on it.
2) There is such a thing as a partial response. It is possible to read a piece of the response with XMLHttpRequest before it completes, but as I recall that is not really useful at the moment because it is not supported by many browsers. I have no details about this and have never tried to use it, but I know for sure that some browsers allow processing portions of a response with XMLHttpRequest.
3) You can use an invisible iframe to call your worker.php instead of XMLHttpRequest. In this case you can send a piece of HTML in which you put some JavaScript that calls a function in the parent window, and that function updates your page. This is one of the long-polling COMET implementations, if you want to look up more information. There are some pitfalls (for instance, you may need to ensure that you send a specific number of characters in the response in order to get it executed in some browsers), but it is still usable (some web-browser chats are based on this).
2) and 3) are also good because they solve your third-question problem automatically. At the same time, 1) may be simpler, but it will not solve the problem in the third question.
One more thing - as you will have a long-running script, you must remember that the session may block execution of any other requests (with the default file-based PHP session, this will happen for sure), so call session_write_close() as soon as you no longer need the session.
I'm working on a simple PHP application, using CouchDB and PHP-on-Couch to access some views, and it's working great. My next step is to introduce Ajax to update the frontend with data from the database.
I understand you can use the _changes notifications to detect any changes made to the database easily enough. So it's a matter of index.html monitoring for changes (through long polling) and calling loadView.php to update the page content.
Firstly, I hope the above is the correct method of going about it...
Secondly, when browsing to index.html, the page never seems to fully load (the page-load bar never completes). When a change is made, Firebug shows the results as expected, but not any subsequent changes. At that point, the page seems to have stopped its infinite loading.
So far, I'm using jQuery to make the Ajax call...
$.getJSON('http://localhost:5984/db?callback=?', function(db) {
console.log(db.update_seq);
$.getJSON('http://localhost:5984/db/_changes?since='+db.update_seq+'&feed=continuous&callback=?', function(changes) {
console.log(changes);
});
});
Any ideas what could be happening here?
I believe the answer is simple enough.
A longpoll query is AJAX, guaranteed to respond only once, like fetching HTML or an image. It may take a little while to respond while it waits for a change; or it may reply immediately if changes have already happened.
A continuous query is COMET. It will never "finish" the HTTP reply, it will keep the connection open forever (except for errors, crashes, etc). Every time a change happens, zoom, Couch sends it to you.
So in other words, try changing feed=longpoll to feed=continuous and see if that solves it.
For background, I suggest the CouchDB Definitive Guide on change notifications and of course the excellent Couchbase Single Server changes API documentation.
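A minimal long-poll loop along those lines, using the database URL from the question; the onChange callback (e.g. something that calls loadView.php) is left as an assumption:

```javascript
// Build the _changes URL for one long-poll request: the server holds
// the request open until something changes after `since`.
function changesUrl(base, since) {
    return base + '/_changes?feed=longpoll&since=' + since;
}

// Wait for the next change, hand it to onChange, then loop with the
// new sequence number. Requires jQuery in a browser.
function watchChanges(base, since, onChange) {
    if (typeof $ === 'undefined') {
        return; // browser + jQuery only
    }
    $.getJSON(changesUrl(base, since) + '&callback=?', function (changes) {
        onChange(changes); // e.g. refresh the view via loadView.php
        watchChanges(base, changes.last_seq, onChange);
    });
}

// usage sketch: watchChanges('http://localhost:5984/db', 0, refreshView);
```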
People,
I am developing a web page that needs to be refreshed every time the database gets an update. I already have checkDatabaseUpdate() done in my PHP code.
But now I really need some help to develop a simple comet setup: one request that waits for a response, and another that checks for updates.
Is there anybody with any simple example to help me?
Is comet the right solution for that?
Thanks,
You mean that queries (INSERT, UPDATE, DELETE) are executed against the database in the backend, and you want to refresh the user's front page when those queries are executed?
Hmm... use jQuery in a loop to "AJAX-check" for database updates in the front controller and then refresh.
function refreshPage() {
    $.get('checkModifDb.php', function(response, status) {
        if (response == 'modified') { // do the trick here - see the jQuery ajax API
            location.reload();
        }
    });
}
and then use setInterval(refreshPage, 10000); to run the function every 10 seconds, refreshing only if it finds that the db was modified.
I can't think of anything else right now, but I guess with a little modification this should do the trick. This is how twitter.com does it.
Is comet the right solution for that?
Because of the way that PHP works (having a web server daemon process incoming requests), combining it with long-polling techniques can make for an unhappy server. Each connected user is going to hold open a connection to the web server daemon. Depending on that daemon's configuration, you may find that comet is an effective denial of service attack against your own server.
You'd probably be better off with plain old short-lived ajax polling here.
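A sketch of such short-lived polling, wrapped so it can be started and stopped again (all names and the 5-second interval are arbitrary choices, not anything from the original posts):

```javascript
// Wrap setInterval/clearInterval in start/stop handles so polling can
// be cancelled, e.g. when the user navigates away.
function makePoller(fn, intervalMs) {
    var id = null;
    return {
        start: function () {
            if (id === null) {
                id = setInterval(fn, intervalMs);
            }
        },
        stop: function () {
            if (id !== null) {
                clearInterval(id);
                id = null;
            }
        },
        running: function () {
            return id !== null;
        }
    };
}

// usage sketch (browser + jQuery):
// var poller = makePoller(function () { $.getJSON('check.php', render); }, 5000);
// poller.start();
```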
I'm developing an invoice app. Currently I'm using OO PHP to build invoice objects.
The objects themselves contain objects for the customer, products, invoice details, and firm.
Now I'm working on a page to show an overview. The problem was that with too many invoices (tested with only 1500 dummy invoices, which in time could be a lot more), building the PHP objects took around 7 seconds. I feel this is way too long, since this is only one request. Also, since PHP runs server-side, the page didn't load anything before the objects were all built.
I was staring at an empty screen for 7 seconds and then got everything in an instant (all on localhost, so online it would be worse).
Since the page needs more functionality than just being an overview (i.e. creating new invoices, using filters to narrow the invoices shown), and I don't want the user to have to wait for the invoices to be built before they can use the other functionality, I changed the way the page works.
Now I first load my basic HTML structure and only then start getting my invoice data with an $.ajax() call. I built an AJAX loader to notify the user that something is happening. When the call is done, the data is shown, but I still have the issue that the user can't do anything in the meantime. They can click a link or button, but it doesn't 'act' until my AJAX calls are complete. Once the calls are complete, everything works. Clicking a link while there is an active call does get the 'click' event registered, but the handler only fires once the AJAX is done.
The problem has nothing to do with whether my AJAX calls are synchronous or not. If anyone has any suggestions on how to overcome this problem, I would much appreciate them.
My first thought was cancelling the AJAX calls, but from what I've read so far I suspect the abort() function won't get the job done.
edit:
Doing some more tests, I've noticed that everything works while the AJAX calls are still running, except for loading a page from my own website (domain, server, or whatever I should call it), or doing any other AJAX call that involves the same server.
i.e:
$("#dummyButton").click(function() {
    window.location = 'http://www.google.com'; // works
    window.location = 'index.php'; // doesn't work
    alert("test"); // works
    console.log("test"); // works
});
a href='http://www.google.com' // works
a href='index.php' // doesn't work
So my guess is the server is busy building my invoice objects, hence it won't accept a new request.
The following adds to this conclusion:
console.log("start slow call");
slow = $.ajax({
    url: 'slow.php', // a very slow/heavy call
    success: function() {
        console.log('end slow call');
    }
});
console.log('start fast call');
quick = $.ajax({
    url: 'quick.php', // a very quick/lightweight call
    success: function() {
        console.log('end fast call');
    }
});
When I do these 2 calls at the same time, the quick call won't finish until the slow one is complete:
console prints:
start slow call
start fast call
end slow call
end fast call
Doing both at the same time makes the quick call take 5 seconds (according to Firebug); doing only the quick call, it completes in 150 ms.
I'd have guessed, before all this, that multiple AJAX calls would be completed in parallel rather than serially.
abort() function (having my 2 AJAX calls as globals so I can abort them):
$("a[href]").click(function() {
    slow.abort();
    quick.abort();
    window.location = "index.php";
    return false;
});
This makes the AJAX calls abort, but the server still keeps executing the code from both calls, so it will not accept my request to go to index.php until the server-side code from the AJAX calls is complete.
After about 5 seconds (counting in my head) my index.php starts loading.
Therefore the following question comes to mind:
Is there any manageable way to cancel all processes the server is running for that specific client?
Other thoughts that didn't end up as the root cause:
I've already adjusted my invoice constructor, passing it a boolean to determine whether the object needs all the info or only the basics. That made my AJAX call (or rather, the server-side process behind that specific call) about 3 seconds shorter (on the 1500 dummy invoices). I could adjust the database, but that would mean reworking a lot of already-developed stuff. Or, because building all the invoice objects is the time-consuming part, I could just do it the non-OO way.
Which makes me kind of sad: this was my first real OOP project. It's easy to work with, but apparently there's a real cost in computation time when dealing with a decent number of objects.
Hey, I just had a really quick read of your issue, so I hope I'm not way off here with my answer.
When two PHP requests run at once and one won't finish before the other, there is a chance that the fast request cannot start the session (session_start()) until the slow request closes it.
Try closing the session with session_write_close() as soon as it is no longer needed, before starting any long process.
I am looking for a way to start a function on form submit that would not leave the browser window waiting for the result.
Example:
The user fills in the form and presses submit; the form data goes via JavaScript to the database, and a PHP function that will take several seconds starts. But I don't want the user to be left waiting for the end of that function: I would like to be able to take them to another page and leave the function doing its thing server-side.
Any thoughts?
Thanks
Thanks for all the replies...
I got the AJAX part, but I cannot make an AJAX call and then have the browser move to another page.
This is what I wanted.
-User fills form and submits
-Result from the form passed to database
-long annoying process function starts
-user carries on visiting the rest of the site, independent of the status on the "long annoying process function"
By the way, and before someone suggests it: no, it cannot be done by a cron job.
Use AJAX to call the PHP script, and at the top of the script turn on ignore_user_abort:
ignore_user_abort(true);
That way, if they navigate away from the page, the script will continue running in the background. You can also use
set_time_limit(0);
to remove the execution time limit, useful if you know your script will take a while to complete.
The most common method is:
exec("$COMMAND > /dev/null 2>&1 &");
Ah, OK. Well, you're essentially asking whether PHP supports threading, and the general answer is no... however,
there are some tricks you can perform to mimic this behaviour, one of which is highlighted above and involves forking out to a separate process on the server. This can be achieved in a number of ways, including the
exec()
method. You may also want to look here:
PHP threading
I have also seen people try to force a flush of the output buffer halfway through the script, attempting to force the response back to the client. I don't know how successful that approach is, but maybe someone else will have some information on it.
This is exactly what AJAX (shorthand for Asynchronous JavaScript + XML) is for:
AJAX Information
It allows you to write client-side code that sends asynchronous requests to your server, so that the user's browser is not interrupted by an entire page request.
There is a lot of information relating to AJAX out there on the web, so take a deep breath and get googling!
Sounds like you want to use some of the features AJAX (Asynchronous JavaScript and XML - google it) has to offer.
Basically, you would have a page with content. When a user clicks a button, JavaScript would be used to POST data to the server and begin processing. Simultaneously, that JavaScript might load a page from the server and then display it (e.g. load data, then replace the contents of a DIV with that new page).
This kind of thing is the premise behind AJAX, which you see everywhere when you have a web page doing multiple things simultaneously.
Worth noting: This doesn't mean that the script is running "in the background on the server." Your web browser is still maintaining a connection with the web server - which means that the code is running in the "background" on the client's side. And by "background" we really mean "processing the HTTP request in parallel with other HTTP requests to give the feel of a 'background' running process"
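One way to sketch the client side of this fire-and-forget pattern: POST the form data asynchronously and redirect straight away, relying on ignore_user_abort(true) server-side (as described above) to let the script finish after the client disconnects. The endpoint and page names here are assumptions:

```javascript
// Serialize a plain object into application/x-www-form-urlencoded.
function buildQuery(fields) {
    return Object.keys(fields)
        .map(function (k) {
            return encodeURIComponent(k) + '=' + encodeURIComponent(fields[k]);
        })
        .join('&');
}

// Fire the request and leave immediately; the server must be prepared
// (ignore_user_abort) to keep running after the client disconnects.
function fireAndForget(url, fields, nextPage) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', url, true); // async: do not wait for the slow task
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.send(buildQuery(fields));
    window.location = nextPage; // move the user on right away
}

// usage sketch: fireAndForget('start-long-task.php', { id: 7 }, 'thanks.php');
```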