I'm using an Ajax function that calls itself to update the information continuously. But after I let the script run for a while, the server blocked my IP because it thought I was flooding it, or something like that, I don't know. Anyway, I wonder if there's a more proper way to do this. Here's my code:
Ajax function:
function update_cart()
{
    var http;
    if (window.XMLHttpRequest)
        http = new XMLHttpRequest();
    else
        http = new ActiveXObject('Microsoft.XMLHTTP');

    http.onreadystatechange = function()
    {
        if ((http.readyState == 4) && (http.status == 200))
        {
            id('cart_quantity').innerHTML = parseInt(http.responseText);
            setTimeout(update_cart, 1000);
        }
    };

    http.open('GET', actual_path + 'fetch_cart_quantity.php', true);
    http.setRequestHeader("X-Requested-With", "XMLHttpRequest");
    http.send();
}
PHP script:
<?php
if($_SERVER['HTTP_X_REQUESTED_WITH'] != 'XMLHttpRequest')
{
header('Location: ./');
exit();
}
session_start();
include '../include/config.php';
include '../include/db_handler.php';
include '../include/cart_handler.php';
$cart = get_cart_quantity($_SESSION['cart_id']);
if ($cart == NULL) $cart = 0;
echo $cart;
?>
Thanks in advance for your help. Sorry that my English is not very good.
You have a couple of options here as I stated in my comment.
The first is basically to chill out with the querying. You don't need to long-poll something like this. Turn the querying down to once every 5 minutes, or just do it whenever there is an action.
You can also build a simple Comet server to do a push/pull type thing whenever updates are "pushed" down from the server. There is a pre-built one called APE: http://www.ape-project.org/
Also node.js can handle this sort of thing for you.
Also, you should probably look into your server setup; it sounds kinda weird that your server is blocking its own IP address/domain...
Probably, but I seriously think that querying your server every second is totally unnecessary and also a waste of resources (unless you have very compulsive customers). Making your script query your server every minute or so is better, and may work even better in the long term if you have several customers using this application from the same server.
If you really think this feature is necessary, a good approach would be to use push notifications; more info can be found here: PHP - Push Notifications, here: Push notification to the client browser, and here: Push notifications from server to user with PHP/JavaScript.
Why don't you just update the cart when the user performs an action, such as "Add item to shopping cart"?
That way you'd only call the server when it's actually needed.
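For example, a rough sketch that reuses the includes and get_cart_quantity() from the question (add_item.php and the add_to_cart() helper are hypothetical names): the add-to-cart request itself could return the new quantity, so the badge is only updated in response to a user action.

<?php
// Hypothetical add_item.php: perform the action, then return the new quantity
// so the client only updates the cart badge when something actually changed.
session_start();
include '../include/config.php';
include '../include/db_handler.php';
include '../include/cart_handler.php';

add_to_cart($_SESSION['cart_id'], (int) $_POST['product_id']); // hypothetical helper
echo (int) get_cart_quantity($_SESSION['cart_id']);
?>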
I have implemented long polling successfully using a normal Apache server, PHP, AJAX and JavaScript. I don't use jQuery to communicate with the server.
The problem is that the Apache server's capabilities are limited; it is not able to serve more than 5 browser tabs.
I wonder if there is any customization for Apache or for PHP to make them handle more concurrent connections? Or is there any new/smart technique to do that? What is the maximum number of threads that can be handled by a robust web server specialized in long polling?
I am not interested in WebSockets because of browser compatibility. I need something easy and robust in PHP. What is Facebook doing? I wonder how they can handle all the dynamic updates for millions of users! What products/techniques do they use?
A sample of my code:
srv_polling.php
<?php
function getResults(){..... return $result;}

// recursive function inside the server
function hasResultChanged($old, $timeStart){
    // to avoid a server timeout: give up after ~50 seconds with no change in the results
    if (abs(time() - $timeStart) > 50)
        return;
    $new = getResults();
    if ($new != $old) // get back to browser
        return true;
    else {
        $old = getResults();
        sleep(2);
        return hasResultChanged($old, $timeStart);
    }
}

$timeStart = time();
$old = getResults();
sleep(2);
hasResultChanged($old, $timeStart);
?>
// Javascript code to be executed at browser end
alert('Result has changed');
// Send AJAX request again to same page(srv_polling.php):
ajax.call({......})
Thank you for your hints! Greatly appreciated.
I am using this in my project
public function getLPollData($user, $handlerName) {
    set_time_limit(600);
    date_default_timezone_set('Europe/Berlin');
    $counterEnd = (int)$_REQUEST["counterEnd"];
    $counterStart = (int)$_REQUEST["counterStart"];
    $this->expireNotifications($counterStart, $counterEnd);
    $secCount = IDLE_WAIT;
    do {
        sleep(IDLE_TIME);
        $updates = $this->fetchAllNotifications($counterEnd);
    } while (!$updates && ($secCount--) > 0);
    header("HTTP/1.0 200");
    return sprintf('{"time" : "%s", "counter" : "%d", "start" : %d, "data" : %s}',
        date('d/m H:i:s'), $counterEnd, $counterStart, json_encode($updates));
}
It's a combination of IDLE_WAIT and IDLE_TIME (10 × 3 ≈ 30 seconds per request).
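(That is, with constants defined along these lines; the exact values here are just an assumption to match the 10 × 3 figure:)

<?php
// Assumed values: re-check for notifications every 3 seconds, at most 10 times (~30 s per request).
define('IDLE_TIME', 3);
define('IDLE_WAIT', 10);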
But I don't think your problem is on the server side. If you are opening 5-6 connections from one browser, remember that each browser has a limit on how many active connections it can have to a particular domain at a time. Try a different browser, or better, different machines, with at most two tabs per browser.
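That said, one server-side thing worth ruling out: if the long-poll script keeps the PHP session open, every other request from the same session (i.e. every other tab of the same user) blocks on the session file lock until it finishes. A minimal sketch of the usual workaround, assuming your polling script uses sessions at all:

<?php
// Read what you need from the session, then release the file lock so other
// tabs of the same user are not serialized behind this long-poll request.
session_start();
$userId = $_SESSION['user_id']; // hypothetical session value
session_write_close();

// ... the long-polling loop continues here without holding the session lock ...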
I have to scrape a website where I need to fetch multiple URLs and then process them one by one. The current process goes somewhat like this.
I fetch a base URL and get all secondary URLs from this page; then for each secondary URL I fetch that URL, process the found page, download some photos (which takes quite a long time) and store this data in the database, then fetch the next URL and repeat the process.
In this process, I think I am wasting some time fetching the secondary URL at the start of each iteration. So I am trying to fetch the next URLs in parallel while processing the first iteration.
The solution in my mind is to call a PHP script from the main process, say a downloader, which will download all the URLs (with curl_multi or wget) and store them in some database.
My questions are:
How do I call such a downloader asynchronously? I don't want my main script to wait until the downloader completes.
Where can I store the downloaded data, such as shared memory? Of course, somewhere other than the database.
Are there any chances that data gets corrupted while storing and retrieving, and how do I avoid this?
Also, please let me know if anyone has a better plan.
When I hear that someone uses curl_multi_exec, it usually turns out they just load it with, say, 100 URLs, wait until all of them complete, then process them all, and then start over with the next 100 URLs... Blame me, I was doing so too, but then I found out that it is possible to remove/add handles to curl_multi while something is still in progress, and it really saves a lot of time, especially if you reuse already open connections. I wrote a small library to handle a queue of requests with callbacks; I'm not posting the full version here, of course ("small" is still quite a bit of code), but here's a simplified version of the main thing to give you the general idea:
public function launch() {
$channels = $freeChannels = array_fill(0, $this->maxConnections, NULL);
$activeJobs = array();
$running = 0;
do {
// pick jobs for free channels:
while ( !(empty($freeChannels) || empty($this->jobQueue)) ) {
// take free channel, (re)init curl handle and let
// queued object set options
$chId = key($freeChannels);
if (empty($channels[$chId])) {
$channels[$chId] = curl_init();
}
$job = array_pop($this->jobQueue);
$job->init($channels[$chId]);
curl_multi_add_handle($this->master, $channels[$chId]);
$activeJobs[$chId] = $job;
unset($freeChannels[$chId]);
}
$pending = count($activeJobs);
// launch them:
if ($pending > 0) {
while(($mrc = curl_multi_exec($this->master, $running)) == CURLM_CALL_MULTI_PERFORM);
// poke it while it wants
curl_multi_select($this->master);
// wait for some activity, don't eat CPU
while ($running < $pending && ($info = curl_multi_info_read($this->master))) {
// some connection(s) finished, locate that job and run response handler:
$pending--;
$chId = array_search($info['handle'], $channels);
$content = curl_multi_getcontent($channels[$chId]);
curl_multi_remove_handle($this->master, $channels[$chId]);
$freeChannels[$chId] = NULL;
// free up this channel
if ( !array_key_exists($chId, $activeJobs) ) {
// impossible, but...
continue;
}
$activeJobs[$chId]->onComplete($content);
unset($activeJobs[$chId]);
}
}
} while ( ($running > 0 && $mrc == CURLM_OK) || !empty($this->jobQueue) );
}
In my version, jobs are actually instances of a separate class, not of controllers or models. They just handle setting cURL options, parsing the response and calling a given onComplete callback.
With this structure new requests will start as soon as something out of the pool finishes.
Of course it doesn't really help if it's not just the retrieving that takes time, but the processing as well... And it isn't true parallel handling. But I still hope it helps. :)
P.S. It did the trick for me. :) A once 8-hour job now completes in 3-4 minutes using a pool of 50 connections. Can't describe that feeling. :) I didn't really expect it to work as planned, because with PHP it rarely works exactly as supposed... That was like "ok, hope it finishes in at least an hour... Wha... Wait... Already?! 8-O"
You can use curl_multi: http://www.somacon.com/p537.php
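For reference, a minimal sketch of the basic curl_multi flow (fetch a fixed batch of URLs in parallel, then process them all), in contrast to the rolling-queue version above; the URLs are placeholders:

<?php
$urls = array('http://example.com/a', 'http://example.com/b'); // placeholder URLs

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Drive all transfers until none are still running, waiting instead of spinning the CPU.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status == CURLM_OK);

$results = array();
foreach ($handles as $i => $ch) {
    $results[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);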
You may also want to consider doing this client-side, using JavaScript.
Another solution is to write a hunter/gatherer that you submit an array of URLs to, then it does the parallel work and returns a JSON array after it's completed.
Put another way: if you had 100 URLs you could POST that array (probably as JSON as well) to mysite.tld/huntergatherer - it does whatever it wants in whatever language you want and just returns JSON.
Aside from the curl multi solution, another one is just having a batch of gearman workers. If you go this route, I've found supervisord a nice way to start a load of daemon workers.
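A rough sketch of that setup, assuming the PECL gearman extension and a gearmand running on the default port; the 'fetch_url' job name and the file layout are made up for illustration:

<?php
// worker.php - run several of these under supervisord; each one blocks waiting for jobs.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('fetch_url', function (GearmanJob $job) {
    $url  = $job->workload();
    $body = file_get_contents($url); // or cURL; store the result in your DB here
    return strlen($body);
});
while ($worker->work());

In the main script you would then just queue the URLs and move on immediately:

<?php
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
foreach ($urls as $url) {
    $client->doBackground('fetch_url', $url); // fire and forget
}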
Things you should look at in addition to CURL multi:
Non-blocking streams (example: PHP-MIO)
ZeroMQ for spawning off many workers that do requests asynchronously
While node.js, ruby EventMachine or similar tools are quite great for doing this stuff, the things I mentioned make it fairly easy in PHP too.
Try executing python-pycurl scripts from PHP. It's easier and faster than PHP's curl.
I have a PHP script (let's call it execute.php) that draws the whole page (HTML tags and body tags etc.) at the beginning and, after that, executes some commands (C++ programs) in the background. It then waits for these programs to terminate (some depend on the results of others, so they may be executed sequentially), and then has a JavaScript snippet that auto-submits a form to another PHP script (which we will call results.php), because results.php needs the POST information from the previous script.
execute.php:
<?php
print"
<html>
<body>
Some HTML code here
</body>
</html>
";
// Here come some C++-program calls
$pid_program1 = run_in_background($program1);
$pid_program2 = run_in_background($program2);
while (is_running($pid_program1) or is_running($pid_program2) )
{
//echo(".");
sleep(1);
}
// Here come some later C++-program calls that execute quickly
$pid_program3 = run_in_background($program3);
$pid_program4 = run_in_background($program4);
while (is_running($pid_program3) or is_running($pid_program4) )
{
sleep(1);
}
...
// We are now finished
print "
<form action=\"results.php\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
<script type=\"text/javascript\">
AutoSubmitForm( 'go_to_results' );
</script>
";
This works nicely if the C++ programs 1 and 2 execute quickly. However, when they take their time (around 25 minutes in total), the PHP script seems to fail to continue. Interestingly the C++ programs 3 and 4 are nevertheless executed and produce the expected outputs etc.
However, when I put an echo("."); in the first while loop before the sleep(), it works and continues until the JavaScript autosubmit.
So it seems to me that the remaining PHP code (including the autosubmit) is, for whatever reason, not sent when there is no output in the first while loop.
I have also tried using set_time_limit(0) and ignore_user_abort(true) and various other things, like writing to an output buffer (I don't want to clutter the already fully displayed webpage) instead of the echo, but none of these work.
When I run the same scripts on a machine with multiple cores, so that program1 and 2 can be executed in parallel, it also works, without the echo(".").
So I am currently very confused and can't find any error messages in the apache log or PHP log and thus would really appreciate your thoughts on this one.
EDIT
Thanks again for your suggestions so far.
I have now adopted a solution involving (really simple) AJAX and it's definitely nicer this way.
However, if the C++ program executions take "longer", it does not auto-submit to the results page, which this time is actually created (it failed to do so before).
Basically what I have done is:
process.php:
<?php
$params = "someparam=1";
?>
<html>
<body>
<script type="text/javascript">
function run_analyses(params){
// Use AJAX to execute the programs independently in the background
// Allows the user to close the process page and come back later to the results link, without needing to wait.
if (window.XMLHttpRequest)
{
http_request = new XMLHttpRequest();
}
else
{
//Fallback for IE5 and IE6, as these don't support the above writing/code
http_request = new ActiveXObject("Microsoft.XMLHTTP");
}
//Is http_request still false
if (!http_request)
{
alert('Giving up :( Cannot create an XMLHTTP instance');
}
http_request.onreadystatechange=function(){
if (http_request.readyState==4 && http_request.status==200){
// Maybe used to display the progress of the execution
//document.getElementById("output").innerHTML=http_request.responseText;
// Call of programs is finished -> Go to the results-page
document.getElementById( "go_to_results" ).submit();
}
};
http_request.open("POST","execute.php",true);
//Send the proper header information along with the request
http_request.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
http_request.setRequestHeader("Content-length", params.length);
http_request.setRequestHeader("Connection", "close");
http_request.send(params);
};
</script>
<?php
// Do some HTML-markup
...
// Start the programs!
print "
<script type=\"text/javascript\">
run_analyses('".$params."');
</script>
<form action=\"results.html\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
";
?>
</body>
</html>
and execute.php contains the C++ program calls, the waiting routines and finally, via include("results.php"), the creation of the results page.
Again, for "not so long" program executions, the autosubmission works as expected, but not if it takes "longer". By "longer" I mean around 25 minutes.
I have absolutely no idea what could cause this as again, there are no error-messages to be found.
Am I missing a crucial configuration option there (apache, php, etc.)?
EDIT
As it turned out, letting the requested PHP script "echo" something repeatedly prevents the timeout. So it is basically the same as the PHP solution without AJAX, but this time, since the responseText of the AJAX request is not necessarily needed, the progress page is not cluttered and it may be used as a workaround. Still, I would not necessarily recommend it as a general solution or good practice.
It occurs to me that a better approach would be to:
Output the complete HTML page
Show a loading message to the user
Send an AJAX request to start the external program
Wait for callback (waiting for external program to finish)
Repeat steps 3 and 4 until all programs have been executed
Update the page to tell the user what is going on
Submit the form
This way, you get the HTML to the user as quickly as possible, then you execute the programs sequentially in an orderly and controlled fashion without worrying about hitting the max_execution_time threshold. This also enables you to keep your user informed - after each AJAX callback, you can tell the user that "program ABC has completed, starting DEF..." and so on.
EDIT
Per request, I'll add an outline of how this could be implemented. A caveat, too: if you are going to be adding more JavaScript-driven functionality to your page, you'll want to consider using a library like jQuery or mootools (my personal favorite). This is a decision you should make right away - if you aren't going to be doing a lot of JavaScript except this, then a library will only bloat your project, but if you are going to be adding a lot of JavaScript, you don't want to have to come back later and rewrite your code because you added a library 3/4 of the way through the project.
I've used mootools to create this demonstration, but it isn't necessary or even advisable to add mootools if this is the only thing you're going to use it for. It is simply easier for me to write an example really quick without having to stop and think :)
First, the main page. We'll call this page view.php. This should contain your initial HTML as well as the javascript that will fire off the AJAX requests. Basically, this entire jsFiddle would be view.php: http://jsfiddle.net/WPnEy/1/
Now, execute.php looks like this:
<?php
$program_name = isset($_POST['program_name']) ? $_POST['program_name'] : false;
switch ($program_name) {
case 'program1':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 1';
break;
case 'program2':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 2';
break;
case 'program3':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 3';
break;
case 'program4':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 4';
break;
default:
die(json_encode(array(
'program_name'=>'Invalid',
'status'=>'FAILED',
'error'=>true,
'error_msg'=>'Invalid program'
)));
break;
}
$pid = run_in_background($program_path);
while (is_running($pid)) {
sleep(1);
}
// check here for errors, get any error messages you might have
$error = false;
$error_msg = '';
// use this for failures that are not necessarily errors...
$status = 'OK';
die(json_encode(array(
'program_name'=>$friendly_name,
'status'=>$status,
'error'=>$error,
'error_msg'=>$error_msg
)));
execute.php would then be called once for each program. The $friendly_name variable gives you a way to send back something for the user to see. The switch statement there makes sure that the script isn't being asked to execute anything you aren't expecting. The program is executed, and when it is done you send along a little package of information with the status, the friendly name, any errors, etc. This comes into the javascript on view.php, which then decides if there are more programs to run. If so, it will call execute.php again... if not, it will submit the form.
This seems rather convoluted... and very risky. Any network glitch, the user's browser closing for whatever reason, or even a firewall timing out, and this script is aborted.
Why not run the whole thing in the background?
<?php
session_start();
$_SESSION['background_run_is_done'] = false;
session_write_close(); // release session file lock
set_time_limit(0);
ignore_user_abort(true); // allow job to keep running even if client disconnects.
.... your external stuff here ...
if ($successfully_completed) {
session_start(); // re-open session file to update value
$_SESSION['background_run_is_done'] = TRUE;
}
... use curl to submit job completion post here ...
?>
This disconnects the state of the user's browser from the processing of the jobs. You then just have your client-side code ping the server occasionally to monitor the job's progress.
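A minimal sketch of such a status endpoint the client could poll (it reuses the session flag set above; the file name status.php is made up):

<?php
// status.php - only reads the flag the background script sets when it is done.
session_start();
$done = !empty($_SESSION['background_run_is_done']);
session_write_close();

header('Content-Type: application/json');
echo json_encode(array('done' => $done));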
Launching and managing multiple and long-running processes from a webserver PHP process is fraught with complications and complexity. It's also very different on different platforms (you didn't say which you are using).
Handling the invocation of these processes synchronously from the execution of your PHP is not the way to address this. You really need to run the programs in a separate session group - and use (e.g.) Ajax or Comet to poll their status.
Here is my problem: I have a script (let's call it comet.php) which is requested by an AJAX client script and waits for a change to happen, like this:
while ($no_changes) {
    usleep(100000);
    // check for changes
}
I don't like this too much; it's not very scalable and it's (IMHO) "bad practice".
I would like to improve this behaviour with a semaphore(?) or some other concurrent programming technique.
Can you please give me some tips on how to handle this? (I know it's not a short answer, but a starting point would be enough.)
Edit: what about LibEvent?
You can solve this problem using ZeroMQ.
ZeroMQ is a library that provides supercharged sockets for plugging things (threads, processes and even separate machines) together.
I assume you're trying to push data from the server to the client. Well, a good way to do that is using the EventSource API (polyfills available).
client.js
Connects to stream.php through EventSource.
var stream = new EventSource('stream.php');
stream.addEventListener('debug', function (event) {
var data = JSON.parse(event.data);
console.log([event.type, data]);
});
stream.addEventListener('message', function (event) {
var data = JSON.parse(event.data);
console.log([event.type, data]);
});
router.php
This is a long-running process that listens for incoming messages and sends them out to anyone listening.
<?php
$context = new ZMQContext();
$pull = $context->getSocket(ZMQ::SOCKET_PULL);
$pull->bind("tcp://*:5555");
$pub = $context->getSocket(ZMQ::SOCKET_PUB);
$pub->bind("tcp://*:5556");
while (true) {
$msg = $pull->recv();
echo "publishing received message $msg\n";
$pub->send($msg);
}
stream.php
Every user connecting to the site gets his own stream.php. This script is long-running and waits for any messages from the router. Once it gets a new message, it will output this message in EventSource format.
<?php
$context = new ZMQContext();
$sock = $context->getSocket(ZMQ::SOCKET_SUB);
$sock->setSockOpt(ZMQ::SOCKOPT_SUBSCRIBE, "");
$sock->connect("tcp://127.0.0.1:5556");
set_time_limit(0);
ini_set('memory_limit', '512M');
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");
while (true) {
$msg = $sock->recv();
$event = json_decode($msg, true);
if (isset($event['type'])) {
echo "event: {$event['type']}\n";
}
$data = json_encode($event['data']);
echo "data: $data\n\n";
ob_flush();
flush();
}
To send messages to all users, just send them to the router. The router will then distribute that message to all listening streams. Here's an example:
<?php
$context = new ZMQContext();
$sock = $context->getSocket(ZMQ::SOCKET_PUSH);
$sock->connect("tcp://127.0.0.1:5555");
$msg = json_encode(array('type' => 'debug', 'data' => array('foo', 'bar', 'baz')));
$sock->send($msg);
$msg = json_encode(array('data' => array('foo', 'bar', 'baz')));
$sock->send($msg);
This should prove that you do not need node.js to do realtime programming. PHP can handle it just fine.
Apart from that, socket.io is a really nice way of doing this. And you could connect socket.io to your PHP code via ZeroMQ easily.
See also
ZeroMQ
ZeroMQ PHP Bindings
ZeroMQ is the Answer - Ian Barber (Video)
socket.io
It really depends on what you are doing in your server-side script. There are some situations in which you have no option but to do what you are doing above.
However, if you are doing something which involves a call to a function that will block until something happens, you can use that instead of the usleep() call to avoid busy-waiting (which is, IMHO, the part that would be considered "bad practice").
Say you were waiting for data from a file or some other kind of stream that blocks. You could do this:
while (($str = fgets($fp)) === FALSE) continue;
// Handle the event here
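If the descriptor supports it, you can also bound the wait with a timeout; a sketch using stream_select() on a hypothetical pipe/socket in $fp, so the request returns after at most 30 seconds even if nothing happens:

<?php
$read  = array($fp);
$write = $except = array();
// Block until the stream becomes readable or 30 seconds pass - no sleeping loop.
if (stream_select($read, $write, $except, 30) > 0) {
    $str = fgets($fp);
    // handle the event here
} else {
    // timed out with no activity; return an "empty" long-poll response
}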
Really, PHP is the wrong language for doing stuff like this. But there are situations (I know because I have dealt with them myself) where PHP is the only option.
As much as I like PHP, I must say that PHP isn't the best choice for this task.
Node.js is much, much better for this kind of thing and it scales really well. It's also pretty simple to implement if you have JS knowledge.
Now, if you don't want to waste CPU cycles, you have to create a PHP script that will connect to a server of some sort on a certain port. That server should listen for connections on the chosen port, check every X amount of time for whatever you want to check (DB entries for new posts, for example), and then dispatch a message to every connected client saying that the new entry is ready.
Now, it's not that difficult to implement this event queue architecture in PHP, but it'd take you literally 5 minutes to do it with Node.js and Socket.IO, without worrying whether it'll work in the majority of browsers.
I agree with the consensus that PHP isn't the best solution here. You really need to be looking at dedicated realtime technologies to solve this asynchronous problem of delivering data from your server to your clients. It sounds like you are trying to implement HTTP long polling, which isn't an easy thing to get right cross-browser. It's been tackled numerous times by developers of Comet products, so I'd suggest you look at a Comet solution, or even better a WebSocket solution with fallback support for older browsers.
I'd suggest that you let PHP do the web application functionality that it's good at and choose a dedicated solution for your realtime, evented, asynchronous functionality.
You need a realtime library.
One example is Ratchet http://socketo.me/
The part that takes care of the pub sub is discussed at http://socketo.me/docs/wamp
The limitation here is that PHP also needs to be the one to initiate the mutable data.
In other words, this won't magically let you subscribe to MySQL updates. But if you can edit the code that writes to MySQL, then you can add the publish part there.
I am new to this site, so I really hope I will provide all the necessary information regarding my question.
I've been trying to create a "new message arrived" notification using long polling. Currently I am initiating the polling request from the window.onLoad event of each page on my site.
On the server side I have an infinite loop:
while(1){
if(NewMessageArrived($current_user))break;
sleep(10);
}
echo $newMessageCount;
On the client side I have the following (simplified) ajax functions:
function poll_new_messages(){
xmlhttp=GetXmlHttpObject();
//...
xmlhttp.onreadystatechange=got_new_message_count;
//...
xmlhttp.send();
}
function got_new_message_count(){
if (xmlhttp.readyState==4){
updateMessageCount(xmlhttp.responseText);
//...
poll_new_messages();
}
}
The problem is that with each page load, the above loop starts again. The result is multiple infinite loops for each user that eventually make my server hang.
*The NewMessageArrived() function queries the MySQL DB for new unread messages.
*At the beginning of the PHP script I run session_start() in order to obtain the $current_user value.
I am currently the only user of this site, so it is easy for me to debug this behavior by writing time() to a file inside this loop. What I see is that the file is written to more often than once every 10 seconds, but this only starts when I go from page to page.
Please let me know if any additional information might help.
Thank you.
I think I found a solution to my problem. I would appreciate it if anyone could tell me whether this is the technique that is being used in Comet, and how scalable this solution is.
I used a user-based semaphore like this:
$sem_id = sem_get($current_user);
sem_acquire($sem_id);
while(1){
if(NewMessageArrived($current_user))break;
sleep(10);
}
sem_release($sem_id);
echo $newMessageCount;
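One detail worth noting: sem_get() expects an integer System V IPC key, so if $current_user is a string you would need to derive one first, e.g.:

<?php
// Derive an integer key per user for sem_get(); crc32() is just one way to do it.
$sem_key = is_int($current_user) ? $current_user : crc32((string) $current_user);
$sem_id  = sem_get($sem_key);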
It seems common for long-polling requests to timeout after 30 seconds. So in your while loop you could echo 'CLOSE' after 30 seconds.
$timer = 0;
while (!$newMessageCount && $timer < 30) {
    $newMessageCount = NewMessageArrived($current_user);
    if (!$newMessageCount) {
        sleep(10);
        $timer += 10;
    }
}
if ($newMessageCount) {
    echo $newMessageCount;
} else {
    echo 'CLOSE';
}
In the Javascript, you can listen for the CLOSE.
function poll_new_messages(){
xmlhttp=GetXmlHttpObject();
//...
xmlhttp.onreadystatechange=got_new_message_count;
//...
xmlhttp.send();
}
function got_new_message_count(){
if (xmlhttp.readyState==4){
if(xmlhttp.responseText != 'CLOSE') {
updateMessageCount(xmlhttp.responseText);
}
//...
poll_new_messages();
}
}
Now, the PHP will return a response within 30 seconds, no matter what. If your user stays on the page and you receive a CLOSE, you just don't update the count on the page, and ask again.
If the user moves to a new page, your PHP instance will stop the loop within 30 seconds regardless, and return a response. Being on a new page, though, the XHR that cared about that connection no longer exists, so it won't start another loop.
You might try checking connection_aborted() periodically. Note that connection_aborted() might not pick up on the fact that the connection has in fact been aborted until you've written some output and done a flush().
In fact, just producing some output periodically may be sufficient for php to notice the connection close itself, and automatically kill your script.
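A rough sketch of that inside the loop from the question (assuming NewMessageArrived() returns the new-message count; in the question it is only used as a boolean). Note that whatever keep-alive output you flush becomes part of the response, so the client has to tolerate the leading whitespace:

<?php
while (!($newMessageCount = NewMessageArrived($current_user))) {
    echo ' ';  // some output is usually needed before...
    flush();   // ...connection_aborted() notices the closed connection
    if (connection_aborted()) {
        exit;  // the browser is gone, free this Apache/PHP process
    }
    sleep(10);
}
echo $newMessageCount;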