Long polling server loop - php

I am trying to make a notification system for incoming messages. For example, the user stays on the page and sees in a block that he has 2 unread messages.
My client code:
(function getmess(){
    var id = '<?=$MY_ID;?>';
    $.ajax({
        url: "notif.php",
        data: { "id": id },
        type: "GET",
        dataType: "json",
        success: function(result){
            $("#count").html(result);
        },
        complete: getmess,
        timeout: 10000
    });
})();
My server code:
<?php
$mysqli = new mysqli('localhost', 'root', '', 'lc');
if (mysqli_connect_errno()) {
    printf("error: %s\n", mysqli_connect_error());
    exit;
}
$MY_ID = $_GET['id']; // the client sends the id via GET, not POST
while (true) {
    $result = $mysqli->query("SELECT COUNT(*) FROM messages WHERE user_get='$MY_ID' AND status='0'");
    if (mysqli_num_rows($result)) {
        while ($row = mysqli_fetch_array($result)) {
            echo $row[0];
        }
        exit;
    }
    sleep(5);
}
Everything works, but I have the problem that Ajax requests are sent every second, each lasting about a second, and they overload the server.
I want each request to wait up to 10 seconds for a response from the server: on a successful response, a new request should be sent immediately; if the server has nothing to report within 10 seconds (i.e. nothing has changed in the database), the request should end and a new 10-second request should be sent.
I think something is wrong on the server side (maybe the server loop), but I don't know how to improve it.

I'm sorry to say this is not going to work, because for every user your server keeps a thread open. That's going to get really heavy really fast.
It would be better to just poll the server, e.g. once per minute, and have the server handle each request once instead of keeping it open.
It is possible though, just not with PHP. Check out http://cometd.org/ for example.

Although there are other options available, if you go with polling there are a few things you need to fix:
Don't use an interval for your polling; instead, set a timeout in the success handler of the Ajax request. That way you don't have to worry about requests overlapping.
As the JavaScript is doing the polling, don't use a loop in your PHP to poll for new database entries. Just check once and return the outcome to the browser.
Note that no change in the database is still a normal, successful database query, so you can just send back an empty result. I would return JSON with a list of database entries, or an empty list if none are found, and process these results in JavaScript.
In short: do a quick check in your database in PHP and handle the rest in JavaScript.
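For illustration, notif.php could look something like this. This is only a sketch: the prepared statement and the JSON shape are my choices; the table and column names come from the question.

<?php
// notif.php - check once and answer immediately; the JavaScript decides when to poll again.
$mysqli = new mysqli('localhost', 'root', '', 'lc');
if ($mysqli->connect_errno) {
    http_response_code(500);
    exit;
}
// Prepared statement instead of interpolating $_GET['id'] into the SQL.
$stmt = $mysqli->prepare("SELECT COUNT(*) FROM messages WHERE user_get = ? AND status = '0'");
$stmt->bind_param('s', $_GET['id']);
$stmt->execute();
$stmt->bind_result($unread);
$stmt->fetch();
// "Nothing new" is still a normal, successful response.
header('Content-Type: application/json');
echo json_encode(array('unread' => (int)$unread));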

Related

Repetitive Notif Checking triggers Error 508 (Loop Detected)

I make an AJAX call to check the DB for new notifications every 3 or 10 seconds, with the same query, from 4 different browsers at the same time. But at some point after loop 100+, the server returns Error 508 (Loop Detected). This is just a simple site, so I don't think I need a VPS.
I added a timestamp to the SELECT as a query differentiator, and tried unset, flush, mysqli_free_result, pause, mysqli_kill and mysqli_close, but the error still occurs. Entry Processes hits 20/20.
Script
var counter = 1;
var notiftimer;
$(document).ready(function() {
    ajax_loadnotifs();
});
function ajax_loadnotifs() {
    $.ajax({
        type: "post",
        url: "service.php",
        dataType: "json",
        data: { action: 'loadnotifs' },
        success: function(data, textStatus, jqXHR) {
            $("div").append($("<p>").text(counter++ + ": succeeded"));
            notiftimer = setTimeout(function() {
                ajax_loadnotifs();
            }, 3000);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            console.log(jqXHR.responseText);
        }
    });
}
service.php
$link = mysqli_connect('localhost', 'root', 'root', 'testdb');
$notifs = array();
$query = "SELECT id, message FROM notifs LIMIT 20";
if (!$temp_notifs = mysqli_query($link, $query)) {
    die(json_encode(array("errmsg" => "Selecting notifs.")));
}
while ($notif = mysqli_fetch_assoc($temp_notifs)) {
    $notifs[] = $notif;
}
mysqli_close($link);
echo json_encode($notifs);
(Screenshot: cPanel - Resource Usage Overview.)
When Entry Processes hits 20/20, I get Error 508. How do I keep the server's Entry Processes low? (Tested with 4 different browsers running until loop 100+ on shared hosting; no issue on my local computer.)
What is considered an Entry Processes?
An "Entry Process" is how many PHP scripts you have running at a single time.
Source: https://billing.stablehost.com/knowledgebase/186/What-is-considered-an-Entry-Processes.html
So the underlying problem, as you've found out, is that you eventually run too many processes at the same time. There are a few things you can do to solve the issue.
Option 1
Find a new web host. This is perhaps the simplest but also the most costly depending on what sort of financial arrangement you have with your current host. Find one that does not have this restriction.
Option 2
Increase the time between ajax requests. Why do you need to request every 3 seconds? That is a very, very short amount of time. What about 15 seconds? Or 30 seconds? Or heck, even 1 minute? Your users probably don't need their data refreshed as often as you think.
Option 3
Only perform the ajax call if the current tab/window is in focus. There's no reason to keep polling for notifications if the user isn't even looking at your page.
Check out Document.hasFocus():
https://developer.mozilla.org/en-US/docs/Web/API/Document/hasFocus
Option 4
Implement a caching layer. If you feel like you still need to request data very, very often then improve how quickly you retrieve this data. How you implement caching is up to you but in some cases even using a file write/read can reduce the amount of time and resources needed to fulfill the request.
After you get the notifications from the database, simply save the JSON into a text file and have subsequent requests served from there until the database data changes. See if this improves performance.
If you want to get even more focused on caching you can look at options like Memcached (https://en.wikipedia.org/wiki/Memcached) or Redis (https://en.wikipedia.org/wiki/Redis).
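For example, the service.php from the question could serve from a flat file and only hit the database once the cache has expired. A rough sketch; the cache path and the 10-second TTL are arbitrary choices, and the cache directory is assumed to exist and be writable:

$cacheFile = __DIR__ . '/cache/notifs.json';
$ttl = 10; // seconds to serve cached results before querying the database again
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    echo file_get_contents($cacheFile); // serve the cached copy; no DB connection needed
    exit;
}
$link = mysqli_connect('localhost', 'root', 'root', 'testdb');
$notifs = array();
$result = mysqli_query($link, "SELECT id, message FROM notifs LIMIT 20");
while ($notif = mysqli_fetch_assoc($result)) {
    $notifs[] = $notif;
}
mysqli_close($link);
$json = json_encode($notifs);
file_put_contents($cacheFile, $json); // refresh the cache for subsequent requests
echo $json;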
Try combining multiple options for even better performance!
It turns out that using https instead of http, and the AJAX 'get' method instead of 'post', prevents this error.

Long polling with Ajax with sleep()/time_sleep_until() in while() loop

I want to set up a long-polling Ajax call to check for orders in my e-commerce web app. There is a specificity in this application: customers are able to place orders in the future. As such, in the admin panel we have past orders and future orders (which can be 2 months or 20 minutes in the future).
Basically, I want the admin user in the back-end to be warned as soon as a future order comes due (the future date reaches the current time). To do this, I have the admin user make an Ajax call (as soon as they are connected to the admin) to the server to check for future orders coming due. This Ajax call is a long-polling request, as the call waits for the server to deliver a result. If the server has nothing to offer, the request stays pending until there is an order to show.
Ajax request
(function poll() {
    setTimeout(function() {
        $.ajax({
            url: '{{ path('commande_check') }}',
            method: 'post',
            success: function(r) {
                if(r.ids) alert('New order!'); // I've simplified this part of the code to keep it clean; admins are actually warned through a Node.JS server
            },
            error: function() {},
            complete: poll
        });
    }, 5000);
})();
{{ path('commande_check') }} (updated in Edit 2)
public function checkAction(Request $request)
{
    if ($request->isXmlHttpRequest()) {
        $response = new Response();
        $em = $this->getDoctrine()->getManager();
        $ids = array();
        while (!$ids) {
            $ids = $em->getRepository('PaymentBundle:Commande')->findNewestOrders(new \DateTime());
            if ($ids) {
                break;
            } else {
                time_sleep_until(time() + self::SECONDS_TO_SLEEP);
            }
        }
        if ($ids) {
            return new JsonResponse(array(
                'ids' => $ids
            ));
        }
        $response->setStatusCode(404);
        return $response;
    }
    $response = new Response();
    $response->setStatusCode(405);
    return $response;
}
findNewestOrders() method
public function findNewestOrders(\DateTime $datetime)
{
    $query = $this->createQueryBuilder('c')
        ->select('c.id')
        ->leftJoin('Kt\PaymentBundle\Entity\Paiement', 'p', \Doctrine\ORM\Query\Expr\Join::WITH, 'p.id = c.paiement')
        // where() replaces any previously set conditions, so it must come before the andWhere() calls
        ->where("DATE_FORMAT(c.date, '%Y-%m-%d %H:%i') = :date")
        ->andWhere('p.etat = 0')
        ->andWhere('c.kbis IS NULL')
        ->andWhere('c.notif = 0')
        ->setParameter('date', $datetime->format('Y-m-d H:i'))
        ->getQuery();
    return $query->getArrayResult();
}
My problem is that the alert sometimes never gets shown even though the record in the DB gets updated. The weirdest thing is that this sometimes happens even after I've left the page making the Ajax call, as if it keeps running in the background. I think the problem comes from the time_sleep_until() function. I tried sleep(self::SECONDS_TO_SLEEP), but the problem was the same.
Any help would be gladly appreciated. Thanks!
Edit 1
I sense this has something to do with the connection_status() function, as the while loop appears to continue even after the user has switched pages, causing the notif field to be updated in the background.
Edit 2
As per my answer, I've managed to work around this situation, but the problem remains. The admin does get the notification properly. However, I know the Ajax call keeps going once the request has been made.
My problem is now: could this result in a server resource overload?
I'm willing to start a bounty on this one, as I'm eager to know the best solution to achieve what I want.
I think I got it all wrong.
The intent of long-polling Ajax is not that there is only one connection that stays open, as with websockets (which is what I thought). One still has to make several requests, just far fewer than with regular polling.
Regular polling
With Ajax regular polling, one makes a request to the server every 2 or 3 seconds to get a semblance of real-time notification. This results in many Ajax calls per minute.
Long polling
As the server waits for new data before answering, one needs only a minimal number of requests per minute. Since I'm checking the database for new orders every minute, long polling would let me lower the number of requests to one per minute.
In my case
Consequently, the specificity of the application makes Ajax long polling unnecessary. Once a MySQL query has been made for a specific minute, there is no need to run the query again within the same minute. That means I can do regular polling with an interval of 60000 ms. There's also no need for sleep() or time_sleep_until().
Here's how I ended up doing it:
JS polling function
(function poll() {
    $.ajax({
        url: '{{ path('commande_check') }}',
        method: 'post',
        success: function(r) {
            if(r.ids)
                alert('New orders');
        },
        error: function() {},
        complete: function() {
            setTimeout(poll, 60000);
        }
    });
})();
{{ path('commande_check') }}
public function checkAction(Request $request)
{
    if ($request->isXmlHttpRequest()) {
        $em = $this->getDoctrine()->getManager();
        $ids = $em->getRepository('PaymentBundle:Commande')->findNewestOrders(new \DateTime());
        if ($ids) {
            return new JsonResponse(array(
                'ids' => $ids
            ));
        }
        $response = new Response();
        $response->setStatusCode(404);
        return $response;
    }
    $response = new Response();
    $response->setStatusCode(405);
    return $response;
}
As such, I end up with one request per minute that will check for new orders.
Thanks @SteveChilds, @CayceK and @KevinB for their kind advice.
In general this problem is kind of rough to answer, as we don't have a lot of information about what your other functions do, like findNewestOrders...
We can assume it pulls all new orders that have yet to be fulfilled by the admin and therefore will be displayed. However, if it only looks for orders whose date exactly matches the current time, they may never be found.
Theoretically, this will run forever if no new order is ever filed. You have no time limit on it, so the server may well see a while condition that can never become false and abort with an exceeded-execution-time error.
As per your comment:
time_sleep_until()
Returns TRUE on success or FALSE on failure.
The only way it would ever fail is if the function itself failed or some server-side issue caused a failure return. As you never officially revisit the page, and leaving the page that made the Ajax call does not submit a failure response, it should never really fail.
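That said, if the loop running on after the visitor leaves is the concern, you can make PHP notice the dead connection by pushing a byte of output between checks. A rough sketch in plain PHP rather than the Symfony controller; findNewestOrders() stands in for the repository call, and depending on output buffering you may need ob_flush() as well:

header('Content-Type: application/json');
while (true) {
    $ids = findNewestOrders(new \DateTime()); // stand-in for the repository call
    if ($ids) {
        echo json_encode(array('ids' => $ids));
        exit;
    }
    // With ignore_user_abort at its default (false), PHP terminates the script
    // as soon as it tries to flush output to a connection the client has closed.
    echo "\n"; // harmless: leading whitespace is valid before a JSON body
    flush();
    sleep(5);
}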
I think it might be wiser to look into a CRON job for this, and keep a table of incomplete orders that you query instead. The CRON job can run every minute and populate that table. The load on the server would not be that great, as it would most likely take no more than 30 seconds anyway.
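A rough sketch of that cron idea; everything here (the due_orders table, the orders columns, the credentials) is hypothetical:

<?php
// Hypothetical worker, run every minute: * * * * * php /path/to/check_orders.php
$db = new mysqli('localhost', 'root', '', 'shop');
// Copy orders whose scheduled time has arrived into a lightweight table
// that the admin page can then poll with a cheap SELECT.
$db->query("INSERT INTO due_orders (order_id)
            SELECT id FROM orders WHERE scheduled_at <= NOW() AND notified = 0");
$db->query("UPDATE orders SET notified = 1 WHERE scheduled_at <= NOW() AND notified = 0");
$db->close();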
Long polling may be a great idea for many functions, but I'm not wholly confident it is in this case. I would seriously recommend setInterval, as the load on the server and client would not be that great with a 30-second check every minute or two. That is your call in the end.
I personally would check frequently rather than have one request which runs for a long time. It's not really ideal to have long-running processes like this, as they tie up server connections; really, it's just bad practice. Plus, the browser may well time the connection out, which is why you may not be seeing the responses you expect.
Have you tried changing the Ajax call so it polls, say, every 60 seconds (or however often you want), checks for new orders since the last poll (simply keep track of the last poll time in the page / HTML5 local storage so it persists across pages, and pass it in the Ajax request as a parameter), and then simply returns an indication of whether there have been new orders?
You can then display a message if there have been new orders.
I have finally managed to overcome this bug, but without digging deeply into the problem.
I have separated the code that updates the notif field from the code that fetches new orders. That way, the while loop still goes on but cannot update the field.
The field is instead updated on success of the first Ajax call, by making a new Ajax request that updates the field. Therefore, the admin always receives the notification.
I still have to investigate at the memory/thread level to see what resources this loop consumes.
As no solution has been found despite my workaround for the initial bug, I won't accept my own answer, as the problem still remains.
Many thanks for all the help on that question.

Polling Apache server not responding

The Apache server I am using to develop my system will not respond to requests while the script that controls the polling of messages is running. This only happens at the domain level, meaning that I can send an HTTP request to any other app hosted locally and get a response. When I do eventually get a response, it arrives about a minute later.
Here is the JS:
window.fetch_messages = function ()
{
    var last_message = $("div.message:last").attr('data-ai_id');
    var last_message_status = $("p.message_status:last").text();
    var project_id = getParameterByName('project-id');
    $.ajax({
        url: '/project_messages',
        type: 'POST',
        data: { project_id: project_id, latest_message: last_message, status: last_message_status },
        timeout: 50000,
        async: true,
        success: new_messages, // this, upon completion, also resends the request
        error: function(data) { console.log(data); setTimeout(fetch_messages, 50000); } // pass the function itself, don't invoke it
    });
}; // on the page that uses this, I call this function to start polling
Here is the server side code
do {
    // Check for status change
    $status_change = $this->mentor_model->query_status($this->project_id, $this->last_message_id, $this->last_message_status, $_SESSION['user']);
    // Check for new messages
    $messages = $this->mentor_model->query_messages($this->project_id, $this->last_message_id);
    // If there is a status update or new message, stop waiting.
    if ($messages || $status_change)
        break;
    usleep(1000000); // 1 second
} while (empty($messages) && empty($status_change));
echo json_encode(array("messages" => $messages, "status" => $status_change));
exit;
While this action is running, the server takes a long time to handle any other request, whether it is a GET, a POST or another Ajax request. I've also tried changing both code sets, to no avail; as long as it's long polling, the server takes a long time to respond.
Do I have this wrong, or is there some Apache setting I'm supposed to change? I'm using XAMPP on Windows 8.1 and have also tried WAMP, with no change.
Thanks to Steven for this. The answer is taken straight from the PHP manual page for session_write_close():
You can have interesting fun debugging anything with sleep() in it if you have a session still active. For example, a page that makes an ajax request, where the ajax request polls a server-side event (and may not return immediately).
If the ajax function doesn't do session_write_close(), then your outer page will appear to hang, and opening other pages in new tabs will also stall.
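Applied to the polling endpoint above, that means releasing the session lock before entering the wait loop. A sketch only; query_messages() and the session key are stand-ins for the question's code:

session_start();
$projectId = isset($_SESSION['project_id']) ? $_SESSION['project_id'] : null; // read what the loop needs first
session_write_close(); // release the session lock; other requests stop stalling
do {
    $messages = query_messages($projectId); // stand-in for the model call
    if ($messages) {
        break;
    }
    usleep(1000000); // 1 second, as in the question
} while (true);
echo json_encode(array('messages' => $messages));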

php and ajax: show progress for long script

I have a PHP script which can take quite a long time (up to 3-5 minutes), so I would like to notify the user how it is going.
I read this question and decided to use session for keeping information about work progress.
So, I have the following instructions in php:
public function longScript()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $generatingProgressSession->unsetAll();
    ....
    $generatingProgressSession->total = $productsNumber;
    ...
    $processedProducts = 0;
    foreach ($models as $model) {
        // Do some processing
        $processedProducts++;
        $generatingProgressSession->processed = $processedProducts;
    }
}
And I have a simple script for taking the data from the session (the number of total and processed items), which returns it in JSON format.
So, here is the JS code for calling the long script:
$.ajax({
    url: 'pathToLongScript',
    data: { fileId: fileId, format: 'json' },
    dataType: 'json',
    success: function(data) {
        if (data.success) {
            if (typeof successCallback == "function")
                successCallback(data);
        }
    }
});
// Start checking progress functionality
var checkingGenerationProgress = setInterval(function() {
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: { format: 'json' },
        success: function(data) {
            console.log("Processed " + data.processed + " items of " + data.total);
            if (data.processed == data.total) {
                clearInterval(checkingGenerationProgress);
            }
        }
    });
}, 10000);
So, the long script is called via Ajax. Then after 10 seconds the checking script is called once, after 20 seconds a second time, etc.
The problem is that none of the requests to the checking script complete until the main long script completes. So what does that mean? That the long script consumes too many resources and the server cannot process any other request? Or do I have some wrong Ajax parameters?
(Screenshot: the checking requests stay pending until the long script completes.)
Update:
Here is the PHP function for checking the status:
public function checkGenerationProgressAction()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $this->view->total = $generatingProgressSession->total;
    $this->view->processed = $generatingProgressSession->processed;
}
I'm using the ZF1 ActionContext helper here, so the result of this function is a JSON object: {'total':'somevalue','processed':'another value'}
I'd
exec('nohup php ...');
the file and send it to the background. You can set points where the long-running script inserts a single value into the DB to show its progress. Then you can check every ten (or however many) seconds whether a new value has been added and inform the user. It might even be possible to inform the user while he is on another page within your project, depending on your environment.
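A sketch of that launch, assuming a CLI-runnable worker script (worker.php is made up for illustration; $fileId is the value the question's Ajax call already sends):

// In the web request: start the worker in the background and return at once.
$cmd = 'nohup php ' . escapeshellarg(__DIR__ . '/worker.php')
     . ' ' . escapeshellarg($fileId)
     . ' > /dev/null 2>&1 &';
exec($cmd);
echo json_encode(array('success' => true)); // the worker keeps running after this script exits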
Yes, it's possible that the long script hogs the entire server and any other requests made in that time have to wait their turn. Also, I would recommend not running the check script every 10 seconds regardless of whether the previous check has finished; instead, let each check trigger the next one after it has completed.
Taking your image with the pending requests as an example: instead of having 3 checking requests running at the same time, you can chain them so that at any one time only one checking request runs.
You can do this by replacing your setInterval() call with setTimeout() and re-initializing the setTimeout() after the Ajax check request has completed.
Most likely, the following calls are not completing due to session locking. When one thread has a session file open, no other PHP threads can open that same file, as it is read/write locked until the previous thread lets go of it.
Either that, or your Server OR Browser is limiting concurrent requests, and therefore waiting for this one to complete.
My solution would be to either fork or break the long-running script off somehow. Perhaps a call to exec to another script with the requisite parameters, or any way you think would work. Break the long-running script into a separate thread and return from the current one, notifying the user that the execution has begun.
The second part would be to log the progress of the script somewhere. A database, Memcache, or a file would work. Simply set a value in a pre-determined location that the follow-up calls can check on.
Note that "pre-determined" should not be the same for everyone. It should be a location that only the user's session and the worker know.
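For the file variant, a minimal sketch (the job id and file location are assumptions):

// Worker side: overwrite a per-job status file as work progresses.
$statusFile = sys_get_temp_dir() . '/progress_' . $jobId . '.json';
file_put_contents($statusFile, json_encode(array('processed' => $done, 'total' => $total)));

// Status endpoint: return whatever the worker last wrote.
header('Content-Type: application/json');
echo is_file($statusFile)
    ? file_get_contents($statusFile)
    : json_encode(array('processed' => 0, 'total' => 0));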
Can you paste the PHP of "pathToCheckingStatusFunction" here?
Also, I notice that the "pathToCheckingStatusFunction" Ajax call doesn't have dataType: "json". This could be causing a problem. Are you using $_POST['format'] anywhere?
I also recommend chaining each check after the previous one has completed. If you need help with that, I can post a solution.
Edit, add possible solution:
I'm not sure that using Zend_Session_Namespace is the right approach. I would recommend using session_start() and session_name(), and reading the variables out of $_SESSION.
Example File 1:
session_name('test');
session_start();
$_SESSION['percent'] = 0;
session_write_close(); // release the session lock so File 2 can read while this script works
...stuff...
session_start(); // reopen the session to record progress
$_SESSION['percent'] = 90;
session_write_close();
Example File 2 (get percent):
session_name('test');
session_start();
echo $_SESSION['percent'];

Delete an entry from db on PHP page exit

I have a PHP page on which a user tries to find online people.
There is a search button; clicking it makes an entry for the current user in the database, and control goes into a loop where, every 5 seconds, the database is searched to see whether a new entry has been made. If an entry is found, the partner's details are shown to the user.
I want that if the user exits or navigates away from the page before a partner is found, his entry is deleted from the DB.
I am trying to store the 'id' created for the user in a session variable, make an Ajax call and delete the entry, but somehow this concept is not working: the data is not getting deleted. Is this because of the loop which is still finding the user, or something else? I'm not able to work it out.
Can anyone tell me what is going wrong with my approach?
The code snippet I am using is below:
window.onbeforeunload = function() {
    funcDeleteonexit();
    return "Are you sure you want to navigate away?";
};
function funcDeleteonexit() {
    $.get("functions.php", {
        data: "delete"
    }, function(data, status) {
    });
}
Inside my functions.php, I have
if ($_GET) {
    if ($_GET['data'] == "delete") {
        echo deletefromDb();
    }
}
function deletefromDb() {
    $mysqli = new mysqli("localhost", "root", "", "test");
    /* check connection */
    if (mysqli_connect_errno()) {
        printf("Connect failed: %s\n", mysqli_connect_error());
        exit();
    }
    $currentid = (int)$_SESSION['currentId'];
    $query1 = "delete from test where id =" . $currentid;
    $mysqli->query($query1);
    $mysqli->close();
    return $currentid;
}
If you want something done when someone exits the page, you will in most cases run into trouble, because you are relying on a specific event being fired, and there are cases where the browser cannot fire it (e.g. a client crash).
Therefore I'd suggest a different approach: control this from the server side, and don't rely on the user's browser or input.
Firstly, you poll your data by regularly firing Ajax requests from the browser. You've not stated this, but I think you do something like this:
function yourrequest() {
    $.ajax({
        type: "GET",
        url: "your/php/script.php",
        success: function(data) {
            // do something, e.g. update a div
        }
    });
}
setInterval(yourrequest, 5000);
(Please don't tear this code sample apart, it's just to show the basics.)
On the server side, you already have the user's id somewhere in your database. Add a timestamp field to this table (call it last_request or something like that). Every time you send data to the client, check whether there are users whose last_request is older than your desired threshold, and delete those ids. No need to wait for a specific event then.
This needn't be done exactly there; you can also handle it differently, e.g. with a cleanup job cronned every 5 minutes or so, which does this separately so as not to slow down the user's request at all.
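A sketch of that cleanup, reusing the question's test table plus an assumed last_request timestamp column (the 15-second threshold is an arbitrary choice):

$mysqli = new mysqli('localhost', 'root', '', 'test');
// On every poll, refresh the caller's heartbeat ...
$stmt = $mysqli->prepare("UPDATE test SET last_request = NOW() WHERE id = ?");
$stmt->bind_param('i', $currentid); // the id stored in the session, as in the question
$stmt->execute();
// ... and sweep out entries whose last poll is older than the threshold.
$mysqli->query("DELETE FROM test WHERE last_request < NOW() - INTERVAL 15 SECOND");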
I suspect the browser is simply not sending the Ajax request, or is interrupting it.
So try making your Ajax request synchronous. For jQuery that means setting the async option to false. Note that $.get() does not take options (the async : false above would just be sent along as request data), so use $.ajax() instead:
$.ajax({
    url: "functions.php",
    data: { data: "delete" },
    async: false
});
