I make an AJAX call every 3 to 10 seconds to check the database for new notifications, running the same query from 4 different browsers at the same time. At some point after 100+ loops, the server returns Error 508 (Loop Detected). This is just a simple site, so I don't think I need a VPS.
I added a timestamp to the SELECT as a query differentiator, and tried unset, flush, mysqli_free_result, pauses, mysqli_kill and mysqli_close, but the error still occurs. Entry Processes hits 20/20.
Script
var counter = 1;
var notiftimer;

$(document).ready(function() {
    ajax_loadnotifs();
});

function ajax_loadnotifs() {
    $.ajax({
        type: "post",
        url: "service.php",
        dataType: "json",
        data: { action: 'loadnotifs' },
        success: function(data, textStatus, jqXHR) {
            $("div").append($("<p>").text(counter++ + ": succeeded"));
            notiftimer = setTimeout(function() {
                ajax_loadnotifs();
            }, 3000);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            console.log(jqXHR.responseText);
        }
    });
}
service.php
<?php
$link = mysqli_connect('localhost', 'root', 'root', 'testdb');

$notifs = array();
$query = "SELECT id, message FROM notifs LIMIT 20";
if (!$temp_notifs = mysqli_query($link, $query)) {
    die(json_encode(array("errmsg" => "Selecting notifs.")));
}
while ($notif = mysqli_fetch_assoc($temp_notifs)) {
    $notifs[] = $notif;
}
mysqli_close($link);

echo json_encode($notifs);
cPanel - Resource Usage Overview
When Entry Processes hits 20/20, I get Error 508. How do I keep the server's Entry Processes count low? (Tested with 4 different browsers, all run until 100+ loops, on shared hosting. No issue on my local machine.)
What is considered an Entry Processes?
An "Entry Process" is how many PHP scripts you have running at a single time.
Source: https://billing.stablehost.com/knowledgebase/186/What-is-considered-an-Entry-Processes.html
So the underlying problem, as you've found out, is that eventually you are running too many processes at the same time. There are a few things you can do to solve the issue.
Option 1
Find a new web host. This is perhaps the simplest option, but also potentially the most costly, depending on your financial arrangement with your current host. Find one that does not have this restriction.
Option 2
Increase the time between ajax requests. Why do you need to request every 3 seconds? That is a very, very short amount of time. What about 15 seconds? Or 30 seconds? Or heck, even 1 minute? Your users probably don't need their data refreshed as often as you think.
Option 3
Only perform the ajax call if the current tab/window is in focus. There's no reason to keep polling for notifications if the user isn't even looking at your page.
Check out Document.hasFocus():
https://developer.mozilla.org/en-US/docs/Web/API/Document/hasFocus
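A minimal sketch of that guard (hedged: `pollFn` stands in for the `ajax_loadnotifs()` call from the question, and the interval is arbitrary):

```javascript
// Run pollFn only while the page has focus; when the tab is in the
// background, skip the request entirely and check again after `interval` ms.
function pollWhenFocused(pollFn, interval) {
    // Outside a browser there is no document, so default to "focused".
    var focused = (typeof document === "undefined") || document.hasFocus();
    if (!focused) {
        setTimeout(function () { pollWhenFocused(pollFn, interval); }, interval);
        return false; // skipped this round: no server round-trip, no Entry Process
    }
    pollFn(); // pollFn reschedules itself, as in the question's success handler
    return true;
}
```

In the question's code you would call `pollWhenFocused(ajax_loadnotifs, 3000)` from the `setTimeout` callback instead of calling `ajax_loadnotifs()` directly.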
Option 4
Implement a caching layer. If you feel like you still need to request data very, very often then improve how quickly you retrieve this data. How you implement caching is up to you but in some cases even using a file write/read can reduce the amount of time and resources needed to fulfill the request.
After you get the notifications from the database, simply save the JSON to a text file and have subsequent requests served from there until the database data changes. See if this improves performance.
If you want to get even more focused on caching you can look at options like Memcached (https://en.wikipedia.org/wiki/Memcached) or Redis (https://en.wikipedia.org/wiki/Redis).
Try combining multiple options for even better performance!
It turns out that using https instead of http, and the AJAX 'get' method instead of 'post', prevented this error.
Related
I want to set up a long-polling Ajax call to check for orders in my e-commerce web app. A specificity of this application is that customers are able to place orders in the future. As such, in the admin panel we have past orders and future orders (which can be 2 months or 20 minutes in the future).
Basically, I want the admin user in the back-end to be warned as soon as a future order comes due (its future date reaches the current time). To do this, the admin user's browser makes an Ajax call (as soon as they are connected to the admin) to the server to check for future orders coming due. This Ajax call is a long-polling request, as the call waits for the server to deliver a result. If the server has nothing to offer, the request stays pending until there is an order to show.
Ajax request
(function poll() {
    setTimeout(function() {
        $.ajax({
            url: '{{ path('commande_check') }}',
            method: 'post',
            success: function(r) {
                if (r.ids) alert('New order!'); // simplified for clarity; admins are actually warned through a Node.js server
            },
            error: function() {},
            complete: poll
        });
    }, 5000);
})();
{{ path('commande_check') }} (edited; see Edit 2)
public function checkAction(Request $request)
{
    if ($request->isXmlHttpRequest())
    {
        $response = new Response();
        $em = $this->getDoctrine()->getManager();
        $ids = array();
        while (!$ids)
        {
            $ids = $em->getRepository('PaymentBundle:Commande')->findNewestOrders(new \DateTime());
            if ($ids)
                break;
            else
                time_sleep_until(time() + self::SECONDS_TO_SLEEP);
        }
        if ($ids)
        {
            return new JsonResponse(array(
                'ids' => $ids
            ));
        }
        $response->setStatusCode(404);
        return $response;
    }
    $response = new Response();
    $response->setStatusCode(405);
    return $response;
}
findNewestOrders() method
public function findNewestOrders(\DateTime $datetime)
{
    $query = $this->createQueryBuilder('c')
        ->select('c.id')
        ->leftJoin('Kt\PaymentBundle\Entity\Paiement', 'p', \Doctrine\ORM\Query\Expr\Join::WITH, 'p.id = c.paiement')
        ->andWhere('p.etat = 0')
        ->where("DATE_FORMAT(c.date, '%Y-%m-%d %H:%i') = :date")
        ->setParameter('date', $datetime->format('Y-m-d H:i'))
        ->andWhere('c.kbis IS NULL')
        ->andWhere('c.notif = 0')
        ->getQuery();
    return $query->getArrayResult();
}
My problem is that the alert sometimes never gets shown even though the record in the DB gets updated. The weirdest thing is that this sometimes happens even after I've left the page making the Ajax call, as if it kept running in the background. I think the problem comes from the time_sleep_until() function. I tried sleep(self::SECONDS_TO_SLEEP) instead, but the problem was the same.
Any help would be gladly appreciated. Thanks!
Edit 1
I sense this has something to do with the connection_status() function, as the while loop appears to continue even after the user has switched pages, causing the notif field to be updated in the background.
Edit 2
As per my answer, I've managed to work around this situation, but the problem still remains. The admin does get the notification properly. However, I know the Ajax call keeps going once the request has been made.
My problem is now: could this result in a server resource overload?
I'm willing to start a bounty on this one as I'm eager to know the best solution to achieve what I want.
I think I got it all wrong.
The intent of long-polling Ajax is not that there is only one connection that stays open, as with websockets (which is what I thought). One still has to make several requests, just far fewer than with regular polling.
Regular polling
With regular Ajax polling, one makes a request to the server every 2 or 3 seconds to get a semblance of real-time notification. This results in many Ajax calls per minute.
Long polling
As the server waits for new data before responding to the browser, only a minimal number of requests per minute is needed. Since I'm checking the database for new orders every minute, long polling would let me lower the number of requests to 1 per minute.
In my case
Consequently, the specificity of the application makes Ajax long-polling unnecessary. Once a MySQL query has been made for a given minute, there is no need to run it again within the same minute. That means I can do regular polling with an interval of 60000 ms. There's also no need for sleep() or time_sleep_until().
Here's how I ended up doing it:
JS polling function
(function poll() {
    $.ajax({
        url: '{{ path('commande_check') }}',
        method: 'post',
        success: function(r) {
            if (r.ids)
                alert('New orders');
        },
        error: function() {},
        complete: function() {
            setTimeout(poll, 60000);
        }
    });
})();
{{ path('commande_check') }}
public function checkAction(Request $request)
{
    if ($request->isXmlHttpRequest())
    {
        $em = $this->getDoctrine()->getManager();
        $ids = $em->getRepository('PaymentBundle:Commande')->findNewestOrders(new \DateTime());
        if ($ids)
        {
            return new JsonResponse(array(
                'ids' => $ids
            ));
        }
        $response = new Response();
        $response->setStatusCode(404);
        return $response;
    }
    $response = new Response();
    $response->setStatusCode(405);
    return $response;
}
As such, I end up with one request per minute that will check for new orders.
Thanks to @SteveChilds, @CayceK and @KevinB for their kind advice.
In general this problem is kind of rough to call, as we don't have a lot of information about what your other functions, like findNewestOrders, actually do.
We can assume it pulls all new orders that have yet to be fulfilled by the admin and therefore should be displayed. However, if it only matches orders whose date is exactly equal to the current minute, some orders may never be found.
Theoretically, the loop will run forever if no new order is ever filed. You have no time limit on it, so it's possible the server decides the while condition will never become false and aborts the request for exceeding the maximum execution time.
As per your comment
time_sleep_until
Returns TRUE on success or FALSE on failure.
The only way it would ever fail is if the function itself failed or some server-side issue caused a failure return. Since you never officially revisit the page, and leaving your ajax'd page doesn't submit a failure response, it should never really fail.
I think it would be wiser to set up a cron job for this and keep a table of incomplete orders that you query instead. The cron job can run every minute and populate the table. The load on the server would not be that great, as each run would most likely take no more than 30 seconds anyway.
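A hedged sketch of such a crontab entry (the script path and log location are assumptions; the script itself would run the expensive query and flag due orders in a table that checkAction() then reads with a cheap SELECT):

```
# m h dom mon dow  command -- run once a minute
* * * * * php /var/www/app/bin/populate_due_orders.php >> /var/log/due_orders.log 2>&1
```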
Long-polling may be a great idea for many uses, but I'm not wholly confident it is here. I would seriously recommend setInterval instead, as the load on the server and client would not be that great with a quick call every minute or two. That is your call in the end.
I personally would check frequently rather than have one request which runs for a long time - it's not really ideal to have long-running processes like this, as they tie up server connections and, really, it's just bad practice. Plus the browser may well time the connection out, which is why you may not be seeing the responses you expect.
Have you tried changing the ajax call so it fires, say, every 60 seconds (or however often you want), checks for new orders since the last time it polled (simply keep track of this in the page / HTML5 local storage so it persists across pages, and pass it in the ajax request as a parameter), and then simply returns an indication of whether or not there have been new orders?
You can then display a message if there have been new orders.
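That bookkeeping can be sketched like this (hedged: the storage object is injected so the snippet also runs outside a browser; in the page you would pass `window.localStorage`, and the `'lastOrderCheck'` key is an arbitrary name):

```javascript
// Build the parameters for the next poll: send the timestamp of the previous
// check so the server only has to look at orders newer than it, then record
// the current time for the following request.
function buildPollParams(storage, nowMs) {
    var last = storage.getItem("lastOrderCheck"); // null on the very first poll
    storage.setItem("lastOrderCheck", String(nowMs));
    return { since: last === null ? 0 : Number(last) };
}
```

The returned object would go into the `data` option of the `$.ajax` call, and the PHP side would compare `since` against the order dates.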
I have finally managed to work around this bug, but without digging deeply into the problem.
I have separated the code that updates the notif field from the code that fetches new orders. That way, the while loop still goes on but can no longer update the field.
The field is instead updated on success of the first ajax call, by issuing a new ajax request dedicated to the update. Therefore, the admin always receives the notification.
I still have to investigate at the memory/thread level to see what resources this loop consumes.
As no real solution has been found despite my workaround for the initial bug, I won't accept my own answer, since the problem still remains.
Many thanks for all the help on that question.
I have a PHP script which can take quite a lot of time (up to 3-5 minutes), so I would like to notify the user of its progress.
I read this question and decided to use the session to keep information about the work progress.
So I have the following in PHP:
public function longScript()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $generatingProgressSession->unsetAll();
    // ....
    $generatingProgressSession->total = $productsNumber;
    // ...
    $processedProducts = 0;
    foreach ($models as $model) {
        // Do some processing
        $processedProducts++;
        $generatingProgressSession->processed = $processedProducts;
    }
}
And I have a simple script that takes the data from the session (the number of total and processed items) and returns it in JSON format.
So, here is the JS code that calls the long script:
$.ajax({
    url: 'pathToLongScript',
    data: { fileId: fileId, format: 'json' },
    dataType: 'json',
    success: function(data) {
        if (data.success) {
            if (typeof successCallback == "function")
                successCallback(data);
        }
    }
});

// Start checking progress
var checkingGenerationProgress = setInterval(function() {
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: { format: 'json' },
        success: function(data) {
            console.log("Processed " + data.processed + " items of " + data.total);
            if (data.processed == data.total) {
                clearInterval(checkingGenerationProgress);
            }
        }
    });
}, 10000);
So the long script is called via ajax. Then, after 10 seconds, the checking script is called once, after 20 seconds a second time, and so on.
The problem is that none of the requests to the checking script complete until the main long script completes. What does this mean? That the long script consumes too many resources and the server cannot process any other requests? Or do I have some wrong ajax parameters?
See image:
-----------UPD
Here is a php function for checking status:
public function checkGenerationProgressAction()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $this->view->total = $generatingProgressSession->total;
    $this->view->processed = $generatingProgressSession->processed;
}
I'm using the ZF1 ActionContext helper here, so the result of this function is a JSON object {'total':'somevalue','processed':'another value'}.
I'd run
exec('nohup php ...');
to send the script to the background. You can set points where the long-running script inserts a single value into the DB to show its progress. Then you can check every ten (or however many) seconds whether a new value has been added, and inform the user. Depending on your environment, it may even be possible to inform the user while they are on another page within your project.
Yes, it's possible that the long script hogs the entire server and any other requests made in that time are left waiting their turn. Also, I would recommend not running the check script every 10 seconds regardless of whether the previous check has finished; instead, let each check trigger the next one after it completes.
Taking your image with the pending requests as an example: instead of having 3 checking requests running at the same time, you can chain them so that only one checking request runs at any one time.
You can do this by replacing your setInterval() call with setTimeout(), and re-arming the setTimeout() after the AJAX check request has completed.
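A sketch of that chaining (hedged: `pollOnce` stands in for the `$.ajax` call and returns a promise; `maxRuns` and `onDone` exist only so the sketch terminates):

```javascript
// Chained polling: the next check is armed only after the previous request
// has completed, so two checks can never overlap the way setInterval() ticks can.
function startPolling(pollOnce, intervalMs, maxRuns, onDone) {
    let runs = 0;
    function cycle() {
        pollOnce().then(function () {
            runs += 1;
            if (runs >= maxRuns) { onDone(runs); return; } // stop the sketch
            setTimeout(cycle, intervalMs); // the jQuery equivalent is re-arming in complete:
        });
    }
    cycle();
}
```

With jQuery you get the same effect by calling `setTimeout(poll, interval)` inside the `complete:` handler, as in the accepted answer above this thread.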
Most likely, the subsequent calls are not completing due to session locking. When one thread has a session file open, no other PHP thread can open that same file, as it is read/write locked until the previous thread lets go of it.
Either that, or your Server OR Browser is limiting concurrent requests, and therefore waiting for this one to complete.
My solution would be to fork or break off the long-running script somehow. Perhaps a call to exec with another script and the requisite parameters, or whatever approach you think would work. Break the long-running work into a separate process, return from the current request, and notify the user that execution has begun.
The second part is to log the progress of the script somewhere. A database, Memcache, or a file would work. Simply set a value in a pre-determined location that the follow-up calls can check.
Note that "pre-determined" should not be the same for everyone. It should be a location that only the user's session and the worker know.
Can you paste the PHP of "pathToCheckingStatusFunction" here?
Also, I notice that the "pathToCheckingStatusFunction" ajax call doesn't have dataType: "json". This could be causing a problem. Are you using $_POST['format'] anywhere?
I also recommend chaining the checks into after the first check has completed. If you need help with that, I can post a solution.
Edit - a possible solution:
I'm not sure that using Zend_Session_Namespace is the right approach. I would recommend calling session_name() and session_start() and reading the values out of $_SESSION.
Example File 1:
session_name('test');
session_start();
$_SESSION['percent'] = 0;
/* ...stuff... */
$_SESSION['percent'] = 90;
Example File 2(get percent):
session_name('test');
session_start();
echo $_SESSION['percent'];
I am trying to make a notification system for incoming messages. For example, a user stays on the page and sees in a block that he has 2 unread messages.
My client code:
(function getmess() {
    var id = '<?=$MY_ID;?>';
    $.ajax({
        url: "notif.php",
        data: { "id": id },
        type: "GET",
        dataType: "json",
        success: function(result) {
            $("#count").html(result);
        },
        complete: getmess,
        timeout: 10000
    });
})();
My server code:
<?php
$mysqli = new mysqli('localhost', 'root', '', 'lc');
if (mysqli_connect_errno()) {
    printf("error: %s\n", mysqli_connect_error());
    exit;
}
$MY_ID = $_POST['id'];
while (true) {
    $result = $mysqli->query("SELECT COUNT(*) FROM messages WHERE user_get='$MY_ID' AND status='0' ");
    if (mysqli_num_rows($result)) {
        while ($row = mysqli_fetch_array($result)) {
            echo $row[0] . "";
        }
        exit;
    }
    sleep(5);
}
Everything works, but ajax requests are sent every second, each request takes about a second, and they overload the server.
I want the request to wait for a response from the server for up to 10 seconds: on a successful response, immediately send a new request; and if the server does not respond within 10 seconds (i.e. there is no change in the database), send a new 10-second request.
I think something is wrong on the server side (maybe the server loop), but I don't know how to improve it.
I'm sorry to say this is not going to work, because for every user your server keeps a thread open. That's going to get really heavy really fast.
It would be better to just poll the server, e.g. once per minute, and have the server handle each request once rather than keeping it open.
It is possible, though, just not with plain PHP. Check out http://cometd.org/ for example.
Although there are other options available, if you go with polling there are a few things you need to fix:
Don't use an interval for your polling; instead, arm a timeout in the success function of the ajax request. That way you don't have to worry about requests overlapping.
As the JavaScript is doing the polling, don't use a loop in your PHP to wait for new database entries. Just check once and return the outcome to the browser.
Note that no change in the database is a normal, successful database query, so you can just send back an empty result. I would return JSON with a list of database entries (or an empty list if none are found) and process these results in JavaScript.
In short: do one quick database check in PHP and handle the rest in JavaScript.
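For example, the client-side handling of an empty result can be as simple as this (hedged sketch; the `messages` field name is an assumption about the JSON shape):

```javascript
// Treat an empty result set as a normal, successful answer: zero unread
// messages. Only a malformed response is worth reporting to the caller.
function unreadCount(response) {
    if (!response || !Array.isArray(response.messages)) {
        return null; // malformed response; the caller can log it and reschedule
    }
    return response.messages.length; // 0 is a perfectly valid answer
}
```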
I am developing functionality similar to an auction, using jQuery, AJAX, PHP and mySQL.
Ajax accesses the server every second to get the most recent bid, and during this call we also get the remaining time from the server to keep all participants in sync.
I have two issues:
1) Occasionally the time remaining flickers back to the previous value for a fraction of a second. Could this be due to the asynchronous results arriving out of order?
Snippets of the relevant code:
function dotimer() {
    updateScreen();
    setTimeout(dotimer, 1000);
}

function updateScreen() {
    $.ajax({
        type: 'POST',
        url: 'getinfo.php',
        dataType: 'json',
        data: { /* various params are passed to php */ },
        success: function(data) {
            /* other info processed here... */
            $("#countdowntimer").html(data.secondsremaining);
        },
        error: function(XMLHttpRequest, textStatus, errorThrown) {}
    });
}
getinfo.php:
$return['secondsremaining'] = strtotime($end_time)-strtotime("now");
/* get other infor from database... */
echo json_encode($return);
(setTimeout and setInterval both had the same results.)
2) Is accessing the database every second excessive? I can't see an alternative that ensures the information is up to date. Is there a better way to do this?
The auction is for a relatively short period of time (30 min) and we do not expect any more than 10 participants.
Any advice/suggestions welcome, thanks!
I think that's exactly your problem. As the requests are asynchronous, you cannot control the order in which they complete. You have to synchronize your requests and avoid overlapping ones, i.e. only issue a new request when there is no pending request; otherwise you cannot control when the callback for each request fires.
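One common way to enforce that is an "in flight" flag, so a new request is only issued when the previous one has finished (hedged sketch; `sendRequest` stands in for the `$.ajax` call and invokes its callback from the `complete:` handler):

```javascript
// Skip a timer tick entirely if the previous request hasn't completed yet;
// a stale response can then never arrive after a newer one and flicker the timer.
let pending = false;
function updateScreenGuarded(sendRequest) {
    if (pending) return false; // previous request still in flight: skip this tick
    pending = true;
    sendRequest(function onComplete() { pending = false; });
    return true;
}
```

In the auction code above, `dotimer()` would call `updateScreenGuarded` each second; slow responses then cause skipped ticks rather than out-of-order updates.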
Is it possible for JavaScript to update automatically when a MySQL field is modified? I'm assuming this basically translates to some kind of constant query on a specific SQL record.
For example, let's suppose I'm making a simple multiplayer tic-tac-toe game using PHP and jQuery with a MySQL back end.
I want the tic-tac-toe page to be powered by jQuery so that the user does not have to refresh the page.
Two users get hooked up and the game begins. User 2 waits while user 1 thinks about where to put an X. When user 1 clicks a square to place an X, I'd like user 2's screen to update automatically to reflect it - without having to press any buttons to check for updates.
You could poll the server every X seconds (say, 10) via AJAX to check whether anything has changed. There's no (easy) way of pushing data to the client side.
Example code:
function checkStatus() {
    setTimeout(checkStatus, 10000);
    $.ajax({
        url: "checkStatus.php",
        success: function(data) {
            // Code to handle the change goes here
        }
    });
}

setTimeout(checkStatus, 10000);
You have two main solutions, polling or websockets (which aren't fully supported in all browsers yet, btw); both involve communicating with the backend. I'm not going to cover websockets here, but they are an up-and-coming technology that keeps an open connection from the frontend to the backend. Another option is something like Comet, which lets you keep an HTTP connection open for a long time.
The other solution is polling, in which you make an ajax request every x seconds to "poll" for changes.
This could be done using Ajax (use jQuery's Ajax). It could refresh every couple of seconds to update the page with the latest content from the database. This approach is fine on a small scale with a low number of users, but it is very draining on your server's resources, as it is constantly sending and receiving data even when no new data is available.
A better option may be to use node.js and socket.io to support large scale real time processes.
Yes - one way is to use a long-polling Comet approach. Essentially, this works by making an asynchronous (typically AJAX) request to the server and waiting for the response. The wait can last an hour, say. When a response arrives, the request 'completes' and another one is sent in the same way.
There are lots of other ways though - check out 'push technology'.
I'd been trying to use setTimeout but had no success. I used setInterval and it seems to work like a charm.
Code follows:
function waitForMsg() {
    $.ajax({
        url: "tictac_code1.php",
        type: 'POST',
        data: 'longpoll=1',
        async: true,    /* If set to non-async, the browser shows the page as "Loading..." */
        cache: false,
        timeout: 10000, /* Timeout in ms */
        success: function(data) { /* called when the request to tictac_code1.php completes */
            $('#loggedinnames').empty();
            $('#loggedinnames').append(data);
            setInterval(waitForMsg, 10000);
            //setTimeout(
            //    'waitForMsg()', /* Request next message */
            //    1000 /* ..after 1 second */
            //);
        },
        error: function(XMLHttpRequest, textStatus, errorThrown) {
            //alert("error in waitformsg.");
            addmsg("error", textStatus + " (" + errorThrown + ")");
            setInterval(waitForMsg, 10000);
            //setTimeout(
            //    'waitForMsg()', /* Try again after.. */
            //    "15000"); /* milliseconds (15 seconds) */
        }
    });
}

$(document).ready(function() {
    waitForMsg(); /* Start the initial request */
});