Ajax Loop Overwhelms PHP Script

This works great at retrieving the PHP data for, say, 15 passes, BUT when the JSON file has around 100 items it chokes the PHP script and produces random errors. My guess is that because the requests are all fired from this single AJAX loop (faster than the PHP script can process them), the PHP script is getting overwhelmed?
$(document).ready(function(){
    var ajax_load = "<div class='loadwrap'><img class='load' src='/img/load.gif' style='width:12px;' alt='' /> fetching list...</div>";
    $("#status").html(ajax_load);
    $.getJSON('/fsbo/get_urls_24_hours', function(data) {
        $("#alias").fadeOut('slow');
        var ajax_load = "<div class='loadwrap'><img class='load' src='/img/load.gif' style='width:12px;' alt='' /> fetching property...</div>";
        $('#props').html('');
        $.each(data, function(key, val) {
            $.ajax({
                type: "POST",
                url: base_url + "/fsbo/get_property",
                data: "url=" + val,
                cache: false,
                success: function(data){
                    $("<div></div>").html(data).appendTo('#props');
                }
            });
        });
    });
});
As a side note, where do I put the code that hides the loading gif? Putting it at the end of the loop does no good; it just shows and hides immediately without waiting for the data to come back.

It's generally a bad idea to make AJAX requests in a loop. Why not just modify the original call to return all of the data you want in your JSON, rather than making 100 separate calls?
If for some reason you can't avoid this, constrain the number of pending requests. Send, for example, the first 5 requests, then only send the 6th once you get a response from one of the first 5. This way only 5 requests are pending at any time and your server isn't hit with 100 all at once.
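If the per-item requests have to stay, a rough sketch of that throttling idea could look like the following. It reuses the question's /fsbo/get_property endpoint and #status/#props elements; the cap of 5 and the fetchProperties helper name are illustrative choices, not part of the original code.
function fetchProperties(urls, maxPending) {
    var index = 0;
    var pending = 0;

    function next() {
        // start new requests until the cap is reached or the list is exhausted
        while (pending < maxPending && index < urls.length) {
            var url = urls[index++];
            pending++;
            $.ajax({
                type: "POST",
                url: base_url + "/fsbo/get_property",
                data: { url: url },
                cache: false,
                success: function (html) {
                    $("<div></div>").html(html).appendTo("#props");
                },
                complete: function () {
                    pending--;
                    // a slot freed up: start the next request, or hide the
                    // loading indicator once everything has finished
                    if (index < urls.length) {
                        next();
                    } else if (pending === 0) {
                        $("#status").hide();
                    }
                }
            });
        }
    }

    next();
}

// inside the $.getJSON callback, instead of the $.each loop:
// fetchProperties(data, 5);
This also answers the side note: the loading gif is hidden only when the last pending request completes, not at the end of the loop.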

This code snippet is the killer.
$.each(data, function(key, val) {
$.ajax({
If the length of data is 100, there will be 100 HTTP connections to your server, which will obviously choke it. Your browser will also slow down; it's like opening 100 tabs in Firefox in one shot.
Pass all the data in a single AJAX request. If the payload is huge, send it chunk by chunk: when you receive the response to the first chunk, send the request for the next one, but don't send them all simultaneously.
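A minimal sketch of that chunking idea, assuming the question's data is an array of URLs and assuming a hypothetical /fsbo/get_properties endpoint that accepts several URLs per request and returns the combined HTML:
function sendInChunks(urls, chunkSize) {
    if (urls.length === 0) {
        return; // nothing left to send
    }
    var chunk = urls.slice(0, chunkSize);
    $.ajax({
        type: "POST",
        url: base_url + "/fsbo/get_properties", // hypothetical endpoint accepting an array of URLs
        data: { urls: chunk },
        success: function (html) {
            $("<div></div>").html(html).appendTo("#props");
        },
        complete: function () {
            // only send the next chunk once this one has finished
            sendInChunks(urls.slice(chunkSize), chunkSize);
        }
    });
}

// sendInChunks(listOfUrls, 10);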

I do think you have the right idea: it is the volume of calls.
Each PHP call creates a separate request to your system and loads various libraries along with it, so the time and resources required grow rapidly with the number of simultaneous requests.
What I suggest is: pass all of your variables to PHP in one request, let it do the work, then receive a single JSON object back and parse it (see the sketch below).
It may be a bit slower for the end user, but it should stop this from happening.
P.S.
I've had similar issues where these kinds of calls made so many requests for one user that the whole webserver crashed.
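A minimal sketch of that single-call approach; the /fsbo/get_properties_batch endpoint and the {html: ...} shape of each item in the JSON response are assumptions for illustration, and listOfUrls stands for the array returned by the original $.getJSON call:
$.ajax({
    type: "POST",
    url: base_url + "/fsbo/get_properties_batch", // hypothetical batch endpoint
    data: { urls: listOfUrls },                   // everything in one request
    dataType: "json",
    success: function (properties) {
        // the PHP side does all the per-URL work and returns one JSON array
        $.each(properties, function (i, prop) {
            $("<div></div>").html(prop.html).appendTo("#props");
        });
        $("#status").hide(); // safe to hide the loader here: all data has arrived
    }
});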

Related

how to show Ajax request progress

I have this ajax request to update my db.
function ajax_submit(){
    var submit_val = $("#stato").serialize();
    dest = "plan/new_bp1.php";
    $.ajax({
        type: "POST",
        url: dest,
        data: submit_val,
        success: function(data){
            data1 = data.split("|");
            if(data1[0] == "Successo"){
                $("#spnmsg").fadeTo(200, 0.1, function(){
                    $(this).removeClass().addClass("spn_success").html(data1[1]).fadeTo(900, 1);
                });
            } else if(data1[0] == "Errore"){
                $("#spnmsg").fadeTo(200, 0.1, function(){
                    $(this).removeClass().addClass("spn_error").html(data1[1]).fadeTo(900, 1);
                });
            }
        },
        complete: function(){
            setTimeout(function(){ $('.container').load('plan/home.php'); }, 2000);
        }
    });
}
The called script takes a long time to run, since it has to select, process and insert around 4,000 records each time. What I do now is show a spinner on the screen to give users feedback that the request is working (triggered by ajaxStart and ajaxStop).
At the end of the call, the complete callback displays whatever the PHP script echoes.
What I'd like is a counter that updates during the script execution, so I can say something like "X records out of 4,000 processed".
On the script side I have no problem calculating the number of processed records and the total number of records.
How can I update my script to show the progress?
You have a couple of options.
1) Right when the request starts, you can start polling a different endpoint that just serves out the number of records that have been processed, like #adeneo suggested. You don't have to write to a file every time a record is processed; you can keep the count in memory, as long as the route handler for the progress requests has access to that same memory.
2) Implement a WebSocket endpoint on your server that pushes out the number of processed records. On the server you call the WebSocket library code to push out the progress; on the JavaScript side you open a WebSocket connection, which is trivial, and listen for those messages.
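A minimal client-side sketch of option 1, assuming a hypothetical plan/progress.php that returns JSON like {"processed": 1200, "total": 4000}; the #spnmsg element comes from the question:
var progressTimer = null;

function startProgressPolling() {
    progressTimer = setInterval(function () {
        $.getJSON("plan/progress.php", function (p) {
            // reuse the question's #spnmsg element for the counter text
            $("#spnmsg").text(p.processed + " records out of " + p.total + " processed");
            if (p.processed >= p.total) {
                clearInterval(progressTimer); // stop polling once everything is done
            }
        });
    }, 1000);
}

// call startProgressPolling() just before the main $.ajax in ajax_submit(),
// and clearInterval(progressTimer) in its complete callback as a fallback
One caveat: if the counter is kept in the PHP session, the long-running script has to release the session lock (for example with session_write_close()) before it starts the heavy work, otherwise the polling requests will queue up behind it, as discussed in the session-locking answers further down.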

php and ajax: show progress for long script

I have a PHP script which can take quite a lot of time (up to 3-5 minutes), so I would like to notify the user of how it is going.
I read this question and decided to use session for keeping information about work progress.
So, I have the following instructions in php:
public function longScript()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $generatingProgressSession->unsetAll();
    ....
    $generatingProgressSession->total = $productsNumber;
    ...
    $processedProducts = 0;
    foreach($models as $model){
        //Do some processing
        $processedProducts++;
        $generatingProgressSession->processed = $processedProducts;
    }
}
And I have a simple script for taking the data from the session (the number of total and processed items), which returns them in JSON format.
So, here is the JS code for calling the long script:
$.ajax({
    url: 'pathToLongScript',
    data: {fileId: fileId, format: 'json'},
    dataType: 'json',
    success: function(data){
        if(data.success){
            if(typeof successCallback == "function")
                successCallback(data);
        }
    }
});
//Start checking progress functionality
var checkingGenerationProgress = setInterval(function(){
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: {format: 'json'},
        success: function(data){
            console.log("Processed " + data.processed + " items of " + data.total);
            if(data.processed == data.total){
                clearInterval(checkingGenerationProgress);
            }
        }
    });
}, 10000);
So, the long script is called via AJAX. Then after 10 seconds the checking script is called a first time, after 20 seconds a second time, and so on.
The problem is that none of the requests to the checking script complete until the main long script is finished. So, what does that mean? That the long script consumes too many resources and the server cannot process any other request? Or do I have some wrong AJAX parameters?
See image:
-----------UPD
Here is a php function for checking status:
public function checkGenerationProgressAction()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $this->view->total = $generatingProgressSession->total;
    $this->view->processed = $generatingProgressSession->processed;
}
I'm using the ZF1 ActionContext helper here, so the result of this function is a JSON object like {'total': 'somevalue', 'processed': 'another value'}.
I'd
exec('nohup php ...');
the file and send it to the background. You can add points where the long-running script inserts a single value into the DB to record its progress. Then you can check every ten (or however many) seconds whether a new value has been added and inform the user. It might even be possible to inform the user while he is on another page within your project, depending on your environment.
Yes, it's possible that the long script hogs the entire server, so any other requests made during that time have to wait their turn. Also, I would recommend not running the check script every 10 seconds regardless of whether the previous check has finished; instead, let each check trigger the next one after it has completed.
Taking your image with the pending requests as an example: instead of having 3 checking requests running at the same time, you can chain them so that only one checking request is running at any one time.
You can do this by replacing your setInterval() with a setTimeout() and re-arming the setTimeout() after the AJAX check request has completed, as in the sketch below.
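A rough sketch of that change, reusing the question's checking endpoint and keeping the 10-second delay from the original code:
function checkProgress() {
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: { format: 'json' },
        dataType: 'json',
        success: function (data) {
            console.log("Processed " + data.processed + " items of " + data.total);
            if (data.processed == data.total) {
                return; // finished: do not schedule another check
            }
            setTimeout(checkProgress, 10000); // re-arm only after this check completed
        },
        error: function () {
            setTimeout(checkProgress, 10000); // keep checking even if one request fails
        }
    });
}

setTimeout(checkProgress, 10000); // first check 10 seconds after starting the long script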
Most likely, the subsequent calls are not completing due to session locking. When one PHP request has the session file open, no other PHP request can open that same file; it is read/write locked until the previous request releases it.
Either that, or your server or browser is limiting concurrent requests and is therefore waiting for this one to complete.
My solution would be to fork or break off the long-running script somehow. Perhaps a call to exec to run another script with the requisite parameters, or any other way you think would work. Move the long-running work into a separate process and return from the current request, notifying the user that execution has begun.
The second part would be to log the progress of the script somewhere: a database, Memcache, or a file would all work. Simply set a value in a pre-determined location that the follow-up calls can check.
Note that "pre-determined" should not be the same location for everyone. It should be a location that only the user's session and the worker know.
Can you paste the PHP of "pathToCheckingStatusFunction" here?
Also, I notice that the "pathToCheckingStatusFunction" AJAX call doesn't have dataType: "json". This could be causing a problem. Are you using $_POST['format'] anywhere?
I also recommend chaining the checks so that each one starts only after the previous check has completed. If you need help with that, I can post a solution.
Edit, adding a possible solution:
I'm not sure that using Zend_Session_Namespace is the right approach. I would recommend using session_start() and session_name() and reading the variables out of $_SESSION.
Example File 1:
session_name('test');
session_start();
$_SESSION['percent'] = 0;
...stuff...
$_SESSION['percent'] = 90;
Example File 2(get percent):
session_name('test');
session_start();
echo $_SESSION['percent'];

Is it possible to 'echo' a sequence of responses from an ajax call

I'm learning and experimenting with jquery/ajax as I develop a website. I have a page that updates a database. I would like a 'sequence' of responses to display on the screen when the user submits their data (I've seen this done on many other sites).
For instance... user submits form and the page displays:
Received Input
Checking Database - record number xy
Updating Database
Retrieving Information
etc etc
This is just an example but you get the idea.
I have an ajax call that is initiated on 'click' of the submit button (getFormData just serialises all the form data for me and works fine):
var toSend = getFormData($upgrade_db_form);
var formMessageBox = $("#displayResults");
$.ajax({
    url: ajaxurl,
    data: {
        action: "database_action",
        formData: toSend
    },
    type: 'POST',
    dataType: 'TEXT',
    beforeSend: function() {
        //$form.fadeOut('slow');
        formMessageBox.html("starting it up");
    },
    success: function (data) {
        formMessageBox.empty();
        formMessageBox.html(data);
    },
    error: function (xhr) {
        // formMessageBox.html("Oops error!");
    }
});
I have a function which gets called by ajax:
function upgrade_database_function() {
echo "testing ";
for($i = 0; $i < 99; $i++) {
echo "percent complete ".$i."%";
}
echo "done ";
die(); // this is required to return a proper result
}
I'm using WordPress and the AJAX side of things works fine; it passes the form data correctly etc. It's just that I get one long output at the end, as though all the echoes are buffered up instead of being output in sequence.
I've gone through the jQuery AJAX documentation and couldn't find how to make it behave the way I want. I can live with it the way it is, but I think it would look a lot better if I could get it working the way I would like.
Can this be done this way, or do I need lots of sequential AJAX calls to make it work?
I don't know PHP, but I'm guessing that echo just writes to the response buffer, so when all the echoes are done the whole response is returned at once, which would give the effect you are seeing. You would need a polling system or something along those lines to get the latest statuses from the server and display them, I would think. Maybe there is some mechanism in PHP that allows this, but as I said, I don't know PHP.
An Example of Long Polling can be found in this article.
http://www.abrandao.com/2013/05/11/php-http-long-poll-server-push/
WARNING: You may have to manage the locking of the session in PHP manually so that your long-running call doesn't block your polling AJAX calls. See here:
http://konrness.com/php5/how-to-prevent-blocking-php-requests/
Note that you would likely want to:
create one AJAX call that starts the execution of the code that will take a while. You could, for example, push the messages it generates into a session variable as a list of some sort. You would need to lock/unlock the session as mentioned to prevent the AJAX polling calls from being suspended.
create a polling method like the one in the article that checks the session every 500ms or so to see whether there are any new messages, locks the session, removes those messages, returns them to the client side and displays them (a rough sketch follows below).
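A rough client-side sketch of that second point, assuming a hypothetical messages.php that returns a JSON array of status strings added since the last check (and assuming the long-running script releases the session lock as warned above); #displayResults is the element from the question:
var workFinished = false; // set this to true in the main request's success handler

function pollMessages() {
    $.ajax({
        url: "messages.php", // hypothetical endpoint that drains new status messages
        dataType: "json",
        success: function (messages) {
            $.each(messages, function (i, msg) {
                $("#displayResults").append("<div>" + msg + "</div>");
            });
        },
        complete: function () {
            if (!workFinished) {
                setTimeout(pollMessages, 500); // check again in 500ms
            }
        }
    });
}

// start polling right after firing the main "database_action" request
pollMessages();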
WARNING: Again, I'm not a PHP person; I may have done this once in my life in PHP (I can't remember exactly), so I may be wrong and this may not work, but from what I've seen it looks achievable. Hope this gets you on your way!

jquery $.ajax request remains pending

I have made a simple chat application which uses a long-polling approach with jQuery:
function sendchat(){
    // this code sends the message
    $.ajax({
        url: "send.php",
        async: true,
        data: { /* send inputbox1.value */ },
        success: function(data) { }
    });
}
function listen_for_message(){
    // this code listens for messages
    $.ajax({
        url: "listen.php",
        async: true,
        timeout: 5000,
        success: function(data) {
            // a message was received, so display it and make a new request to listen for further messages
            $('#display').html(data);
            listen_for_message();
        }
    });
}
THIS SHOULD HAPPEN: after the page loads, requests to listen.php are made indefinitely, and when the user sends a message the code stores it in the database via send.php.
PROBLEM: using Firebug I've found that the send.php request, which is made after the listen.php request, remains pending. In other words, the request to send the message never completes.
The issue was session locking: both send.php and listen.php use session variables, so the session is locked by listen.php, and the other file (here send.php) can't be served until the session is freed from serving the first one (here listen.php).
How do I implement basic "Long Polling"?
The link above is a similar question that may help you.
It does not have to be in a database; it can be saved in a tmp file. Your real problem, though, is that you are choking the browser by making too many requests: a browser only allows a limited number of concurrent requests to the same host (historically two), which means you should let the browser finish the first request before starting the next one, and so on.
You do not need separate send.php and listen.php files, because you can do both jobs in a single page.
function check(){
    $.ajax({
        url: 'process.php',
        data: {msg: 'blabla' /* add data here to post e.g. inputbox1.value or serialised data */},
        type: 'post',
        dataType: 'json', // the response is JSON, so have jQuery parse it
        success: function (r){
            if(r.message){
                $('#result').append(r.message);
                check(); // can use a setTimeout here if you wish
            }
        }
    });
}
process.php
<?php
$msg = $_POST['msg'];//is blabla in this case.
$arg['message'] = $msg;//or grab from db or file
//obviously you will have to put it on a database or on a file ... your choice
//so you can differentiate who sent what to whom.
echo json_encode($arg);
?>
Obviously these are only guidelines, and you can exhaust your bandwidth with this method, but it is a lot better because you have a single small endpoint that returns almost nothing when there is no message and a little more when a message has been posted.
I have not tested this, so don't rely on it working straight away; you will need a few changes to make it work, but it should help you understand how to approach it.
However, if you are looking for long-polling AJAX, there are plenty of scripts out there that have already been fine-tuned, tested and bug-fixed by many minds. My advice is: don't re-invent the wheel.

Endless loop in jQuery and PHP. What should I change to make it work?

<?
if($_POST['begin'])
{
    while(1)
    {
        echo "1";
        sleep(2);
    }
    die();
}
?>
<span class="answer"></span>
<script type="text/javascript">
$(document).ready(function() {
    $.ajax({
        type: "POST",
        url: "exp.php",
        data: "begin=1",
        success: function(msg){
            $(".answer").html(msg);
        }
    });
});
</script>
It, of course, doesn't work. What should I change to make it work? Can I avoid using setInterval, setTimeout or other functions in javascript?
By the way, what I am trying to do here is to write number 1 every two seconds.
I've never tried it, but the XMLHttpRequest interface supposedly supports streamed responses. In particular there is readyState == 3, which denotes partial results (see the spec: http://www.w3.org/TR/XMLHttpRequest/#event-handler-attributes).
If you don't want to set an interval handler, you will have to override the actual XHR callback, because jQuery's success: only fires on completion.
xmlHttp = $.ajax({ ... });
xmlHttp.onreadystatechange = function () {
    if (xmlHttp.readyState >= 3) {
        alert(xmlHttp.responseText);
    }
};
Note that responseText always contains the accumulated data, so you have to split it up on \n or something similar if you only want to read the latest 1.
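A self-contained sketch of that idea using a raw XMLHttpRequest, assuming the PHP side prints one value per line and calls flush() after each echo (without that, the browser may still only see the output at the end):
var xhr = new XMLHttpRequest();
var consumed = 0; // how many characters of the response we have already processed

xhr.open("POST", "exp.php", true);
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.onreadystatechange = function () {
    if (xhr.readyState >= 3 && xhr.responseText.length > consumed) {
        var fresh = xhr.responseText.substring(consumed); // only the newly arrived part
        consumed = xhr.responseText.length;
        fresh.split("\n").forEach(function (line) {
            if (line !== "") {
                $(".answer").append(line + " "); // show each new value as it arrives
            }
        });
    }
};
xhr.send("begin=1");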
Ok, I think I know what you want. It's weird .. but fine.
<script type="text/javascript">
$(document).ready(function() {
    function fetch_a_one() {
        $.ajax({
            type: "POST",
            url: "exp.php",
            data: "begin=1",
            success: function(msg){
                $(".answer").html(msg);
                fetch_a_one();
            }
        });
    }
    fetch_a_one();
});
</script>
<div class="answer"></div>
PHP script:
<?php
sleep(2);  // wait two seconds
exit('1'); // print 1 (exit() only prints string arguments; exit(1) would just set the exit status)
?>
Minus some delay from starting the Ajax request and lag from the server, this should print '1' every 2 seconds .. no idea why you want this.
This is the wrong approach, because your success function isn't going to run until it receives a response from the server, which will never happen while the script is stuck in an endless loop.
You will need to handle the timing in JavaScript with, as you say, setInterval. Of course, if all you're trying to do is print the number 1 every couple of seconds, you don't need to make any calls to the server side at all (although I presume there's something more you're eventually trying to achieve; you might want to expand on that a little).
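For completeness, a minimal purely client-side version of that (no server round-trip at all) would be something like:
// append a "1" to the answer element every two seconds, entirely in the browser
setInterval(function () {
    $(".answer").append("1 ");
}, 2000);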
You could look into opening a WebSocket back to the server to handle this kind of ongoing communication. Look into something like PusherApp - http://pusher.com/
You're probably going to have to use JavaScript for this. You can continually poll the server resource to get information from it, but the actual looping and delaying will need to be in JavaScript.
The reason for this is that the PHP script needs to finish processing at some point. It's not streaming output to the client; it's building output to send to the client. In the code you provided, it builds output forever and never sends it.
You can try to flush your output buffer from the PHP script to send some output to the client while still generating more, but take a look at the caveats in that link. It's not really a clean way to accomplish this and will probably cause more problems than it solves in this case. At some point the server resource needs to stop processing and commit to a response; trying to short-circuit that basic concept of HTTP will likely be a bit of a hack.
I think the problem is that you're writing an endless PHP loop.
When jQuery starts the AJAX request, it waits for the script to finish its job. But the script never ends, so the browser never gets the full answer.
You need to use setTimeout; there is no other way, at least no other easy and safe way.
