Never-ending ajax request, good idea / bad idea? - php

For the backend of my site, visible only to a few people, I have a system whereby I communicate with a PHP script via ajax like so:
function ajax(url, opts) {
    // Fall back to a default onProgress handler if the caller didn't supply one.
    opts = $.extend({ onProgress: function (data) { console.log(data); } }, opts);
    var progress = false, all_responses = [], previousResponseLength = 0;
    var ajaxOptions = {
        dataType: "json",
        type: "POST",
        url: url,
        xhrFields: {
            onprogress: function (e) {
                // A long JSON message may arrive over several onprogress events;
                // only act once the buffered text ends on a newline.
                if (!e.target.responseText.endsWith("\n")) return;
                var response = e.target.responseText.substring(previousResponseLength).trim();
                previousResponseLength = e.target.responseText.length;
                var responses = response.split(/[\r\n]+/g);
                var last_response;
                for (var k = 0; k < responses.length; k++) {
                    if (responses[k] === "---START PROGRESS---") {
                        progress = true;
                        if (opts.onProgressInit) opts.onProgressInit();
                    } else if (responses[k] === "---END PROGRESS---") {
                        progress = false;
                    } else {
                        all_responses.push(last_response = responses[k]);
                    }
                }
                if (progress && last_response !== undefined) opts.onProgress(JSON.parse(last_response));
            }
        },
        // Hand jQuery only the final message, so jqXHR.done() receives the
        // last JSON object instead of the whole accumulated stream.
        dataFilter: function () {
            return all_responses[all_responses.length - 1];
        }
    };
    return $.ajax(ajaxOptions);
}
And an example of a never-ending PHP script (it runs until the user closes the connection):
const AJAX_START_PROGRESS = "---START PROGRESS---";
const AJAX_END_PROGRESS = "---END PROGRESS---";
session_write_close(); // release the session lock so the rest of the PHP environment isn't stalled while this script runs
set_time_limit(0); // allow the script to run indefinitely
output(AJAX_START_PROGRESS);
while (true) {
    output(json_encode(["asdasd" => "asasdas"]));
    sleep(1);
}
function output($msg) {
    // one message per line: strip embedded newlines so the client can split on them
    echo preg_replace("`[\r\n]+`", "", $msg).PHP_EOL;
    if (ob_get_level() > 0) ob_flush();
    flush();
}
This lets me 'poll' (am I using that term correctly?) through a single ajax request.
So if I want to execute a very long PHP script, I can now check its progress, and the last response is delivered via jqXHR.done(callback).
Or, as in the example PHP script, I can open a connection and leave it open; using sleep(1), it pushes an update to the $.ajax object every second.
Every response has to be JSON-encoded, and if a response is one very long JSON string that arrives over multiple onprogress calls, the client waits until the end of the message (once responseText.endsWith("\n"), we're ready).
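The framing rule described here (buffer until the stream ends on a newline, then split into one message per line) can be sketched on its own, without jQuery; createProgressParser is a hypothetical name, not part of the code above:

```javascript
// Minimal parser for a newline-delimited JSON progress stream.
// feed() receives the full responseText seen so far (as XHR onprogress does)
// and returns any complete messages that arrived since the last call.
function createProgressParser() {
  var previousLength = 0;
  return {
    feed: function (responseText) {
      // Wait until the stream ends on a newline: a long JSON message
      // may be split across several onprogress events.
      if (!responseText.endsWith("\n")) return [];
      var chunk = responseText.substring(previousLength).trim();
      previousLength = responseText.length;
      return chunk.length ? chunk.split(/[\r\n]+/) : [];
    }
  };
}

// Example: two onprogress events, the second completing a message.
var parser = createProgressParser();
console.log(parser.feed('{"step":1}\n{"ste'));        // [] - message still incomplete
console.log(parser.feed('{"step":1}\n{"step":2}\n')); // both messages, ready to JSON.parse
```

The same previousResponseLength trick as in the ajax() helper keeps already-delivered messages from being parsed twice.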
My remote shared server doesn't allow WebSockets, so I made this. If the user closes the connection, so does the PHP script.
It only has to work for a few admins with special privileges, and I don't need to worry about old browsers.
Can anyone see anything wrong with this approach? Googling hasn't turned up anyone else using this kind of method, so I expect something is wrong with it.
Extensive testing tells me it works just fine.

You've invented long polling; it's actually widely used as a fallback for WebSockets, so there's nothing wrong with the idea.
As for your code, it's hard to say without testing, but with techniques like long polling you need to double-check for memory leaks on both the browser side and the server side.
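One concrete browser-side leak in the question's code: all_responses grows for as long as the connection stays open. A sketch of a bounded buffer (createResponseBuffer and the cap value are illustrative, not from the original):

```javascript
// Ring-buffer-style store that only keeps the most recent messages,
// so memory stays flat no matter how long the connection lives.
function createResponseBuffer(maxSize) {
  var items = [];
  return {
    push: function (msg) {
      items.push(msg);
      if (items.length > maxSize) items.shift(); // drop the oldest entry
    },
    last: function () { return items[items.length - 1]; },
    size: function () { return items.length; }
  };
}

var buf = createResponseBuffer(100);
for (var i = 0; i < 250; i++) buf.push('{"tick":' + i + '}');
console.log(buf.size()); // 100 - never exceeds the cap
console.log(buf.last()); // the latest message, {"tick":249}
```

Since the dataFilter only ever reads the last element, capping the array doesn't change the visible behaviour.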

Related

Running out of HTTP requests with an Ajax function on my server

I don't know much about servers and HTTP request limits. In my case I have Linux Deluxe hosting on GoDaddy, which runs pretty smoothly, but now I need to understand why my website becomes unreachable (ERR_CONNECTION_CLOSED) after launching the following code 2-3 times (the first time it goes fine; the issue appears when I refresh the page another 1-2 times):
My ajax call:
function XSGetPointer(id, tableName) {
    var pointer;
    var ok = false;
    $.ajax({
        url: TABLES_PATH + 'm-query.php?',
        type: 'POST',
        data: 'tableName=' + tableName,
        async: false,
        success: function(data) {
            var results = JSON.parse(data);
            for (var i = 0; i < results.length; i++) {
                if (results[i]['ID_id'] == id) {
                    pointer = results[i];
                    ok = true;
                }
                if (i == results.length - 1 && !ok) {
                    pointer = null;
                }
            }
        },
        // error
        error: function(e) {
            console.log('XSCurrentUser -> Something went wrong: ' + e.message);
        }
    });
    return pointer;
}
The JS for loop (in my PHP page) where that function gets called:
for (var i = 0; i < objectsArray.length; i++) {
    var userPointer = XSGetPointer(objectsArray[i]['PO_userPointer_Users'], 'Users');
    $('#queryData').append(
        '<p>'
        + userPointer['ST_username'] +
        '<br></p>'
    );
} // ./ For • Show results
The for loop above iterates through 57 items (the objectsArray's length). The m-query.php script simply gets all data from a JSON file, like this:
// Get JSON data
$data = file_get_contents($tableName. '.json');
$data_array = json_decode($data, true);
echo json_encode(array_values($data_array), JSON_PRETTY_PRINT | JSON_UNESCAPED_UNICODE | JSON_UNESCAPED_SLASHES);
GoDaddy support told me to get a VPS, but I couldn't understand the real cause of my issue; I suppose it's something related to the many HTTP requests I send from the same page, as shown here:
As you can see, my ajax calls stop after a few requests; they never get to 57, and the Chrome console shows:
jquery-3.4.1.min.js:2 POST https://example.com/m-query.php? net::ERR_CONNECTION_CLOSED
I just wanted to know whether switching to a VPS might fix this issue, or whether my ajax query is just totally wrong, since it's also async: false (I can't make it true, because then it doesn't get the JSON data).
I solved the issue by simply moving my PHP files to an AWS Lightsail instance.
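For what it's worth, since m-query.php already returns the whole table on every call, a single request plus a local index would replace the 57 synchronous round trips. A sketch, where buildIndex is a hypothetical helper and the field names are taken from the question:

```javascript
// Build an id -> row lookup table once, then resolve every pointer locally.
function buildIndex(rows, keyField) {
  var index = {};
  for (var i = 0; i < rows.length; i++) index[rows[i][keyField]] = rows[i];
  return index;
}

// Simulated response from a single m-query.php call:
var users = [
  { ID_id: "u1", ST_username: "alice" },
  { ID_id: "u2", ST_username: "bob" }
];
var byId = buildIndex(users, "ID_id");

// 57 lookups now cost nothing extra on the network:
console.log(byId["u2"].ST_username); // "bob"
console.log(byId["u9"]);             // undefined - no such user
```

This also removes the async: false problem entirely, since only one request is ever in flight.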

jQuery: Can I send and receive an $.ajax response while a longer $.ajax request is pending?

Situation:
I'm using an $.ajax() POST to send a request to a PHP script that inserts about 400,000-500,000 rows into a db. This consistently takes about 3.5-4 minutes. (During this time, the request is PENDING.)
Problem:
I need some way to show progress on the page (such as a %). I tried using an $.ajax() call in a setInterval that checks every 5 seconds or so, but the checks seem to build up and all come through only once the first (longer) $.ajax() is finished.
Question:
Isn't $.ajax() async by default? Shouldn't this mean requests can be sent out in any order and at any time, and responses should be received in any order and at any time?? Does this even have anything to do with async? Is there a way to periodically send back 'semi-responses' from one request? Or can't I send and receive requests/responses while there is a pending request/response? (see awesome drawing below)
Thanks in advance!!!
multiple requests http://kshaneb.com/reqres.png
Your long-running ajax call probably opens a session on the server, so all subsequent requests are blocked by a session file lock.
Problem:
PHP writes its session data to a file by default. When a request is made to a PHP script that starts the session (session_start()), this session file is locked. What this means is that if your web page makes numerous requests to PHP scripts, for instance, for loading content via Ajax, each request could be locking the session and preventing the other requests from completing.
The other requests will hang on session_start() until the session file is unlocked. This is especially bad if one of your Ajax requests is relatively long-running.
Solution:
The session file remains locked until the script completes or the session is manually closed. To prevent multiple PHP requests (that need $_SESSION data) from blocking, you can start the session and then close the session. This will unlock the session file and allow the remaining requests to continue running, even before the initial request has completed.
More info here:
http://konrness.com/php5/how-to-prevent-blocking-php-requests/
Hope this helps
$(function () {
    var statusElement = $("#status");
    // this function will run every 1000 ms until stopped with clearInterval()
    var i = setInterval(function () {
        $.ajax({
            url: "progress.php", // your progress endpoint (not in the original snippet)
            dataType: "json",
            success: function (json) {
                // progress from 1-100
                statusElement.text(json.progress + "%");
                // when the worker process is done (reached 100%), stop polling
                if (json.progress == 100) clearInterval(i);
            },
            error: function () {
                // on error, stop polling
                clearInterval(i);
            }
        });
    }, 1000);
});
Instead of doing an ajax post, you can post to an iframe and have PHP generate incremental output, sending it with the flush command.
// send a hash mark for every 1000 inserts
$a = 0;
while ($rec = getDataForNextInsert()) {
    $a++;
    // do insert
    if ($a % 1000 == 0) { echo '#'; flush(); }
}
It would also then be possible to poll the contents of the iframe to provide a pretty display for the end user.
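Polling the iframe then reduces to counting hash marks; a sketch, using the 1000-inserts-per-hash ratio from the snippet above (insertsDone is an illustrative name):

```javascript
// Each '#' streamed into the iframe stands for a fixed number of inserts.
function insertsDone(iframeText, insertsPerHash) {
  var hashes = (iframeText.match(/#/g) || []).length;
  return hashes * insertsPerHash;
}

console.log(insertsDone("###", 1000)); // 3000 rows inserted so far
console.log(insertsDone("", 1000));    // 0 - nothing streamed yet
```

Dividing by the expected total then gives the percentage for the display.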
I hope this will be useful for you.
First of all, you need to disable the output buffer in your PHP script:
@apache_setenv('no-gzip', 1);
@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);
for ($i = 0; $i < ob_get_level(); $i++) { ob_end_flush(); }
ob_implicit_flush(1);
Then you need to echo your progress from PHP during the process, something like this:
for ($i = 0; $i < 20; $i++) {
    echo ($i > 0 ? "#" : "").($i/20*100);
    sleep(1);
}
Then, in JavaScript, you need to listen for the xhr readystatechange event; when it fires, parse the response text and show the progress however you want.
listening for event:
$.ajaxPrefilter(function( options, _, jqXHR ) {
    if ( options.onreadystatechange ) {
        var xhrFactory = options.xhr;
        options.xhr = function() {
            var xhr = xhrFactory.apply( this, arguments );
            function handler() {
                options.onreadystatechange( xhr, jqXHR );
            }
            if ( xhr.addEventListener ) {
                xhr.addEventListener( "readystatechange", handler, false );
            } else {
                setTimeout( function() {
                    var internal = xhr.onreadystatechange;
                    if ( internal ) {
                        xhr.onreadystatechange = function() {
                            handler();
                            internal.apply( this, arguments );
                        };
                    }
                }, 0 );
            }
            return xhr;
        };
    }
});
and a sample ajax request:
$.ajax({
    url: "test.php",
    cache: false,
    onreadystatechange: function( xhr ) {
        var res = xhr.responseText.split("#");
        $("#id").html(res[res.length-1] + "% done<br/>");
    }
}).done(function( data ) {
    $("#id").append("all done!<br/>");
});
tested with jQuery 1.5+

Why does my server freeze when using long polling with Ajax?

When I make my query using Ajax in jQuery, I try to use the long-poll method, but my server shuts down or doesn't respond; my website becomes very slow, as if on standby, or freezes entirely. What can I do?
My PHP:
session_start();
$chat = new chat;
$class = new tools;
$idenvia = isset($_GET['idenvia']) ? $_GET['idenvia'] : '';
$idreceptor = isset($_GET['idreceptor']) ? $_GET['idreceptor'] : '';
$cantidad = isset($_GET['cantidad']) ? $_GET['cantidad'] : '';
$control = $_GET['control'];
//$ultima_modif = isset($_SESSION['fecha']) ? $_SESSION['fecha'] : 0;
if ($control == 1) {
    echo json_encode($chat->leer_chat($idreceptor, $idenvia, $control, $cantidad, NULL));
}
if ($control == 2) {
    $dir = 'log/log_'.$_SESSION['id'].'.txt';
    $ultima_modif = filemtime($dir);
    $modifica_actual = isset($_GET['tiempo']) ? $_GET['tiempo'] : 0; //0
    set_time_limit(0);
    while ($ultima_modif <= $modifica_actual) {
        clearstatcache();
        $ultima_modif = filemtime($dir);
        sleep(1);
        //echo '{"0":{"activo":2}}';
        //flush();
    }
    $res = $chat->leer_chat($_SESSION['id'], NULL, $control, $ultima_modif);
    echo json_encode($res);
    //unlink($dir);
    flush();
}
This is my jQuery code, where I take the response from my PHP with Ajax:
function leer_chat_interval() {
    $.ajax({
        url: enlace,
        type: 'GET',
        async: true,
        data: {'control': 2, 'tiempo': tiempo},
        success: function(dato) {
            eval('var json=' + dato);
            if (json[0].activo == 1) {
                //if (json.length != 0) {
                leer_chat(json[0].idenvia, json[0].idrecibe, 1, json[0].nombre, json[0].mifoto, 1, 1);
                $('#msg_chat' + json[0].idrecibe).attr('name', "{'recibe':'" + json[0].idenvia + "','envia':'" + json[0].idrecibe + "','foto':'" + json[0].mifoto + "'}");
                setTimeout(function() {
                    $('#header_chat' + json[0].idenvia).css('background-color', '#09C')
                    setTimeout(function() {
                        $('#header_chat' + json[0].idenvia).css('background-color', '#F90')
                    }, 1000)
                }, 1000);
                tiempo = json[0].tiempo;
                noerror = true;
            } else { noerror = false; }
        },
        datatype: "json",
        complete: function(dato) {
            if (!noerror) {
                setTimeout(function() {
                    leer_chat_interval()
                }, 5000)
            } else {
                leer_chat_interval();
                noerror = false;
            }
        },
        timeout: 30000
    });
}
Your problem is this:
while(1)
You are not supposed to have a script loop indefinitely to handle your ajax "long polling"; instead, each ajax call runs through a finite request, collects the result, and repeats. What is happening in your example is that every time your ajax request fires, a new indefinitely running script is started; naturally the server collapses after accumulating several of these.
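The usual fix is to bound each request: wait for new data for at most N checks, then return nothing so the client simply reconnects. A sketch of that loop in JavaScript (the PHP version would sleep(1) between checks; boundedPoll and its parameters are illustrative, not from the answer):

```javascript
// One bounded long-poll pass: ask check() up to maxAttempts times,
// returning as soon as it yields data, otherwise giving up so the
// client can issue a fresh request.
function boundedPoll(check, maxAttempts) {
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    var data = check(attempt);
    if (data !== null) return { data: data, attempts: attempt };
    // a real server would sleep(1) here between checks
  }
  return { data: null, attempts: maxAttempts };
}

// Data shows up on the 3rd check:
var result = boundedPoll(function (n) { return n === 3 ? "new message" : null; }, 30);
console.log(result); // { data: "new message", attempts: 3 }

// No data within the window: the client should simply retry.
console.log(boundedPoll(function () { return null; }, 5)); // { data: null, attempts: 5 }
```

With a cap like 30 checks, each request ends within about 30 seconds instead of piling up forever.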

Cross Domain AJAX (getJSON) with long polling?

I was wondering if it's possible to long-poll using $.getJSON, and what the proper front-end and back-end logic would be.
I've come up with this so far, but haven't tested it yet since I'm pretty sure the logic is wrong and/or incomplete.
Here is the JS:
function lpOnComplete(data) {
    console.log(data);
    if (!data.success) {
        lpStart();
    }
    else {
        alert("Works!");
    }
}
function lpStart() {
    $.getJSON("http://path.to.my.URL.php?jsoncall=?", function(data) {
        // What happens when no data is returned?
        // This is more than likely since there
        // is no fallback in the PHP.
        lpOnComplete(data);
    });
}
PHP:
$time = time();
while ((time() - $time) < 30) {
    // only returns data when it's new.
    $data = checkCode();
    // What would be the proper way to break out
    // and send back $data['success'] = false
    // so the JS loop can continue?
    if (!empty($data)) {
        echo $_GET["jsoncall"] . "(" . json_encode($data) . ")";
        break;
    }
    usleep(25000);
}
From what you've got there, the JavaScript is going to make multiple requests to the server, and each one is going to spin up that infinite loop and never go anywhere. I'd suggest something like the following JS:
$.getJSON("http://my.site/startAsyncWork.php", null, function(data) {
    waitUntilServerDone(data.token, function(response) {
        alert("done");
    });
});
function waitUntilServerDone(token, doneCallback) {
    $.getJSON("http://my.site/checkIfWorkIsDone.php", {"token": token}, function(response) {
        if (response.isDone) {
            doneCallback(response);
        }
        else {
            setTimeout(function() {
                waitUntilServerDone(token, doneCallback);
            }, 1000);
        }
    });
}
I don't know PHP, so I'm not going to write sample code for that side, but basically startAsyncWork.php makes up a random token associated with the request. Then it spawns a thread that does all the work needed, and returns the token in the response.
When the worker thread is done, it writes the results of the work out to a file like token.dat (or puts it in a cache or whatever).
checkIfWorkIsDone.php checks for the existence of token.dat, and returns false if it doesn't exist, or returns the contents if it does.
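The contract of those two scripts can be mocked in a few lines; here an in-memory Map stands in for the token.dat file, and finishWork for the worker thread (all names are stand-ins for the hypothetical PHP):

```javascript
// In-memory stand-in for the token-based work protocol.
var jobs = new Map();
var nextToken = 0;

// stand-in for startAsyncWork.php: register a job, hand back its token
function startAsyncWork() {
  var token = "job-" + (nextToken++);
  jobs.set(token, { isDone: false, result: null });
  return { token: token };
}

// called by the worker when it finishes (the "write token.dat" step)
function finishWork(token, result) {
  jobs.set(token, { isDone: true, result: result });
}

// stand-in for checkIfWorkIsDone.php: report status for a token
function checkIfWorkIsDone(token) {
  var job = jobs.get(token);
  return job && job.isDone ? { isDone: true, result: job.result } : { isDone: false };
}

var t = startAsyncWork().token;
console.log(checkIfWorkIsDone(t)); // { isDone: false } - still pending
finishWork(t, "500000 rows inserted");
console.log(checkIfWorkIsDone(t)); // { isDone: true, result: "500000 rows inserted" }
```

The client's waitUntilServerDone() loop above is exactly a repeated checkIfWorkIsDone() call until isDone comes back true.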

Reverse Ajax implementation using php

I am looking to implement reverse ajax in my application, which uses PHP and jQuery. I have googled a bit and found XAJA, but that seems to be a paid product. Is there an open-source option available, or has someone implemented this?
Some pointers or hints would be very helpful.
Thanks in advance.
I know of two types of reverse AJAX:
1- Polling
2- Pushing
I think polling is rather easier to implement: you just have your JavaScript make a regular request to the server at a fixed interval, and when the server has some data, it responds. It's like a ping (some call it a heartbeat), but it's the most obvious solution to this problem. However, it can easily overload the server.
EDIT: Simple polling example code:
Server-Side:
<?php
// pong.php - PHP isn't my main thing, but I tried my best!
$obj = new WhatsNew();
$out = "";
if ($obj->getGotNew()) {
    $types = array();
    foreach ($obj->newStuff() as $type) {
        $new = array('type' => $type);
        $types[] = $new;
    }
    $out = json_encode($types);
} else {
    $out = json_encode(array('nothingNew' => true));
}
echo $out;
Client-Side:
function ping() {
    $.ajax({
        url: "pong.php",
        success: function (data) {
            data = JSON.parse(data);
            if (data['nothingNew'])
                return;
            for (var i in data) {
                var type = data[i]['type'];
                if (type && incomingDataHandlers[type]) {
                    incomingDataHandlers[type]();
                }
            }
        }
    });
}

incomingDataHandlers = {
    comments: function () {
        $.ajax({
            url: "getComments.php",
            method: "GET",
            data: getNewCommentRequestData(), // pass data to the server
            success: function (data) {
                //do something with your new comments
            }
        });
    },
    message: function () {
        $.ajax({
            url: "getMessages.php",
            method: "GET",
            data: getNewMessageRequestData(), // pass data to the server
            success: function (data) {
                //do something with your new messages
            }
        });
    }
};

$(document).ready(function () {
    setInterval(ping, 1000);
});
You are looking for what they call "long poll". I searched for "long poll php" and got this thread on Stack Overflow:
How do I implement basic "Long Polling"?
You could use WebSockets in conjunction with Flash-based WebSockets, because almost all browsers have Flash on board (around 96%? => http://www.statowl.com/flash.php) => https://github.com/gimite/web-socket-js. You could use this together with http://code.google.com/p/phpwebsocket/. Still, I wonder whether the performance would be any good. If at all possible, I would use node.js to do reverse ajax; http://socket.io is a really cool project for this!
Have you checked out APE?
It's a push-based, real-time data streaming technology over a single low-volume ajax connection. The concept is useful; you may be able to replicate the same thing with your own server-side implementation.
