I am working on a progress bar which updates progress using AJAX requests and session variables. When my program performs a time-consuming operation, such as sending many emails, it simply sets the proper session variable (which contains the progress value). This operation is started by the function post() in the code below.
In the meantime, the second function ask() is called in a loop every 500 ms. It should return the current progress in real time. And here is the problem: every request sent by ask() waits until the request sent by post() has finished. The funny part is that if I set some URL like google.com instead of /url/to/progress, it works just fine, except that it is not what I want :). Which means that the problem is on the server side.
Not sure if it's important, but I use the Yii framework.
All the code below is only a scratch version (but working), and its only purpose is to show what I mean.
Thanks in advance.
Sorry for my poor English :)
View part:
<script type="text/javascript">
    function ask() {
        var d = new Date();
        var time = d.getTime();
        $.ajax({
            type: 'get',
            url: '/url/to/progress' + '?time=' + time,
            success: function(data) {
                $("#progress").html(data);
            }
        });
    }

    function post() {
        var d = new Date();
        var time = d.getTime();
        $.ajax({
            type: 'post',
            url: '/url/to/post' + '?time=' + time,
            data: {"some": "data"},
            success: function(data) { alert(data); }
        });
    }

    $("#test").click(
        function() {
            post();
            var progress = setInterval("ask();", 500);
        }
    );
</script>
Controller part:
public function actionPost($time) {
    sleep(5); // time-consuming operation
    echo $time . ' : ' . microtime();
    exit;
}

public function actionProgress($time) {
    echo $time . ' : ' . microtime();
    exit;
}
I think your problem here is session-related.
When a script has an open session, it has a lock on the session file. This means that any subsequent requests which use the same session ID will be queued until the first script has released its lock on the session file. You can force this with session_write_close() - but this won't really help you here, as you are trying to share the progress info via the session, so the post script would need to keep the session data open and writable.
You will need to come up with another way of sharing data between the post and progress scripts - if post has the session data open throughout its execution, progress will never be able to access the session data until after post has finished executing. Maybe you could use the session ID to create a temporary file which post has write access to, in which you put the progress indicator data. The progress script can check the file and return that data. There are many options for IPC (inter-process communication) - this is not a particularly beautiful one, but it does have the advantage of maximum portability.
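Here is a rough sketch of that temp-file approach; the file location, the loop and the percentage maths are placeholders for illustration, not code from the question:

<?php
// In the long-running "post" action:
session_start();
$progressFile = sys_get_temp_dir() . '/progress_' . session_id();
session_write_close(); // release the session lock before the slow work

$total = 100; // e.g. number of emails to send
for ($i = 1; $i <= $total; $i++) {
    // ... send one email / process one item ...
    file_put_contents($progressFile, (string) round($i / $total * 100));
}

// In the "progress" action:
session_start();
$progressFile = sys_get_temp_dir() . '/progress_' . session_id();
session_write_close();

echo file_exists($progressFile) ? file_get_contents($progressFile) : '0';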
As a side note - please don't pass strings to setInterval(), pass functions. So your line should actually read:
var progress = setInterval(ask, 500);
But - it would be better to use setTimeout() in the success/error handlers of the ask() ajax function. This is because by using setInterval(), a new request will be initiated regardless of the state of the previous one. It would be more efficient to wait until the previous request has finished before initiating the next one. So I would do something more like this:
<script type="text/javascript">
    // We'll set this to true when the initial POST request is complete, so we
    // can easily know when to stop polling the server for progress updates
    var postComplete = false;

    var ask = function() {
        var time = new Date().getTime();
        $.ajax({
            type: 'get',
            url: '/url/to/progress' + '?time=' + time,
            success: function(data) {
                $("#progress").html(data);
                if (!postComplete) setTimeout(ask, 500);
            },
            error: function() {
                // We need an error handler as well, to ensure another attempt gets scheduled
                if (!postComplete) setTimeout(ask, 500);
            }
        });
    };

    $("#test").click(function() {
        // Since you only ever call post() once, you don't need a separate function.
        // You can just put all the post() code here.
        var time = new Date().getTime();
        $.ajax({
            type: 'post',
            url: '/url/to/post' + '?time=' + time,
            data: {
                "some": "data"
            },
            success: function(data) {
                postComplete = true;
                alert(data);
            },
            error: function() {
                postComplete = true;
            }
        });
        if (!postComplete) setTimeout(ask, 500);
    });
</script>
...although this still doesn't fix the session problem.
@DaveRandom above correctly points out that you are a victim of a session storage lock.
The workaround is quite simple. You want to make the script that processes post() release the lock on the session data so that the script that handles ask() can access this session data. You can do that with session_write_close.
The fine print here is that after calling session_write_close the session is committed and any further changes you make to $_SESSION will not be saved, so you need to structure the script for post accordingly:
1. Read all the data you will need from $_SESSION and save a copy of it.
2. Call session_write_close to release the session lock.
3. Continue your lengthy operation. If you need session data, get it out of your own copy and not from $_SESSION directly.
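Put together, the post script could be structured roughly like this (the session keys and the sendMail() helper are made-up placeholders):

<?php
session_start();

// 1. Copy everything you will need out of $_SESSION first
$userId     = $_SESSION['userId'];      // made-up key
$recipients = $_SESSION['recipients']; // made-up key

// 2. Release the session lock so the ask() requests stop queueing
session_write_close();

// 3. Do the lengthy work using the local copies, not $_SESSION
foreach ($recipients as $recipient) {
    sendMail($recipient, $userId); // hypothetical helper
}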
Alternatively, you can toggle the lock on the session multiple times during the script's lifetime:
session_start();
$_SESSION['name'] = 'Jon';
// Quick operation that requires session data
echo 'Hello '.$_SESSION['name'];
// Release session lock
session_write_close();
// Long operation that does not require session data.
sleep(10);
// Need access to session again
session_start();
echo 'Hello again '.$_SESSION['name'];
This arrangement makes it so that while the script is sleeping, other scripts can access the session data without problems.
Related
I have a form that is validated and then submitted with the following handler:
submitHandler: function (form) {
    $.ajax({
        url: 'long_process.php',
        type: 'POST',
        data: $(form).serialize(),
        success: function (result) {
            // ... Redirect ...
        }
    });
    // start polling
    (function poll() {
        setTimeout(function () {
            $.ajax({
                url: "get_progress.php",
                success: function (data) {
                    console.log(data);
                    // Set up the next poll recursively
                    poll();
                }
            });
        }, 3000);
    })();
}
long_process.php takes about 30s to finish, and in the meantime I'd like to track the progress via get_progress.php, which echoes the percentage of processing done.
When launching this script I get in the console (edited):
1 POST long_process.php
2 GET get_progress.php (3 seconds later)...
...stuck here until long_process.php finishes THEN
3 GET get_progress.php (3 seconds later)...
4 GET get_progress.php (3 seconds later)...
...
but none of the get_progress.php calls return any values until long_process.php has finished.
How can I achieve multiple simultaneous AJAX requests? Ultimately this will be used to display a progress bar.
If you are using sessions in your PHP then the one call will block the other, because PHP won't allow two requests to use the same session space at the same time. Your two requests are probably firing, but you won't get a response to the second until the server finishes with the first.
To solve this:
Option 1: Don't use sessions in the first script that you are calling or, if you must, then unlock the session using session_write_close(). After calling this, of course, you can't write any session variables.
Option 2: If reading and writing session variables is essential, then don't use session variables in the second AJAX call and don't declare a session start. Have your first script put its status somewhere else for you to read (in a DB, a file on /tmp) and then have the second script read the status from there.
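As an illustration only, option 2 with a database could look roughly like this; the table and column names are invented and not part of the original question:

<?php
$jobId = isset($_GET['job_id']) ? (int) $_GET['job_id'] : 0;
$pdo   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// long_process.php: never touches the session, just updates its own row
$update = $pdo->prepare('UPDATE job_progress SET percent = ? WHERE job_id = ?');
for ($percent = 0; $percent <= 100; $percent += 10) {
    // ... do a chunk of the real work here ...
    $update->execute(array($percent, $jobId));
}

// get_progress.php: reads the same row, so it is never blocked by a session lock
$select = $pdo->prepare('SELECT percent FROM job_progress WHERE job_id = ?');
$select->execute(array($jobId));
echo (int) $select->fetchColumn();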
Hope that helps!
Using an HTML element for search, my search allows GET and POST requests. I am sending the pressed keys using an AJAX GET for live search (timeout 750 ms), and when the form is submitted, the string is POSTed.
The problem is that new session data is not saved when the AJAX request and the form submission happen at the same time.
I found two solutions to this problem, but they seem to mask the real problem. The first solution is to stop the AJAX request when Enter is pressed, using JavaScript: if ((event.keyCode || event.which) == 13). The second solution is calling session_write_close() and session_start() right after the new session data is saved.
Can you explain why some session data is not saved (while other data is properly saved during the same request) when an AJAX GET is executed while an HTML form is in the middle of being POSTed? Or else explain the need for calling session_write_close and session_start to make sure session data is saved during critical operations like checking CSRF: generating a new token if the POST is valid and saving it in the session?
Example code:
PHP Wrapper for storing new key:
public function setCsrfKey($property)
{
$_SESSION['x']['CsrfKey'] = (string) $property;
}
JS code:
$(searchBox).on("keyup", function(e){
    e.stopPropagation();
    clearTimeout(ajaxTimeOut);
    var keyPressed = e.keyCode || e.which;
    var string = $(this).val();
    if (string.length < 3) {
        queueList.clearQueue('ajaxCall');
        return;
    }
    queueList.queue('ajaxCall', function(){
        $.ajax({
            url: urlString,
            type: 'GET'
        }).done(function( data ) {
        }).fail(function(){
        });
    });
    ajaxTimeOut = setTimeout(function(){
        while (queueList.queue('ajaxCall').length > 1) {
            queueList.queue('ajaxCall').shift();
        }
        queueList.dequeue('ajaxCall');
    }, 750);
});
PHP writes session data when the script ends unless you tell it otherwise (with session_write_close). What happens with two simultaneous requests in the same session depends on your session handler... but if both requests are allowed to run at the same time, they typically won't see each other's changes to the session, and one request's changes will generally get lost.
Calling session_write_close saves the session so that future requests will use the updated session. It's a bit of a hack; what you have here is a race condition, and there's still a chance of stuff breaking. It'll just be a lot lower the sooner you can commit your changes to the disk/database/whatever. Lower still if you can insert a short delay between requests as well.
Of course, once you've closed the session, it won't get saved when the script ends, so nothing else will get added to it. You have to call session_start() to reopen it if you want to make further changes to it.
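Applied to the CSRF scenario above, "commit early" could look something like this sketch; the token generation and the later write are illustrative, not the asker's code:

<?php
session_start();

// Validate the POSTed form, then store the new token and commit it right away
$_SESSION['x']['CsrfKey'] = bin2hex(random_bytes(32)); // PHP 7+; any generator will do
session_write_close(); // the token is saved now; a concurrent AJAX GET will see it

// ... the slower parts of the request run here without holding the lock ...

// If something later needs to write to the session again, reopen it first
session_start();
$_SESSION['lastSearch'] = isset($_GET['q']) ? $_GET['q'] : ''; // made-up example
session_write_close();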
Once my page is loaded, I perform an Ajax call to a PHP script, which updates my server. However, this script can sometimes take over a minute to complete, and while it is running I am unable to perform other Ajax calls, which I need to handle - i.e. the first Ajax call should not block the other Ajax calls. Any idea how to do this?
First Ajax call:
$(document).ready(function () {
    $.ajax({
        url: "checkForUpdatesByCoach.php",
        success: function(arg){
            if (arg == "200") {
                $('body').prepend("<div style='text-align:center;margin-bottom:-12px;' onClick='location.reload()' class='alert alert-success'>Dine hold er blevet opdateret.Tryk for at opdatere!</div>").fadeIn("slow");
            }
        }
    });
});
Second Ajax call (a user-triggered call):
$.ajax({
    type: "POST",
    data: {data: dataAjax},
    url: "updateSwimmer1.php",
    success: function(arg){
        // updating UI
    }
});
adeno's comment above is correct.
"in PHP only one script at a time can operate on the same session, so
as to not overwrite session data etc. So when doing two ajax calls to
PHP scripts within the same session, the second has to wait for the
first to finish"
To help speed things up, you can write to and end a session early (session_write_close()) to release the session lock and allow another script using the session to continue.
Note: you can still read from your $_SESSION variable after calling session_write_close, but you may no longer write to it.
You can find a good example of this here: PHP Session Locks – How to Prevent Blocking Requests
Example provided from the link above:
<?php
// start the session
session_start();
// I can read/write to session
$_SESSION['latestRequestTime'] = time();
// close the session
session_write_close();
// now do my long-running code.
// still able to read from session, but not write
$twitterId = $_SESSION['twitterId'];
// dang Twitter can be slow, good thing my other Ajax calls
// aren't waiting for this to complete
$twitterFeed = fetchTwitterFeed($twitterId);
echo json_encode($twitterFeed);
?>
I've made a simple PHP/jQuery chat application with short polling (AJAX refresh): every 2-3 seconds it asks for new messages. But I read that long polling is a better approach for chat applications, so I went through some long polling scripts.
I made it like this:
Javascript:
$("#submit").click(function(){
$.ajax({
url: 'chat-handler.php',
dataType: 'json',
data: {action : 'read', message : 'message'}
});
});
var getNewMessage = function() {
$.ajax({
url: 'chat-handler.php',
dataType: 'json',
data: {action : 'read', message : 'message'},
function(data){
alert(data);
}
});
getNewMessage();
}
$(document).ready(getNewMessage);
PHP
<?php
$time = time();
while ((time() - $time) < 25) {
    $data = $db->getNewMessage();
    if (!empty($data)) {
        echo json_encode($data);
        break;
    }
    usleep(1000000); // 1 second
}
?>
The problem is that once getNewMessage() starts, it keeps running until it gets some response (from chat-handler.php), and it calls itself recursively. But if someone wants to send a message in between, that handler ($("#submit").click()) never executes, as getNewMessage() is still executing. So is there any workaround?
I strongly recommend that you read up on two things: the idea behind long polling, and jQuery callbacks. I'll quickly go into both, but only in as much detail as this box allows me to.
Long polling
The idea behind long polling is to have the webserver artificially "slow down" when returning the request so that it waits until an event has come up, and then immediately gives the information, and closes the connection. This means that your server will be sitting idle for a while (well, not idle, but you know what I mean), until it finally gets the info that a message went through, sends that back to the client, and proceeds to the next one.
On the JS client side, the effect is that the Ajax callback (this is the important bit) is delayed.
jQuery .ajax()
$.ajax() returns immediately. This is not good. You have two choices to remedy this:
1. Bind your recursive call in the success and error callback functions (this is important: the error function might very well come up due to a timeout).
2. Use the deferred object that $.ajax() returns, like this:
var x = $.ajax({blah});
$.when(x).done(function(a) { recursiveCallHere(); });
Both amount to the same thing in the end. You're triggering your recursion on callback and not on initiation.
P.S: what's wrong with sleep(1)?
In long polling, a new request should be initiated only when you have received the data from the previous one. Otherwise you will get infinite recursion and the browser will freeze.
var getNewMessage = function() {
    $.ajax({
        url: 'chat-handler.php',
        dataType: 'json',
        data: {action: 'read', message: 'message'},
        success: function(data) {
            alert(data);
            getNewMessage(); // <-- should be here
        }
    });
}
I'm building a chatroom using the long poll method. But it seems that when a long poll is in progress and I refresh the page in Chrome, OR I try to send another async request, everything times out (i.e. I can't load my domain again until I close/reopen the browser).
My client-side code is:
$(document).ready(function() {
    setTimeout(
        function () {
            longPollForMessages();
        },
        500
    );
});

function longPollForMessages()
{
    $.ajax({
        url: url,
        dataType: 'json',
        success: function(data) {
            $('#chat_messages').append('<div>' + data.messages + '</div>');
            longPollForMessages();
        }
    });
}
And my server-side code:
while (true) {
    $messages = $db->getMessages();
    if (!$messages || sizeof($messages) == 0) {
        sleep(1);
    } else {
        echo '{"message":' . json_encode($messages) . '}';
        die();
    }
}
Any ideas? Assume no syntax errors.
I can see you have already answered your own question, but I recently had a similar problem and found that another way to handle it is to disable the setTimeout on the Ajax call, then restart it on success. That way you aren't pinging your server for information when it isn't ready to give it.
I figured it out from this question: stackoverflow.com/questions/4457178/… - PHP locks a given session until the page is done loading, so the second Ajax call wasn't able to go through. You have to release the lock by calling session_write_close();
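On the server side, that could mean something like the following sketch: pull anything you need out of the session, release the lock, and only then enter the polling loop. The $db call comes from the question's code; the session key, the 25-second cap, and the 'messages' JSON key (chosen to match what the client reads as data.messages) are assumptions:

<?php
session_start();
$userId = isset($_SESSION['userId']) ? $_SESSION['userId'] : null; // assumed key
session_write_close(); // other requests from this browser are no longer blocked

$start = time();
while ((time() - $start) < 25) {       // cap the poll so the request cannot hang forever
    $messages = $db->getMessages();    // $db as in the question's code
    if (!empty($messages)) {
        echo json_encode(array('messages' => $messages));
        die();
    }
    sleep(1);
}
echo json_encode(array('messages' => array())); // nothing new within this window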