I have a PHP script which can take quite a lot of time (up to 3-5 minutes), so I would like to show the user how it is going.
I read this question and decided to use the session for keeping information about the work progress.
So, I have the following code in PHP:
public function longScript()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $generatingProgressSession->unsetAll();
    ....
    $generatingProgressSession->total = $productsNumber;
    ...
    $processedProducts = 0;
    foreach ($models as $model) {
        // Do some processing
        $processedProducts++;
        $generatingProgressSession->processed = $processedProducts;
    }
}
And I have a simple script for taking data from the session (the number of total and processed items), which returns them in JSON format.
So, here is the JS code for calling the long script:
$.ajax({
    url: 'pathToLongScript',
    data: {fileId: fileId, format: 'json'},
    dataType: 'json',
    success: function(data){
        if(data.success){
            if(typeof successCallback == "function")
                successCallback(data);
        }
    }
});
// Start checking progress functionality
var checkingGenerationProgress = setInterval(function(){
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: {format: 'json'},
        success: function(data){
            console.log("Processed " + data.processed + " items of " + data.total);
            if(data.processed == data.total){
                clearInterval(checkingGenerationProgress);
            }
        }
    });
}, 10000);
So, the long script is called via AJAX. Then after 10 seconds the checking script is called for the first time, after 20 seconds a second time, etc.
The problem is that none of the requests to the checking script completes until the main long script is complete. So what does this mean? That the long script consumes too many resources and the server cannot process any other requests? Or do I have some wrong AJAX parameters?
See image: [screenshot of the browser's network panel, showing the checking requests pending until the long script finishes]
----------- UPDATE
Here is the PHP function for checking the status:
public function checkGenerationProgressAction()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $this->view->total = $generatingProgressSession->total;
    $this->view->processed = $generatingProgressSession->processed;
}
I'm using the ZF1 ActionContext helper here, so the result of this function is a JSON object: {'total':'somevalue','processed':'another value'}
I'd
exec('nohup php ...');
the file and send it to the background. At set points, the long-running script inserts a single value into the DB to record its progress. Now you can check every ten (or however many) seconds whether a new value has been added and inform the user. It might even be possible to inform the user while he is on another page within your project, depending on your environment.
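A minimal sketch of that approach, assuming a worker.php script and a progress table keyed by job_id (both names are made up for illustration):

// launcher.php - start the worker in the background and return immediately
$jobId = uniqid('job_', true);
exec('nohup php worker.php ' . escapeshellarg($jobId) . ' > /dev/null 2>&1 &');
echo json_encode(['jobId' => $jobId]);

// worker.php - the long-running part, recording its progress in the DB
$jobId = $argv[1];
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
// assumes job_id is the primary key of the progress table
$stmt = $pdo->prepare('REPLACE INTO progress (job_id, processed, total) VALUES (?, ?, ?)');
$total = 500; // however many items there are to process
for ($i = 1; $i <= $total; $i++) {
    // ... process one item ...
    $stmt->execute([$jobId, $i, $total]);
}

The checking script then only has to SELECT the row for that job ID, which never touches the session lock.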
Yes, it's possible that the long script hogs the entire server and any other requests made in that time have to wait their turn. Also, I would recommend not running the check script every 10 seconds regardless of whether the previous check has finished, but instead letting the check script trigger itself after it has completed.
Taking for example your image with the requests pending: instead of having 3 checking requests running at the same time, you can chain them so that at any one time only one checking request is run.
You can do this by replacing your setInterval() function with a setTimeout() function and re-initializing the setTimeout() after the AJAX check request has completed.
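A minimal sketch of that chaining, reusing the URL and response fields from the question:

function checkProgress() {
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: {format: 'json'},
        dataType: 'json'
    }).done(function(data) {
        console.log("Processed " + data.processed + " items of " + data.total);
        // Schedule the next check only after this one has come back
        if (data.processed != data.total) {
            setTimeout(checkProgress, 10000);
        }
    });
}
setTimeout(checkProgress, 10000); // start the chain

This way a slow check simply delays the next one instead of piling up parallel requests.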
Most likely, the follow-up calls are not completing due to session locking. When one thread has a session file open, no other PHP thread can open that same file, as it is read/write locked until the previous thread lets go of it.
Either that, or your server or browser is limiting concurrent requests, and is therefore waiting for this one to complete.
My solution would be to either fork or break the long-running script off somehow. Perhaps a call to exec to another script with the requisite parameters, or any way you think would work. Break the long-running script into a separate thread and return from the current one, notifying the user that the execution has begun.
The second part would be to log the progress of the script somewhere. A database, Memcache, or a file would work. Simply set a value in a pre-determined location that the follow-up calls can check on.
Note that "pre-determined" should not mean the same location for everyone. It should be a location that only the user's session and the worker know.
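A minimal sketch of the file variant, keyed by session ID (the path scheme and the item count are made up for illustration):

// Worker side: release the session lock first, then log progress to a file
session_start();
$progressFile = sys_get_temp_dir() . '/progress_' . session_id();
session_write_close(); // let the follow-up calls through
$total = 500; // example item count
for ($i = 1; $i <= $total; $i++) {
    // ... process one item ...
    file_put_contents($progressFile, json_encode(['processed' => $i, 'total' => $total]));
}

// Checker side: read the same file without ever holding the lock
session_start();
$progressFile = sys_get_temp_dir() . '/progress_' . session_id();
session_write_close();
echo file_get_contents($progressFile);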
Can you paste the PHP of "pathToCheckingStatusFunction" here?
Also, I notice that the "pathToCheckingStatusFunction" ajax function doesn't have a dataType: "json". This could be causing a problem. Are you using the $_POST['format'] anywhere?
I also recommend chaining the checks, so each one starts only after the previous one has completed. If you need help with that, I can post a solution.
Edit, add possible solution:
I'm not sure that using Zend_Session_Namespace is the right approach. I would recommend using session_name() and session_start(), and reading the variables out of $_SESSION.
Example File 1:
session_name('test');
session_start();
$_SESSION['percent'] = 0;
...stuff...
$_SESSION['percent'] = 90;
Example File 2 (get percent):
session_name('test');
session_start();
echo $_SESSION['percent'];
Thank you for reading.
I have an input field that sends its contents in an XMLHttpRequest to a PHP script. The script queries the database with the POST data from the field and returns the results.
Because the XMLHttpRequest is invoked using onkeyup, typing a lengthy value sends several calls in a short period. To combat this I wrote some code that creates a timestamp, loads it into the session, sleeps, then rechecks the timestamp. If the timestamp has increased, it means a subsequent call was made and the script should abort; otherwise the script executes. Here is the code:
$micro = microtime(true);
$_SESSION['micro'] = $micro;

usleep(500000); // half a second

if ($micro < floatval($_SESSION['micro']))
{
    // a later call has been made, abort
    echo 'abort';
    exit;
}
else
{
    // okay to execute
}
The code appears to work as expected at first. If I add or remove a character or two from the input field the result appears quickly.
However if I type a good 12 characters as fast as I can there is a large delay, sometimes 2 or 3 seconds long.
I am working on localhost, so there are no connection issues. The query is also really small, grabbing one column containing a single word from a specific row.
I have also set XMLHttpRequest to be asynchronous, so that should also be fine.
xmlhttp.open("POST","/test/",true);
If I remove the flood prevention code, typing in the field returns results instantly - no matter how much and how quickly I type.
It's almost as if usleep() keeps stacking itself or something.
I came up with this code on my own, best I could do at my level. No idea why it isn't behaving as expected.
Help is greatly appreciated, thanks!
When you open a session using session_start(), PHP locks the session file so any subsequent requests for the same session while another request has it open will be blocked until the session closes (you were exactly right with the "stacking" you suspected was happening).
You can call session_write_close() to close the session and release the lock but this probably won't help in this situation.
What's happening is that each time a key is pressed, a request gets issued, and each one is backed up while the previous one finishes. Once the session is released, one of the other requests opens the session and sleeps, and this keeps happening until they've all finished.
Instead, I'd create a global variable in Javascript that indicates whether or not a request is in progress. If one is, then don't send another request.
Something like this:
<script>
var requesting = false;

$('#input').on('keyup', function() {
    if (requesting) return;

    requesting = true;

    $.ajax({
        url: "/url"
    }).done(function() {
        requesting = false;
    });
});
</script>
drew010's answer explained my problem perfectly (thanks!). But their code example, from what I gather by how it was explained (I didn't try it), does the opposite of what I need: if the user types "hello", the "h" will get sent but the "ello" might not, unless the result makes it back in time. (Sorry if this was a wrong assumption.)
This was the solution I came up with myself.
<input type="text" onkeyup="textget(this.value)" />
<script>
var patience;

function ajax(query)
{
    // the field name "q" and the empty handler are placeholders;
    // this is the POST to /test/ described earlier in the question
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.open("POST", "/test/", true);
    xmlhttp.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            // show the returned results here
        }
    };
    xmlhttp.send("q=" + encodeURIComponent(query));
}

function textget(input)
{
    clearTimeout(patience); // reset the timer on every keystroke
    patience = setTimeout(function(){ ajax(input); }, 500);
}
</script>
When a key is pressed in the input field, it passes the field's current value to the textget function.
The textget function clears the existing timer, if any, and starts a new one.
When the timer finishes counting down, it passes the value on to the ajax function to perform the XMLHttpRequest.
Because the timer is reset every time the textget function is called, if a new call is made before the timer finishes (0.5 seconds), the previous call is discarded and replaced by the new one.
Hope this helps someone.
I have a problem with two simultaneous AJAX requests. I have a PHP script which exports data to XLSX. This operation takes a lot of time, so I'm trying to show the progress to the user. I'm using the AJAX-and-database approach. Actually, I'm pretty sure it used to work, but I can't figure out why it's no longer working in any browser. Did something change in newer browsers?
$(document).ready(function() {
    $("#progressbar").progressbar();

    $.ajax({
        type: "POST",
        url: "{$BASE_URL}/export/project/ajaxExport",
        data: "type={$type}&progressUid={$progressUid}" // unique ID I'm using to track progress from the database
    }).done(function(data) {
        $("#progressbar-box").hide();
        clearInterval(progressInterval);
    });

    progressInterval = setInterval(function() {
        $.ajax({
            type: "POST",
            url: "{$BASE_URL}/ajax/progressShow",
            data: "statusId={$progressUid}" // the same unique ID
        }).done(function(data) {
            data = jQuery.parseJSON(data);
            $("#progressbar").progressbar({ value: parseInt(data.progress) });
            if (data.title) { $("#progressbar-title").text(data.title); }
        });
    }, 500);
});
The progress is correctly updated in the database.
The JS timer is trying to get the progress (I can see it in the console), but all these requests stay loading for the whole duration of the first script; as soon as the script ends, the AJAX progress calls complete.
So, why is the second AJAX call waiting for the first one to finish?
Sounds like a session blocking issue.
By default PHP writes its session data to a file. When you initiate a session with session_start(), it opens the file for writing and locks it to prevent concurrent edits. That means each request going through a PHP script using a session has to wait for the previous request to be done with the file.
The way to fix this is to change PHP sessions to not use files or to close your session write like so:
<?php
session_start(); // starting the session
$_SESSION['foo'] = 'bar'; // Write data to the session if you want to
session_write_close(); // close the session file and release the lock
echo $_SESSION['foo']; // You can still read from the session.
After a bit of hair-pulling, I found one other way that these non-parallel AJAX requests can happen, totally independent of PHP session-handling... So I'm posting it here just for anyone getting here through Google with the same problem.
XDebug can cause this, and I wouldn't be surprised if Zend Debugger could too.
In my case, I had:
XDebug installed on my local LAMP stack
xdebug.remote_autostart enabled
My IDE accepting inbound debugger-connections, even though no breakpoints were active
This caused all my AJAX tests to run sequentially, no matter what. In retrospect it makes a lot of sense (from the standpoint of debugging things) to force sequential processing, but I simply hadn't noticed that my IDE was still interacting behind-the-scenes.
After telling the IDE to stop listening entirely, parallel runs resumed and I was able to reproduce the race-condition I had been looking for.
Be aware that session_write_close() (see chrislondon's answer) may not resolve the problem if you have output buffering enabled (the default in PHP 7+). You have to set output_buffering = Off in php.ini, otherwise the session won't be closed correctly.
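A quick runtime sanity check for that setting (just a diagnostic sketch, not a fix):

<?php
// "0" or "" from ini_get() means output buffering is off;
// ob_get_level() > 0 means a buffer is currently open for this request
var_dump(ini_get('output_buffering'), ob_get_level());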
When working with APIs, you sometimes need to issue multiple AJAX requests to different endpoints. Instead of waiting for one request to complete before issuing the next, you can speed things up with jQuery by requesting the data in parallel, using jQuery's $.when() function:
Run multiple AJAX requests in parallel
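A minimal sketch (the two endpoint URLs are placeholders):

// Fire both requests at once; the callback runs when both have finished
$.when(
    $.getJSON('/api/endpoint-a'),
    $.getJSON('/api/endpoint-b')
).done(function(resultA, resultB) {
    // for $.ajax-based calls each result is an array: [data, statusText, jqXHR]
    console.log(resultA[0], resultB[0]);
});

Note that this only helps if the server side doesn't serialize the requests anyway, e.g. through the session lock discussed above.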
a.php generates a main HTML page that contains two simultaneous AJAX calls to b.php and c.php. In order for b.php and c.php to share session variables, the session variables must exist BEFORE the first AJAX call. Provided this is true, b.php and c.php can change the values of the session variables and see each other's values. Therefore, create the session variables with a.php while generating the HTML page. At least that's how it works with Rogers shared web hosting.
You could also set
async: true,
Once my page is loaded, I perform an AJAX call to a PHP script, which updates my server. However, this script can sometimes take over a minute to complete, and while it is running I am unable to perform other AJAX calls, which I need to handle; i.e. the first AJAX call should not block the other AJAX calls. Any idea how to do this?
First Ajax call:
$(document).ready(function () {
    $.ajax({
        url: "checkForUpdatesByCoach.php",
        success: function(arg){
            if(arg == "200"){
                // Danish: "Your teams have been updated. Click to refresh!"
                $('body').prepend("<div style='text-align:center;margin-bottom:-12px;' onClick='location.reload()' class='alert alert-success'>Dine hold er blevet opdateret. Tryk for at opdatere!</div>").fadeIn("slow");
            }
        }
    });
});
Second Ajax call (a user triggered call):
$.ajax({
    type: "POST",
    data: {data: dataAjax},
    url: "updateSwimmer1.php",
    success: function(arg){
        // updating UI
    }
});
adeno's comment above is correct.
"in PHP only one script at a time can operate on the same session, so
as to not overwrite session data etc. So when doing two ajax calls to
PHP scripts within the same session, the second has to wait for the
first to finish"
To help speed things up, you can write to and end a session early (session_write_close()) to release the session lock and allow another script using the session to continue.
Note: you can still read from your $_SESSION variable after calling session_write_close(), but you may no longer write to it.
You can find a good example of this here: PHP Session Locks – How to Prevent Blocking Requests
Example provided from the link above:
<?php
// start the session
session_start();
// I can read/write to session
$_SESSION['latestRequestTime'] = time();
// close the session
session_write_close();
// now do my long-running code.
// still able to read from session, but not write
$twitterId = $_SESSION['twitterId'];
// dang Twitter can be slow, good thing my other Ajax calls
// aren't waiting for this to complete
$twitterFeed = fetchTwitterFeed($twitterId);
echo json_encode($twitterFeed);
?>
I need to run a very long process in PHP (grepping big text files and returning matching lines). When we run an average query from the command line, the process may take 10-15 minutes.
I wanted to implement this in PHP/jQuery, whereby I start the query, then show incremental results as they come back.
I implemented something close, where I had one AJAX call doing the search (which worked fine) and a periodic timer function calling a second method in the class to get the results. However, I realized that the second call would really create a new class instance, so $this->current was different between the main query and the periodic update timer.
Here's the JavaScript I was trying (I was kicking it off by clicking a form button):
<script>
function update_status(data) {
    alert(data);
    jQuery.each(data, function(key, val) {
        if (key == "progress")
            $("#progressbar").progressbar({ value: val });
    });
}

function progress_setup() {
    setInterval(function() {
        jQuery.ajax({
            type: 'POST',
            dataType: 'json',
            complete: function(XMLHttpRequest, textStatus) {
                update_status(textStatus);
            },
            url: '<?php echo url_for("#grep_status"); ?>'
        });
    }, 2000);
}

function grep(elements) {
    jQuery.ajax({
        type: 'POST',
        dataType: 'html',
        data: jQuery(elements).serialize(),
        success: function(data, textStatus) { jQuery('#results').html(data); },
        beforeSend: function(XMLHttpRequest) { progress_setup(); },
        url: '/grep'
    });
}
</script>
But this doesn't appear to be the right track. The core issues seem to be:
Long running task in PHP
How do you get the status of that task back to a progress bar, and an incremental results dialog?
TIA
Mike
You have to share the state of your operation using either a database or a file. Then, in your /grep operation, you periodically write the state to the database or the file (updating the state).
Then you need another script (like /grep_state) which reads the state and returns it to the client.
What you can't do is share the state using a PHP object instance, since its scope is limited to a single request. You have to persist the state.
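A minimal sketch of persisting the state instead of relying on $this->current (the jobId parameter and the state-file path scheme are made up for illustration):

// In the /grep action: write progress where another request can find it
$jobId = preg_replace('/[^a-z0-9_.]/i', '', $_POST['jobId']);
$stateFile = sys_get_temp_dir() . '/grep_state_' . $jobId;
file_put_contents($stateFile, json_encode(['progress' => $percentDone])); // $percentDone computed by the search loop

// In the /grep_state action: read the persisted state and return it
$jobId = preg_replace('/[^a-z0-9_.]/i', '', $_POST['jobId']);
header('Content-Type: application/json');
echo file_get_contents(sys_get_temp_dir() . '/grep_state_' . $jobId);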
The other problem might be that your long-running task is terminated because of a request timeout, either by the webserver or the browser. I would either run the script in CLI mode (detached from the webserver/request) or write a simple job scheduler which runs as a daemon.
The daemon gets the parameters from a socket (or any other means of communicating with other processes) and starts the PHP CLI with your worker script. The state is again shared using files or a database.
It seems like the easiest thing to do is to split the task up in PHP and send some sort of flag back to your client-side app to tell it when it's finished.
Maybe:
Get the size of the file.
Begin the query, return the first result with a character number.
Update progress bar.
Start next query beginning at last character number, return second result with character number.
Update progress bar.
Continue until reached end of file.
Although that wouldn't give you a perfect progress update, it would indicate how far you've searched into the file.
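A minimal sketch of one such chunk (the file path, chunk size, and parameter names are all made up for illustration):

// One step of the search: start at a byte offset, scan a limited number of
// lines, and return the matches plus the offset to resume from.
$offset  = (int)($_GET['offset'] ?? 0);
$pattern = $_GET['pattern'] ?? '';
$file    = '/var/log/big.txt';

$fh = fopen($file, 'r');
fseek($fh, $offset);

$matches = [];
for ($i = 0; $i < 10000 && ($line = fgets($fh)) !== false; $i++) {
    if (stripos($line, $pattern) !== false) {
        $matches[] = $line;
    }
}

echo json_encode([
    'matches'  => $matches,
    'offset'   => ftell($fh),                   // where the next chunk begins
    'progress' => ftell($fh) / filesize($file), // fraction of the file covered
    'done'     => feof($fh),
]);
fclose($fh);

The client keeps calling this endpoint with the returned offset until done is true, updating the progress bar from progress each time.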
I think the key is to set up your server with parameters that allow you to limit/filter your query results.
You can try flushing your output early (this varies according to your server settings, however). For example:
// Script starts, does first block of program
echo "1";
flush();
// Second block starts and finishes
echo "2";
flush();
// Etc...
After each block completes, flush a response.
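If nothing arrives at the client until the script ends, PHP's own output buffer may be swallowing the flushes. A variant that drains any open buffers first (whether this helps still depends on the web server and any compression in front of it):

<?php
// Close any output buffers PHP opened for this response
while (ob_get_level() > 0) {
    ob_end_flush();
}
echo "1";
flush(); // push the first block to the client immediately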
I have a file which processes many other files and may take up to 30 minutes to complete. I do an AJAX request to the file on the front end. The file writes the percentage of completion to another, temporary file, and when it finishes it outputs a success message as its own XML output (not to the tmp file).
The problem I am having is that when the processing time is small (say, up to 3 minutes at most), the AJAX request (made through jQuery) stays alive, but a timeout occurs when the processing takes longer (above 4 minutes) and the AJAX connection is cut. How do I prevent this and make it stay alive until the browser is closed?
You won't be able to do that unless it is a Comet server, which can keep the connection alive on the server side and push out the contents whenever the data is updated.
In your case, the only way I can think of is doing this:
function ajax_call() {
    $.ajax({
        url: 'get_file_processing_output.html',
        success: function (response) {
            // check your response: if file processing is not finished,
            // call ajax_call() again; if it is finished, do whatever you need
            // ("response.finished" is a placeholder for however your server
            // reports completion)
            if (response.finished) {
                // done - show the result
            } else {
                setTimeout(ajax_call, 2000);
            }
        },
        error: function () {
            // a timeout lands here (jQuery has no separate timeout callback);
            // directly call ajax_call() again, with a short interval
            setTimeout(ajax_call, 2000);
        }
    });
}
I have a success callback above in the AJAX call because I feel you should send something back from your server side to tell the client that the processing is not yet done.