I have a file which processes many other files and may take up to 30 minutes to run. I make an AJAX request to this file from the front end. The file writes the percentage of completion to a temporary file, and when it finishes it outputs a success message as its own XML output (not to the tmp file).
The problem I am having is that when the processing time is short (say up to 3 minutes) the AJAX request (made through jQuery) stays alive, but a timeout occurs when the processing takes longer (above 4 minutes) and the AJAX connection is cut. How do I prevent this and keep the connection alive until the browser is closed?
You won't be able to do that unless you use a Comet-style server, which can keep the connection alive on the server side and push out content whenever the data is updated.
In your case, the only way I can think of is doing something like this:
function ajax_call() {
    $.ajax({
        url: 'get_file_processing_output.html',
        timeout: 5000, // ms; a timed-out request fires the error callback
        success: function (response) {
            // Check your response: if file processing is not finished,
            // call ajax_call() again; if it is finished, do whatever you need.
        },
        error: function (jqXHR, textStatus) {
            // On timeout, call ajax_call() again;
            // a short delay between calls would be better.
            if (textStatus === 'timeout') {
                setTimeout(ajax_call, 2000);
            }
        }
    });
}
I included a success callback above because I feel your server side should respond with something that tells the client the processing is not yet done.
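The same poll-until-done idea can be sketched without jQuery. In this minimal sketch, checkStatus() stands in for the AJAX call and is simulated; the "done after three polls" behaviour and all names are invented for the demo:

```javascript
// Simulated server call: pretend the server reports "done" on the third poll.
let calls = 0;
function checkStatus() {
  calls++;
  return Promise.resolve({ done: calls >= 3, progress: Math.min(calls * 33, 100) });
}

// Poll, and if the work is not finished, wait and poll again
// (this mirrors calling ajax_call() again from the callbacks above).
function pollUntilDone(intervalMs) {
  return checkStatus().then(function (status) {
    if (status.done) return status; // processing finished
    return new Promise(function (resolve) {
      setTimeout(function () { resolve(pollUntilDone(intervalMs)); }, intervalMs);
    });
  });
}

pollUntilDone(10).then(function () {
  console.log('processing finished');
});
```

Because each new poll is scheduled only after the previous response arrives, there is never more than one request in flight.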
Related
So, I'm sending a form with ajaxForm, which sends the data, shows spinner.gif, then hides the spinner and reloads the page on success:
$('#form').ajaxForm({
    beforeSubmit: function () {
        spinnerLoad();
    },
    success: function (data) {
        spinnerDone();
        window.location.href = "sample.php";
    }
});
Then the form is handled like this:
if (isset($_POST['save'])) {
    exec("/directory/script.php $args");
}
So this page runs 'script.php', which executes another script against the DB, so it may take a long time. When there is not much data it works fine, but when there is a lot, after a while 'script.php' goes to a 404 and the spinner.gif never stops.
I need to find a way to extend the timeout somehow (the ajax timeout option seems unsuitable), or to find another way.
Sending script.php or the DB script to the background is not OK - it must finish in order to continue working.
I'll be glad of any comments or directions to look in.
I have a PHP script which can take quite a lot of time (up to 3-5 minutes), so I would like to notify the user of how it is going.
I read this question and decided to use the session for keeping information about work progress.
So, I have the following instructions in PHP:
public function longScript()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $generatingProgressSession->unsetAll();
    ....
    $generatingProgressSession->total = $productsNumber;
    ...
    $processedProducts = 0;
    foreach ($models as $model) {
        // Do some processing
        $processedProducts++;
        $generatingProgressSession->processed = $processedProducts;
    }
}
And I have a simple script for taking the data from the session (the number of total and processed items) which returns it in JSON format.
So, here is the js code for calling the long script:
$.ajax({
    url: 'pathToLongScript',
    data: {fileId: fileId, format: 'json'},
    dataType: 'json',
    success: function (data) {
        if (data.success) {
            if (typeof successCallback == "function")
                successCallback(data);
        }
    }
});
// Start checking progress functionality
var checkingGenerationProgress = setInterval(function () {
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: {format: 'json'},
        success: function (data) {
            console.log("Processed " + data.processed + " items of " + data.total);
            if (data.processed == data.total) {
                clearInterval(checkingGenerationProgress);
            }
        }
    });
}, 10000);
So, the long script is called via AJAX. Then after 10 seconds the checking script is called once, after 20 seconds a second time, etc.
The problem is that none of the requests to the checking script complete until the main long script is complete. So, what does that mean? That the long script consumes too many resources and the server cannot process any other request? Or do I have some wrong AJAX parameters?
See image:
-----------UPD
Here is a php function for checking status:
public function checkGenerationProgressAction()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $this->view->total = $generatingProgressSession->total;
    $this->view->processed = $generatingProgressSession->processed;
}
I'm using the ZF1 ActionContext helper here, so the result of this function is a JSON object: {'total':'somevalue','processed':'another value'}
I'd
    exec('nohup php ...');
the file and send it to the background. You can set points where the long-running script inserts a single value into the DB to show its progress. Then you can check every ten (or however many) seconds whether a new value has been added and inform the user. It might even be possible to notify the user while he is on another page within your project, depending on your environment.
Yes, it's possible that the long script hogs the entire server and any other requests made in that time have to wait their turn. Also, I would recommend that you not run the check script every 10 seconds regardless of whether the previous check has finished; instead, let the check script trigger itself after it has completed.
Taking your image with the pending requests as an example: instead of having 3 checking requests running at the same time, you can chain them so that at any one time only one checking request runs.
You can do this by replacing your setInterval() call with setTimeout() and re-arming the setTimeout() after the AJAX check request completes.
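A minimal sketch of that setTimeout() chaining, with the AJAX check simulated by a timer (all names and timings are illustrative):

```javascript
// Chain checks with setTimeout so a new check only starts
// after the previous one has finished.
let pending = 0;      // checks currently in flight
let maxPending = 0;   // worst-case overlap observed
let checksRun = 0;
let finished = false;

function runCheck(callback) {
  // Simulated AJAX check that takes ~20 ms to complete.
  pending++;
  maxPending = Math.max(maxPending, pending);
  setTimeout(function () {
    pending--;
    checksRun++;
    callback(checksRun >= 3); // pretend the work is done after the third check
  }, 20);
}

function scheduleNextCheck() {
  runCheck(function (done) {
    if (done) { finished = true; return; }
    // Re-arm only after the check completed: at most one check in flight.
    setTimeout(scheduleNextCheck, 10);
  });
}

scheduleNextCheck();
```

With setInterval, maxPending could grow without bound while the server is busy; with this chaining it stays at 1.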
Most likely, the subsequent calls are not completing due to session locking. While one thread has a session file open, no other PHP thread can open that same file; it is read/write locked until the previous thread lets go of it.
Either that, or your Server OR Browser is limiting concurrent requests, and therefore waiting for this one to complete.
My solution would be to either fork or break the long-running script off somehow. Perhaps a call to exec to another script with the requisite parameters, or any way you think would work. Break the long-running script into a separate thread and return from the current one, notifying the user that the execution has begun.
The second part would be to log the progress of the script somewhere. A database, Memcache, or a file would work. Simply set a value in a pre-determined location that the follow-up calls can check on.
Note that "pre-determined" should not mean the same location for everyone. It should be a location that only the user's session and the worker know.
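The "log the progress somewhere" step might look like the following framework-free sketch, with a plain object standing in for the database/Memcache/file and an illustrative session-scoped key (all names here are assumptions, not part of the original answer):

```javascript
// A plain object stands in for the DB/Memcache/file; the key is
// something only the user's session and the worker share.
const progressStore = {};

function workerStep(key, processed, total) {
  // The long-running worker writes its progress after each unit of work.
  progressStore[key] = { processed: processed, total: total };
}

function checkProgress(key) {
  // The follow-up status calls read from the same pre-agreed location.
  return progressStore[key] || { processed: 0, total: 0 };
}

const key = 'session-abc123:generate'; // illustrative per-session key
const total = 5;
for (let i = 1; i <= total; i++) {
  workerStep(key, i, total);
}
console.log(checkProgress(key)); // { processed: 5, total: 5 }
```

Because the store is outside the session, the status reads never block on the worker's session lock.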
Can you paste the PHP of "pathToCheckingStatusFunction" here?
Also, I notice that the "pathToCheckingStatusFunction" ajax function doesn't have a dataType: "json". This could be causing a problem. Are you using the $_POST['format'] anywhere?
I also recommend chaining the checks into after the first check has completed. If you need help with that, I can post a solution.
Edit, add possible solution:
I'm not sure that using Zend_Session_Namespace is the right approach. I would recommend using session_start() and session_name(), and reading the variables out of $_SESSION.
Example File 1:
session_name('test');
session_start();
$_SESSION['percent'] = 0;
...stuff...
$_SESSION['percent'] = 90;
Example File 2(get percent):
session_name('test');
session_start();
echo $_SESSION['percent'];
To import a js-file is simple... just use <script src='file.js' type='text/javascript'></script>.
But then the source code will show a direct URL to the contents of the file.
The content should be executable, but not directly viewable via the source URL.
What is the best way to load the content of file.js into memory using AJAX?
I've come up with the following initial way of working (just an idea; totally flawed?):
function get_contents() {
-> ajax execute PHP {copy 'file.js' to 'token.js' in tmp-directory}
-> ajax get contents of 'tmp/token.js' and load to memory
-> ajax execute PHP {delete 'tmp/token.js' in tmp-directory}
return(true); // content (ie. functions) should now be usable
}
But I'm not sure whether the second ajax execute is enough to be able to successfully call the functions.
PHP returns the content, but the javascript 'ajax success' may see it (and store it) as a variable... doh!
Is this ajax success idea going to work?
Can someone suggest a better idea ?
Edit:
According to the initial responses, this way of working is virtually (and humanly) impossible.
I will solve it by loading common functions the 'normal' unprotected way, and by using Jerry's suggestion (see comment) for calculations that happen less often.
Edit #2:
The time-consuming problem mentioned below can be solved by following this template.
It still makes use of the suggested 'hidden PHP code' method.
I make use of a buffer (or something like it), as a YouTube video does... except the 'video data' is 'results from AJAX-PHP functions'.
AJAX request "30 cycle", "60 cycle", "300 cycle", "600 cycle"
store result to buffer
initiate "start cycle"
function cycle() // run every second !!
{
//do stuff... no AJAX needed
//do some more stuff... like animations and small calculations
//per 30 cycle (30 seconds)
if ($cycle==30)
{
perform last "30 cycle" AJAX result [PHP-function set "A"]
... when finished: AJAX request "30 cycle"
store result to buffer in 'background'
}
//per 60 cycle (1 minute)
if ($cycle==60)
{
perform last "60 cycle" AJAX result [PHP-function set "B"]
... when finished: AJAX request "60 cycle"
store result to buffer in 'background'
}
//and so on....
}
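The buffer idea in the template above can be sketched in plain JavaScript; fetchResult() stands in for the "AJAX-PHP function" call, and the per-second cycle is reduced to a single tick for the demo (all names are illustrative):

```javascript
// Each tick consumes the previously fetched result and immediately
// prefetches the next one, so the cycle itself never waits on the network.
let fetches = 0;
function fetchResult() {            // stands in for the AJAX-PHP call
  fetches++;
  return Promise.resolve('result #' + fetches);
}

let buffer = null;                  // last prefetched result
const consumed = [];

function prefetch() {
  return fetchResult().then(function (r) { buffer = r; });
}

function cycleTick() {              // one iteration of the per-second cycle
  if (buffer !== null) {
    consumed.push(buffer);          // use the buffered result: no waiting here
    buffer = null;
    prefetch();                     // refill the buffer in the background
  }
}

prefetch().then(function () {
  cycleTick();                      // consumes result #1, prefetches result #2
});
```

This is the same trick a video buffer uses: the expensive fetch happens ahead of time, so the frequent cycle only touches local data.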
Initial question 99% solved (-1 because of developer tools).
Thanks for commenting and suggestions.
Even if you split your JavaScript files into lots of individual functions, it would take very little work to put them all back together again.
With modern browsers, even when a file is "loaded into memory" you can see exactly what was loaded.
Try the developer tools that come with browsers: you can use them to see when an Ajax call is made and exactly what was loaded, in text format.
If you are mixing PHP and JavaScript then you should put anything that is sensitive in your PHP code whilst using JavaScript for your presentation of the results PHP provides.
EDIT: as per your update, instead of doing "cycles" could you not do this?
function pollAjax()
{
    $.ajax({
        -- ajax settings --
    }).success(function (data) {
        // do something with our results
        doSomething(data);
        // Fire again to get the next set of results.
        setTimeout(function () { pollAjax(); }, 10);
    });
}
This means you're less likely to hang the browser with thousands of pending ajax requests. It will ask for the results and, when it gets them, ask for the next set of results.
I have constructed a PHP file which scrapes a web page (using cURL) to obtain some data, and outputs it to the screen in JSON format.
The target website involves some redirects which temporarily outputs data to my PHP file. Once the redirects have completed successfully, the JSON is presented as expected. The problem that I am encountering is that when I try to access the JSON using jQuery's $.ajax() method, it sometimes returns the incorrect data, because it isn't waiting for the redirects to complete.
My question is if it's possible to tell the AJAX request to wait a certain number of seconds before returning the data, thus allowing time for the redirects in the PHP script to execute successfully?
Please note that there is no cleaner solution for the page scrape; the redirects are essential, and their output has to go to the screen for the scraping to complete.
There's always timeout in the settings.
jQuery docs:
timeout Number
Set a timeout (in milliseconds) for the request. This will
override any global timeout set with $.ajaxSetup().
The timeout period starts at the point the $.ajax call is made;
if several other requests are in progress and the browser
has no connections available, it is possible for a request
to time out before it can be sent. In jQuery 1.4.x and below,
the XMLHttpRequest object will be in an invalid state if
the request times out; accessing any object members may
throw an exception. In Firefox 3.0+ only, script and JSONP
requests cannot be cancelled by a timeout; the script will
run even if it arrives after the timeout period.
You should use promise() in jQuery.
You could always store the result of your ajax call and then wait for the redirects to finish, i.e.:
$.ajax({
    success: function (e)
    {
        var wait = setTimeout(function () { doSomethingWithData(e.data); }, 5000); // 5 sec
    }
});
Alternatively, you could set up an interval to check whether something has happened (the redirects finished) every x milliseconds. I'm assuming your redirects let you know when they have completed?
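That interval idea can be sketched as a small helper, with the "redirects finished" signal simulated by a timer (names and timings are illustrative):

```javascript
// Simulated signal: pretend the redirects finish after 30 ms.
let redirectsDone = false;
setTimeout(function () { redirectsDone = true; }, 30);

// Poll a condition every intervalMs and resolve once it becomes true.
function waitFor(condition, intervalMs) {
  return new Promise(function (resolve) {
    const timer = setInterval(function () {
      if (condition()) {
        clearInterval(timer); // stop checking once it has happened
        resolve();
      }
    }, intervalMs);
  });
}

let safeToUseData = false;
waitFor(function () { return redirectsDone; }, 10).then(function () {
  safeToUseData = true;
  console.log('redirects finished, safe to use the data');
});
```

The key detail is clearing the interval inside the check, so the polling stops as soon as the condition is met.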
http://examples.hmp.is.it/ajaxProgressUpdater/
$i = 0;
while (true)
{
    if (self::$driver->executeScript("return $.active == 0")) {
        break;
    }
    if ($i == 20) {
        break;
    }
    $i++;
    echo $i;
    usleep(10000);
}
I have a bunch of AJAX requests, the first is a
$.post("go.php", { url: pastedURL, key: offskey, mo: 'inward', id: exID});
This request, in go.php, includes a system command which, despite nohup and & at the end, doesn't stop whirling in Firebug - it keeps waiting for a response.
Unfortunately, the next AJAX request
$.post("requestor.php", { request: 'getProgress', key: offskey}, function(data) { console.log(data.response); }, "json");
Doesn't run; it whirls around in Firebug (I'm guessing until go.php has finished) - it eventually overloads everything (this is on a timer that checks every few seconds).
So I guess the question is: is there an AJAX method which simply throws data and walks away instead of waiting for a response... or some way I can perform another request while the other is waiting?
Hope someone knows what I mean.
Check the async: false parameter of the $.ajax() method in jQuery.
You need it to be set to async: true.
It turns out the command executed by PHP on the other end needed this appended so that PHP would not wait for it:
"> /dev/null 2>/dev/null &"
source
There are a limited number of connections used by the browser. The number varies by browser, but for older ones like IE7 it can be as few as 2 connections per server. Once you fill them with pending requests, you have to wait for those to complete before you can make any additional requests to that server.
(this is on a timer to check every few seconds)
It sounds like you may be making additional requestor.php requests using setTimeout or setInterval before the previous ones have finished. If so, don't do that. Use a full $.ajax call (perhaps with a timeout as well) and only make another request from the success/error callback.