I am making an AJAX call to a PHP script that takes a long time to run. Now, assume it takes 20 minutes to run. Also assume that I either refreshed, or closed and reopened, the web page that initiated the call. My questions are:
Does my PHP script stop running?
If it keeps running, how can I force a response back to the page after it was refreshed?
Thanks.
************* UPDATE *************
Some are asking why I have a script that takes this long. That is my fault for not explaining the following:
a. This script is only available to site admins, and is not available to the general public
b. This script will perform some heavy lifting, such as data manipulation and database related stuff, and will require a long time to run
c. I am using AJAX so that I can still return to the main page, with a spinner showing that the script is running while it actually runs in the back end; the callback function then removes the spinner and displays a success/failure message.
It will stop running.
You should be using a cron job instead to run a script that will take 20 minutes.
Don't forget to set the time limit to at least 20 minutes for that script!
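If you go the cron route, the heavy lifting becomes an ordinary CLI script; a minimal sketch (file name, path, and schedule are made up):

// heavy_job.php — run from the command line by cron, not through the web server.
set_time_limit(0);        // be explicit, although CLI PHP normally has no limit
ignore_user_abort(true);  // keep running even if whatever started it goes away

// ... the heavy data manipulation / database work goes here ...

// Example crontab entry to run it every night at 02:00:
//   0 2 * * * /usr/bin/php /var/www/scripts/heavy_job.php >> /var/log/heavy_job.log 2>&1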
Does my PHP script stop running?
Yes, PHP will cease execution if the browser disconnects (which it will on refresh/close). You can use ignore_user_abort(true) to prevent this, but I don't think that would be the best option for what you want.
If it keeps running, how can I force a response back to the page after it was refreshed?
No. Because the connection is gone, there is nothing to send a response back to. It would be more appropriate to invoke your long process as a background process when the AJAX call comes in and immediately return a response, leaving the background process to do the work. You can then use regular AJAX calls to ask whether the process is complete (for example, have it update a database row when it finishes).
From the docs:
The default behaviour is however for your script to be aborted when the remote client disconnects.
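A rough sketch of that background-plus-polling pattern, assuming a hypothetical heavy_job.php that marks a jobs row as finished when it is done (file names, table, and credentials are all made up):

// start.php — called by the admin page's AJAX request; launches the long job
// in the background and returns immediately.
$cmd = 'php ' . escapeshellarg(__DIR__ . '/heavy_job.php');
exec($cmd . ' > /dev/null 2>&1 &');   // discard output and background with &
header('Content-Type: application/json');
echo json_encode(['started' => true]);

// check.php — polled by the page every few seconds until the job reports done.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
$done = (bool) $pdo->query('SELECT finished FROM jobs WHERE id = 1')->fetchColumn();
header('Content-Type: application/json');
echo json_encode(['done' => $done]);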
Related
I have a page streaming MJPEGs. I used ffmpeg to generate the MJPEGs, and it uses enough CPU that I would like it to run only when someone is actively viewing the page. My thought was to start it with exec(); however, it keeps running when I leave the page, and it actually starts multiple instances if I then go back to the page.
Is there a way to kill a process when someone is no longer on a page? My only thought was to use AJAX to send a keep-alive signal to another program on the server, which would kill the process if the signal isn't received for more than 10 seconds, but it seems like there must be a less convoluted method for doing this.
There's no way to know from PHP when a user leaves the page unless you make another request from JavaScript, so your AJAX keep-alive approach is a good idea. You can also use the JavaScript onbeforeunload event to fire a request when the user unloads the page and terminate the process then, though that event is not guaranteed to run, so keep the keep-alive as a backstop.
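On the server side, the keep-alive idea can stay very small: a ping endpoint that touches a timestamp file and a watchdog that kills ffmpeg once the pings stop. A rough sketch (file path, threshold, and the pkill command are all assumptions):

// keepalive.php — pinged by the page via AJAX every few seconds while the stream is open.
touch('/tmp/mjpeg_keepalive');

// watchdog.php — started alongside ffmpeg; kills it when no ping arrives for > 10 seconds.
while (true) {
    clearstatcache();                      // filemtime() results are cached otherwise
    $last = @filemtime('/tmp/mjpeg_keepalive');
    if ($last === false || time() - $last > 10) {
        // pkill by name is blunt; keeping the PID from the exec() that started
        // ffmpeg and killing that specific process would be safer.
        exec('pkill -f ffmpeg');
        break;
    }
    sleep(2);
}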
I am developing a web application using PHP and MySQL along with AJAX. One of my scripts fetches data from a MySQL table. But what if I want to cancel the execution of the PHP script that I call to get the data while it is still running? To be clearer: if an AJAX call takes, say, 30 minutes to complete because of a heavy loop, I want to be able to exit from that call before completion by clicking a button. How can I achieve that? Otherwise my script runs well, except that the application hangs if I don't want to wait for the final AJAX response and try to switch to another page of the web application.
You can create a script like this:
$someStorage->set($sessionId . '_someFuncStop', true);
Call it through AJAX when the STOP button is pressed.
In the script with the loop, check that flag from storage:
while (true) {
    // ... do one unit of the heavy work ...
    if ($someStorage->get($sessionId . '_someFuncStop') === true) {
        break;
    }
}
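For example, the endpoint behind the STOP button only has to set that flag; a minimal sketch using APCu as the shared storage ($someStorage can just as well be Redis, Memcached, or a database row):

// stop.php — called via AJAX when the STOP button is pressed.
// Assumes the APCu extension; the key matches the one the loop checks.
session_start();
apcu_store(session_id() . '_someFuncStop', true);
echo 'stop flag set';

// Note: if the long-running script also calls session_start(), have it call
// session_write_close() early, otherwise this request blocks on the session lock.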
To the best of my knowledge, PHP doesn't support event listeners that can interrupt a running script from an external cause.
There are two paths you might want to consider (if you don't want to write shell scripts on the server that terminate the system processes executing the script):
1. The ignore_user_abort() function, though it is not 100% reliable:
http://php.net/manual/en/function.ignore-user-abort.php
2. Inside the loop you wish to terminate, make a database call (or read from a file) that checks some kind of flag, and if that flag is set, run a break/return/die inside the script. The button you mentioned can then write to the database/file and set the interrupt flag.
(In general this is not very useful for interrupting scripts, since most scripts run in tens of milliseconds and the flag could not be set fast enough to matter; for scripts running tens of minutes, however, it is a viable solution.)
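A rough sketch of path 2, assuming a hypothetical job_flags table with (job_id, stop) columns and placeholder credentials:

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
$rows = $pdo->query('SELECT * FROM big_table')->fetchAll();         // the heavy data set (made-up table)

foreach ($rows as $i => $row) {
    // ... process one row of the heavy work here ...

    if ($i % 100 === 0) {  // checking every single iteration would hammer the database
        if ($pdo->query('SELECT stop FROM job_flags WHERE job_id = 1')->fetchColumn()) {
            exit('Aborted by user.');
        }
    }
}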
I need to run a PHP script from another site, without using cron, so it will be called whenever anyone visits or refreshes the page.
The script will perform some kind of update to my database, and it may take several tens of seconds, so I need to run it in a way that does not slow down the visitor on the page from which it is called.
However, I do not want the script to actually start every time someone visits or refreshes the page; I would like to limit it to once per minute. Before calling the script, I would store the current time in the MySQL database when someone (whoever is first) arrives or refreshes the page. When someone else then loads the page, I would compare the current time with the time of the last script call stored in the database: if the difference is less than one minute, the script is not called; if it is more than one minute, the script runs and the database is updated with the new last-execution time.
I do not need any response from running the script.
Importantly, it must not affect page loading for the user on whose visit it is called.
Thanks for the help.
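A minimal sketch of that once-per-minute gate, assuming a one-row last_run table with a ran_at DATETIME column (table name and credentials are placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials

// The UPDATE only matches when the last run is more than a minute old, so two
// simultaneous visitors can never both trigger the update.
$won = $pdo->exec('UPDATE last_run SET ran_at = NOW() WHERE ran_at < NOW() - INTERVAL 1 MINUTE');
if ($won === 1) {
    // This visitor claimed the slot: run the database update here, ideally
    // launched in the background so the page load is not affected.
}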
You could do a jQuery AJAX call in the background after the page has loaded, so the user doesn't have to wait for the script to finish before the page loads:
http://api.jquery.com/jQuery.ajax/
However, I do not think the way you want to do it is the right approach. It's possible, but I'm not sure it's useful.
Can you split your script into different tasks? Then you could run them before loading the page, and users wouldn't notice any difference.
OK, if JavaScript is not an option, you can do a bit more research on PHP forking. It's basically PHP's version of a thread, but much more limited. You can fork a child PHP process to run in the background while the main process keeps doing the usual thing, so it won't affect your normal request handling.
http://php.net/manual/en/function.pcntl-fork.php
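A minimal sketch of the fork approach (note that the pcntl extension is normally only available in CLI PHP, not under most web SAPIs, so check your setup first):

$pid = pcntl_fork();
if ($pid === -1) {
    // Fork failed: fall back to doing the work inline.
    // ... run the database update here ...
} elseif ($pid === 0) {
    // Child process: do the slow update, then exit so it never touches the response.
    // ... run the database update here ...
    exit(0);
}
// Parent: continue immediately and finish the page for the visitor.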
I have PHP code that calls a SOAP request to a remote server. The remote server processes the SOAP request and I can then fetch the results.
Ideally I would like to make the SOAP request, wait 5 seconds, and then go and look for the results, the reason being that the remote server takes a couple of seconds to finish its processing. I have no control over the remote server.
At present I have this code:
$object = new Resource_Object();
$identifier = $_GET['id'];
$object->sendBatch($identifier);       // send the batch for the requested id
sleep(5);                              // give the remote server time to process
$results = $object->getBatchReport();
echo $results;
The problem with the above code is that sendBatch() takes a few seconds to complete. After adding sleep(5) the page takes 5 seconds longer to load, but the results are still not displayed. If I load the page again, or call getBatchReport() from another page, the results are there.
I guess this has something to do with the statelessness of HTTP causing the whole page to execute at once. I considered using output buffering, but I don't really understand what output buffering is for.
I was also considering using jQuery and AJAX to continuously poll getBatchReport(), but the problem is that I need to call this page from another location, and as sendBatch() grows, the 5-second delay might go up, probably to about 2 minutes. I don't think AJAX will work if I call this page remotely (this page is already being called in the background, spawned by
/dev/null 2>&1 &
).
I have no control over the remote server used by the sendBatch() routine, and as far as I know it doesn't offer any callbacks. I would prefer not to use cron because that would mean querying the remote server all the time.
Any ideas?
I was overly optimistic when I thought 5 seconds would do the job. Upon retesting I found that actually 15 seconds is a more realistic value. It's working now.
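If the processing time keeps creeping up, one alternative to a fixed sleep is to poll with a cap; a rough sketch, assuming getBatchReport() returns an empty result until the batch is ready (an assumption about that API):

$object = new Resource_Object();
$object->sendBatch($_GET['id']);

$results  = '';
$deadline = time() + 120;            // give up after two minutes
while (time() < $deadline) {
    sleep(5);                        // poll every five seconds
    $results = $object->getBatchReport();
    if ($results !== '' && $results !== null) {
        break;                       // report is ready
    }
}
echo $results;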
I have a PHP script that takes about 10 minutes to complete.
I want to give the user some feedback as to the completion percent and need some ideas on how to do so.
My idea is to call the PHP page with jQuery and the $.post command.
Is there a way to return information from the PHP script without ending the script?
For example, as I understand it now, if I return a value, the PHP script will stop running.
My idea is to split the script into multiple PHP files and have $.post run each one after the previous one returns.
But this still won't give an accurate estimate of the time left, because each script will take a different amount of time.
Any ideas on a way to do this?
Thanks!
You can echo and flush() output as you go, but that's a suboptimal and rather fragile solution.
For long operations it might be a good idea to launch the script in the background and store/update its status in a shared location.
e.g. you could launch the script using an fopen('http://… call, proc_open() a PHP CLI process, or even just open the long-running script in an <iframe>.
You could store the status in the database or in shared memory (using apc_store()).
This lets the user check the status of the script at any time (by refreshing the page or using AJAX), and the user won't lose track of the script if the browser's connection times out.
It also lets you avoid starting the same long script twice.
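A rough sketch of the shared-status idea, using a plain JSON file as the store (file name is made up; apc_store()/APCu or a database row works the same way):

// Inside the long-running script: write progress after each chunk of work.
$statusFile = '/tmp/job_status.json';   // made-up location
$totalSteps = 100;                      // assumes the work divides into known steps
for ($i = 1; $i <= $totalSteps; $i++) {
    // ... do one chunk of the real work here ...
    file_put_contents($statusFile, json_encode([
        'percent' => (int) round($i / $totalSteps * 100),
        'done'    => $i === $totalSteps,
    ]));
}

// status.php — polled by the page (e.g. with $.post) to update a progress bar.
header('Content-Type: application/json');
echo @file_get_contents('/tmp/job_status.json') ?: '{"percent":0,"done":false}';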