I have a bash script that can take hours to finish.
I have a web frontend for it to make it easy to use.
On the main page, I want a URL I can click that starts my PHP command:
<?php exec('myscript that takes a long time'); ?>
After the exec has finished, I want it to set a cookie:
setcookie('status', "done");
This all works as is, but the URL that runs my exec command loads a blank white page. I don't want that. I want the URL to act as a trigger that starts my PHP script and sets the cookie when the exec command returns, all in the background.
Is this possible?
If not, how close can I get to this behavior?
EDIT:
function foo() {
    var conn = new Ext.data.Connection();
    conn.request({
        url: 'request.php',
        method: 'POST',
        success: function(responseObject) {
            alert("Hello, World!");
        },
        failure: function() {
            alert("Something failed");
        }
    });
}
I have tried the above code with no luck.
I have a bash script that can take hours to finish
Stop there. That's your first problem: the WWW is not designed to maintain open requests for more than a couple of minutes. Maybe one day (since we now have websockets), but even if you know how to configure your webserver so this is not an off-switch for anyone passing by, it's exceedingly unlikely that the network in between or your browser will be willing to wait that long.
Running this job cannot be done synchronously with a web request; it must be run asynchronously.
By all means poll the status of the job from the web page (either via a meta-refresh or an AJAX call), but I'm having trouble understanding the benefit of setting a cookie when it has completed; usually for stuff like this I'd send out an email from the task when it completes. You also need a way either to separate out concurrent tasks invoked like this or to ensure that only one runs at a time.
One solution would be to pass the PHP session id as an argument to the script, then have the script write a file named with the session id on completion - or even provide partial updates via the file - so your web page can poll the status of the job using the session id. Of course, your code should check there isn't already an instance of the job running before starting a new one.
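A minimal sketch of that approach (the file names and paths are my assumptions, not from the original answer):

<?php
// start_job.php - kick off the long-running script, keyed by session id.
session_start();
$sid = session_id();
$statusFile = "/tmp/job_status_{$sid}";
// Don't start a second instance if one is already running for this session.
if (!file_exists($statusFile)) {
    file_put_contents($statusFile, "started");
    // The script receives the session id and should overwrite the status
    // file with progress updates, and finally with "done".
    exec('/path/to/myscript.sh ' . escapeshellarg($sid) . ' > /dev/null 2>&1 &');
}

<?php
// poll_job.php - the web page polls this via AJAX or meta-refresh.
session_start();
$statusFile = "/tmp/job_status_" . session_id();
echo file_exists($statusFile) ? file_get_contents($statusFile) : "not started";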
While the other answers are correct and the WWW is not meant for open requests, we have to consider that the WWW was never "meant" to take information either.
As programmers, we made it take information such as logins.
Furthermore, my question was simple: how do I perform action A with result B? While sending an email would be nice and dandy, as the other post by #symcbean suggests, it's not a solution but a sidestep of the problem.
Web applications often need to communicate with the webserver to update their status. This is a contradiction, because the webserver is stateless. Cookies are the solution.
Here was my solution:
$.ajax({
    url: url,
    data: data,
    success: success,
    dataType: dataType
});
and, in the PHP page the request hits:
setcookie('status', "done");
The url is a PHP function page with an if statement acting as a switch and running my external script that takes a really long time. This call is blocking; that is, setcookie will not execute until the script has finished.
Once it does, it will set my cookie and the rest of my web application can continue working.
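For completeness, a minimal sketch of what the request.php side might look like (the script path is an assumption):

<?php
// request.php - a sketch of the blocking approach described above.
// The script path is hypothetical.
exec('/path/to/myscript.sh');  // blocks until the long-running script returns
setcookie('status', 'done');   // runs only after exec() has returned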
Related
Requirement:
I need to run a background process (per a user request) that takes about 30 to 60 seconds to complete. I'd like to give the user some status feedback. Note: Toly is right, 'Background' is not required.
What's working:
The process prints about 20 status messages during this time, and I retrieve them with proc_open, listening on a read pipe using fgets. I can save those messages into a session var, and by using timestamps (to help debug) I can see that the session array is getting written with these messages as the process progresses.
The Trouble:
My plan was to poll the server with AJAX calls (every second) to retrieve these session vars for display in the DOM. The bottleneck seems to be that the server cannot service the AJAX request while it's still running the background process: everything dumps out at once when the background process completes. From what I can tell, the issue is not output buffering, because the (debugging) timestamps saved with each process message show the server writing to the session var sequentially - that's how I know the proc_open and pipe reads are working as I expect. The issue appears to be that the server cannot give the AJAX request its JSON object until it is done with the process, or, probably more accurately, done with the loop that is reading the pipe.
Obvious Misconception:
I thought sending a process to the background (using &) might give me a solution here. Apparently I do not know the difference between a background process and a forked process. What benefit is gained - if any - by running a process in the background when doing so appears to make no difference to me in this scenario?
Possible Solutions:
- I do not expect the user-initiated process that runs this scenario to be that heavy, but if there's something I can build into this solution that would help under a heavy load, I would like to do that now.
- Is this a multi-threading (pthreads) or a multi-process (fork) solution?
- Or, should I save a process id, let go of polling it with a while (.. fgets ..) statement, and then come back to the process after the server has serviced the AJAX request?
- I suppose I could run fake status messages and then respond accurately when the results come back after completion. The time to process the request is not dependent upon the user, so my fake timing could be pretty accurate. However, I would like to know what the solution would be to provide real-time feedback.
After googling for a day for a technique to get the behavior you are describing here, I came up with an easy solution for my project.
A bit of important theory:
- session_start() and a set like $_SESSION["x"] = "y" will always lock the session file.
Case scenario:
- A - process.php - running through an ajax call
- B - get_session.php - a second ajax call;
The main problem is/was that even if you set a $_SESSION value inside a process being run through an AJAX call, it will always have to wait for the session file to get unlocked, and the result is a sync between the two processes (A and B) - both finishing at the same time!
So, the easiest way to fix this and get a good result is to call session_write_close() after each set. E.g.:
$_SESSION["A"] = "B";
$_SESSION["x"] = "y";
session_write_close();
PS: The best approach is to have a custom set of functions to handle the sessions.
Sorry for the mark-up; I just created a Stack account.
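For instance, such helpers might look like this (a sketch; the function names are my own, not from the answer):

<?php
// Hypothetical helpers for the "custom set of functions" idea above.
function session_set($key, $value) {
    session_start();          // reacquire the session (and its file lock)
    $_SESSION[$key] = $value;
    session_write_close();    // release the lock immediately
}

function session_get($key) {
    session_start();
    $value = isset($_SESSION[$key]) ? $_SESSION[$key] : null;
    session_write_close();
    return $value;
}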
Why would you think that you need a background process? Where did you get the idea that you needed one?
A normal PHP script with a sufficient timeout set, calling flush() at every step of the way, will give you the output you need for your AJAX.
What's even easier, since you use sessions: make the AJAX request to a separate handler that just checks what's in the session and, if there is something new, returns the new part.
$_SESSION['progress'] = array();
inside process.php
$_SESSION['progress'][] = 'Done 5%';
// complete some commands
$_SESSION['progress'][] = 'Done 10%';
inside ajax.php
if (count($_SESSION['progress']) > $_GET['laststep']) {
    // echo the new messages
}
inside your normal page
$.ajax({ url: 'ajax.php', data: { laststep: 1 }, success: function(data) { show(data); } });
Something like that should work.
I have a cron job running a PHP script, but there's some HTML and JavaScript that I need to execute for the actual script to work.
Converting the JavaScript to PHP isn't an option.
Basically, I need it to act as though a person is viewing the page every time the cron job runs.
EDIT:
The script uses JavaScript from a different site to encrypt some passwords so it is able to log into my account on that site, and the JavaScript is thousands of lines. The way the script flows is: send data to the website > get the data it sends back > use the site's JavaScript to alter the data > set an HTML form value to the value returned by the JavaScript function > submit the HTML form to get the info back to PHP > send the data to log me in. I know the code is very shoddy, but it's the only way I could think to do it without having to rewrite all the JavaScript they use to encrypt the password in PHP.
You can try Node.js to run JavaScript code on the server.
Install your favorite web browser, and then have the cron job run the browser with the URL as an argument.
Something like:
/usr/bin/firefox www.example.com/foo.html
You'll probably want to wait a minute or so and then kill the process, or determine a better way to find out when it finishes.
Cron jobs always run on the server side only; when there is no client side, how can you really expect JavaScript to work?
Anyway, the solution is: use the cron job to run another PHP script, which in turn calls the PHP script you want to run using cURL.
E.g. file1.php is the file you want to execute, expecting the JavaScript on that page to work.
file2.php is another file you create; in this file, use cURL to call file1.php (make sure you provide the full http:// path as you would type it in a browser - you can pass values the way GET/POST methods on HTML forms do as well). In your cron job, call file2.php.
Make sure cURL is available and that no firewall rule is blocking HTTP calls (i.e. port 80 calls to the same server). On most servers both conditions are fulfilled.
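A minimal sketch of what file2.php could look like (the URL is an example; note that cURL fetches the page but does not itself execute the JavaScript on it):

<?php
// file2.php - called by cron; fetches file1.php over HTTP.
$ch = curl_init('http://www.example.com/file1.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture output instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 300);         // give the page time to finish
$result = curl_exec($ch);
if ($result === false) {
    error_log('file1.php call failed: ' . curl_error($ch));
}
curl_close($ch);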
---------- Sorry guys, Kristian Antonsen is right, so don't consider this a full answer at the moment. However, I am leaving it up as someone might find food for thought in it. -----
I am working on an emailer system that I have to plug into a CMS for a company. I am looking for a way to make the email sender script run in the background while the admin may navigate away or even close the browser.
I found the PHP function ignore_user_abort(), which is required to keep the page running even if the connection has timed out.
Now I have three solutions to "start the script":
I could use an iframe
I could use an Ajax call which I have previously configured to time out very early, e.g. using the jQuery framework: $.ajaxSetup({ timeout: 1000 });
I could use a cron job but I would like to avoid this solution since they are "virtual" and are unstable on that server.
Are there any other solutions? I don't really like the iframe option, but I already use iframes with an AJAX uploader script.
I don't want the admin to hit F5 and start a second instance of it.
The company's users have been told to only log into the CMS using Firefox.
Thank you
You can run a PHP script in the background using exec().
The PHP docs provide an example of how you can do that:
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        // On Windows, "start /B" launches the command without a console window,
        // and pclose(popen(...)) returns without waiting for it to finish.
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        // On Unix-like systems, redirecting output and appending "&"
        // lets exec() return immediately instead of blocking.
        exec($cmd . " > /dev/null &");
    }
}
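Usage is then a one-liner (the script path is an example):

execInBackground('php /path/to/send_emails.php');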
If I were you, I would write a background daemon to do this. I would save all my emails to a database in the original HTTP call and have a script constantly running in the background that checks every, say, 30 seconds to see if there are any pending emails and then sends them.
This would also allow you to check for duplicates when you add emails to the queue, so you needn't worry about multiple additions.
Edit: Given your reason for not using cron jobs (i.e. you don't have shell access to the server), I wouldn't use this approach.
Instead I would have a first AJAX call where the PHP inserted all the emails into a database, and returned a key to the Javascript. I would then have a Javascript call, using that key, where PHP processed emails for 5 seconds, then returned the number of emails remaining to the browser. I would then have Javascript keep repeating this request until there were no emails remaining.
This would have the additional benefit that you could have progress notifications and, if the server crashed for whatever reason, your emails would be recoverable.
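A rough sketch of the 5-second processing endpoint described above (the table, columns, and key parameter are my assumptions):

<?php
// send_batch.php - hypothetical sketch of the batched approach.
// Assumes an `emails` table with `batch_key` and `sent` columns.
$key = $_POST['key'];  // the key returned by the first AJAX call
$pdo = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');

$select = $pdo->prepare('SELECT id, recipient, subject, body FROM emails WHERE batch_key = ? AND sent = 0');
$update = $pdo->prepare('UPDATE emails SET sent = 1 WHERE id = ?');
$select->execute(array($key));

$start = time();
while (($row = $select->fetch(PDO::FETCH_ASSOC)) && time() - $start < 5) {
    mail($row['recipient'], $row['subject'], $row['body']);
    $update->execute(array($row['id']));
}

// Report how many remain; the Javascript repeats the request until this is 0.
$count = $pdo->prepare('SELECT COUNT(*) FROM emails WHERE batch_key = ? AND sent = 0');
$count->execute(array($key));
echo $count->fetchColumn();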
A script started by PHP should not stop executing just because you close the browser. So, as I see it, you have a couple of ways to do this:
- Send a POST to the server telling it you want some action performed, and let the script execute until it finishes. Reloading the page will make the browser warn you that it needs to resend data, etc. You could do the POST asynchronously with JavaScript to prevent this and to be able to do other things while the script is executing. One thing to watch out for here is that the script execution time might be limited.
- Send a POST to a script that in turn starts a background system process doing the same thing. But now you will not be able to find out the progress unless you save the state somewhere, such as in a database.
- Cron (you already seem to have explored this option).
I would like to run a PHP script from another PHP script so that the parent comes back with a success when it initiates the child script.
The parent will be initiated from the browser where I am using sessions. Will the child script be able to make use of the same session and session variables if run via exec?
Thanks all for any help
Scenario
I am firing off the parent via AJAX. After I do this, I want the parent to run the child and come back. The child script will take a while to complete. The parent will return a success to indicate it has run the child. The user will then be redirected to a different page.
Like the others said, it will not work.
But you can call ignore_user_abort(true);
This will keep your script running even if the user closes their browser.
So: call ignore_user_abort, include the massive child script within the parent so you get the session vars and everything, and you should be fine.
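In other words, something like this (the file names are examples):

<?php
// parent.php - a sketch; keeps running even if the user closes the browser.
ignore_user_abort(true);
set_time_limit(0);    // also disable the execution time limit
session_start();      // the included child code sees the same session vars
include 'child.php';  // the long-running work, included rather than exec()ed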
If you want to run a PHP script from a PHP script, why not just do:
require 'child.php';
?
If you need to do something in the background, use AJAX to fire off the request.
Edit: there is no reason an AJAX request couldn't be long-running but you're getting outside the realm of things that PHP was really designed for. But anyway, fire off an AJAX request. If it takes 20 minutes to come back, that's no drama.
Alternatively you can fire off an AJAX request every 15 seconds (pick a number) to check on the status of what you've started.
For truly long-running tasks you're probably going to have to take a "fire and forget" approach. Start it off and return immediately. But it won't have the session information. You'll need to store that.
I'd suggest having some kind of persistence mechanism like a Jobs table:
Job: id, started_by, status (not started, running, complete), started_when, completed_when.
and rather than firing off such jobs via exec(), have a cron job that looks for jobs that need to be started and starts them. This will be less fragile than a webserver-triggered approach.
You'll also have the status reporting you need to be able to ask if a job is finished yet.
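A sketch of such a cron runner, following the Job schema above (the connection details and the work itself are placeholders):

<?php
// run_jobs.php - run every minute from cron; picks up pending jobs.
// Assumes a `jobs` table: id, started_by, status, started_when, completed_when.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query("SELECT id FROM jobs WHERE status = 'not started'") as $job) {
    $pdo->prepare("UPDATE jobs SET status = 'running', started_when = NOW() WHERE id = ?")
        ->execute(array($job['id']));
    // ... do the actual long-running work for this job here ...
    $pdo->prepare("UPDATE jobs SET status = 'complete', completed_when = NOW() WHERE id = ?")
        ->execute(array($job['id']));
}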
The answer is no: a child script run via exec() will not automatically share the parent's session or session variables.
I have done some google search on this topic and couldn't find the answer to my question.
What I want to achieve is the following:
- the client makes an asynchronous call to a function on the server
- the server runs that function in the background (because it is time-consuming), without the client hanging in the meantime
- the client repeatedly calls the server to request the status of the background job
Can you please give me some advice on resolving my issue?
Thank you very much! ^-^
You are not specifying what language the asynchronous call is in, but I'm assuming PHP on both ends.
I think the most elegant way would be this:
1. The HTML page loads and defines a random key for the operation (e.g. using rand() or an already available session ID - be careful, though, that the same user could be starting two operations).
2. The HTML page makes an Ajax call to start_process.php.
3. start_process.php uses exec() to start long_process.php in the background; see the User Contributed Notes on exec() for suggestions on how to start a process in the background. Which one is right for you depends mainly on your OS.
4. long_process.php frequently writes its status into a status file, named after the random key that your Ajax page generated.
5. The HTML page makes frequent calls to show_status.php, which reads out the status file and returns the progress.
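A minimal sketch of steps 3 and 5, assuming the status file lives in /tmp (the paths and parameter names are my own):

<?php
// start_process.php - step 3: launch the worker in the background.
// Assumes long_process.php takes the random key as its first argument.
$key = basename($_GET['key']);  // basename() guards against path traversal
exec('php /path/to/long_process.php ' . escapeshellarg($key) . ' > /dev/null 2>&1 &');
echo 'started';

<?php
// show_status.php - step 5: report the progress long_process.php wrote.
$key = basename($_GET['key']);
$file = "/tmp/status_{$key}";
echo file_exists($file) ? file_get_contents($file) : 'pending';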
Have a google for long-running PHP processes (be warned that there's a lot of bad advice out there on the topic - including the note referred to by Pekka - this will work on Microsoft but will fail in unpredictable ways on anything else).
You could develop a service which responds to requests over a socket (your client would use fsockopen to connect) - one simple way of achieving this would be to use Aleksey Zapparov's Socket server (http://www.phpclasses.org/browse/package/5758.html), which handles requests coming in via a socket; however, since this runs as a single thread, it may not be very appropriate for something which requires a lot of processing. Alternatively, if you are using a non-Microsoft system, you could hang your script off [x]inetd; however, you'll need to do some clever stuff to prevent it terminating when the client disconnects.
To keep the thing running after your client disconnects, the PHP code must be run by the standalone PHP executable (not via the webserver). Spawn a process in a new process group (see posix_setsid() and pcntl_fork()). To enable the client to come back and check on progress, the easiest way is to configure the server to write out its status to somewhere the client can read.
C.
1. An Ajax call runs the method longRunningMethod() and gets back an identifier (e.g. an id)
2. The server runs the method and sets a key in e.g. shared memory
3. The client calls checkTask(id)
4. The server looks up the key in shared memory and checks for a ready status
[repeat 3 & 4 until 5 is finished]
5. longRunningMethod finishes and sets its state to finished in shared memory.
All Ajax calls are by definition asynchronous.
You could (although it's not a strictly necessary step) use AJAX to instantiate the call, and the script could then create a reference to the status of the background job in shared memory (or even a temporary entry in an SQL table, or a temp file), in the form of a unique job id.
The script could then kick off your background process and immediately return the job ID to the client.
The client could then call the server repeatedly (via another AJAX interface, for example) to query the status of the job, e.g. "in progress", "complete".
If the background process to be executed is itself written in PHP (e.g. a command line PHP script) then you could pass the job id to it and it could provide meaningful progress updates back to the client (by writing to the same shared memory area, or database table).
If the process to be executed is not itself written in PHP, then I suggest wrapping it in a command-line PHP script, so that it can monitor when the wrapped process has finished running (and check the output to see if it was successful) and update the status entry for that task appropriately.
Note: Using shared memory for this is best practice, but it may not be available if you are using shared hosting, for example. Don't forget you want to have a means to clean up old status entries, so I would store "started_on"/"completed_on" timestamp values for each one, and have it delete entries for stale data (e.g. entries with a completed_on timestamp more than X minutes old - and, ideally, also check for jobs that started some time ago but were never marked as completed, and raise an alert about them).
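For the cleanup, a hypothetical sketch (the table name, columns, and thresholds are all assumptions):

<?php
// cleanup_jobs.php - run periodically; removes stale status entries
// and alerts on jobs that never completed. The schema is hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
// Delete entries completed more than 30 minutes ago.
$pdo->exec("DELETE FROM job_status WHERE completed_on < NOW() - INTERVAL 30 MINUTE");
// Raise an alert for jobs started over an hour ago but never marked complete.
$stuck = $pdo->query("SELECT id FROM job_status WHERE completed_on IS NULL AND started_on < NOW() - INTERVAL 1 HOUR");
foreach ($stuck as $job) {
    error_log("Job {$job['id']} appears to be stuck");
}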