PHP request "in background"

Is there a way to make PHP request where user doesn't have to wait for response? Some sort of "php request in background"?
For example, if application needs to send 100 emails because the user had submitted something, I don't want to show "sending... please wait" for this user, but I want some other script to do the job independent from that user...

Options:
Stick in a db (or a file) and use a cron to poll it (easiest as you probably already use a db).
Use something like RabbitMQ or ØMQ (my favourite)
Spawn a separate process to do it using fork/exec (would not recommend that).
As others have suggested - fake it by using an Ajax request. Viable - but I find it ugly.
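For the first option, here is a minimal sketch of a db-backed queue using PDO (the table and column names are assumptions for illustration):

```php
<?php
// Web request side: just queue the email and return immediately.
function queueEmail(PDO $db, string $to, string $subject, string $body): void {
    $stmt = $db->prepare(
        "INSERT INTO email_queue (recipient, subject, body, status)
         VALUES (?, ?, ?, 'pending')"
    );
    $stmt->execute([$to, $subject, $body]);
}

// Cron side: a worker script runs this every minute or so.
function processQueue(PDO $db): int {
    $sent = 0;
    $rows = $db->query(
        "SELECT id, recipient, subject, body FROM email_queue WHERE status = 'pending'"
    )->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        // mail($row['recipient'], $row['subject'], $row['body']); // real send goes here
        $db->prepare("UPDATE email_queue SET status = 'sent' WHERE id = ?")
           ->execute([$row['id']]);
        $sent++;
    }
    return $sent;
}
```

The user's request finishes as soon as the INSERT succeeds; the slow sending happens later in the cron worker.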

You could maybe send the request with Ajax; that way the UI won't freeze and the task will still be executed on the server.

You could send the request via ajax and then redirect the user elsewhere upon success. The server script will still process, but no confirmation will be given to the user.

exec('php script.php > /dev/null & echo $!', $o);
You may want to use php-cli instead of php as well.
The command above returns the process id of the background script in $o[0], so you can set something up to poll it with ajax or something if you want to show the user its progress.
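If you keep that pid around, a small helper (my own sketch, not part of the answer's code) can tell you whether the background script is still running:

```php
<?php
// Check whether a process id is still alive (POSIX systems).
function isRunning(int $pid): bool {
    if (function_exists('posix_kill')) {
        return posix_kill($pid, 0);      // signal 0 only tests for existence
    }
    return file_exists("/proc/$pid");    // Linux fallback without the posix extension
}
```

Your ajax polling endpoint could call this with the pid saved from $o[0] and report "still working" or "done".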

Create a php routine to do the actual work of sending the emails and call it via http GET using the technique described here.

Related

Javascript getting progress of PHP-script

I need to provide interaction between js on html-page and php-script.
- Use AJAX.
OK. But the problem is that the php script executes for a long time, and I need to know the state of the processing (e.g. 60% complete).
What should I do? Create two php scripts (client & server) and make an ajax request to client.php, which will in turn query server.php via sockets or something?
Are there more elegant solutions?
What if you had the script doing the processing write its status to a file once in a while? Make a second script that reads the file and returns the status of the original one.
You should never have a long-running process being executed entirely within an HTTP session.
A simple and common approach to this problem is message queuing. Basically, you have your UI queue up the request into a database table and then have external daemon(s) process the queue.
To provide feedback, have the daemon periodically update the table with the status for the row it's currently working on. Then, your javascript code can make AJAX requests to a script that retrieves the status for that work item from the database and displays it to the user.
See: Dealing with long server-side operations using ajax?
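A sketch of the status side of that approach (table and column names are assumptions): the daemon updates the row as it works, and a small script returns it to the polling JavaScript.

```php
<?php
// Called by the AJAX poll, e.g. status.php?id=42
function getJobStatus(PDO $db, int $jobId): array {
    $stmt = $db->prepare("SELECT status, progress FROM job_queue WHERE id = ?");
    $stmt->execute([$jobId]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row ?: ['status' => 'unknown', 'progress' => 0];
}

// In status.php itself, something like:
// echo json_encode(getJobStatus($db, (int)$_GET['id']));
```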
Ajax calls the php script and returns the information that the script is running.
The main script creates lock.file.
A script called from cron checks whether lock.file exists and runs the correct script.
The correct script saves its current progress into progress.txt.
Ajax reads progress.txt, and when progress reaches 100% it reports that the script's processing is finished.
edited: Thanks to Justin for pointing out the timeout problem ;)
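A rough sketch of that flow (the file names match the answer above; the generator-based job is my own assumption for illustration):

```php
<?php
// Cron side: run the job only if lock.file exists; write progress as we go.
function runIfLocked(string $lockFile, string $progressFile, callable $job): bool {
    if (!file_exists($lockFile)) {
        return false;                              // nothing queued
    }
    foreach ($job() as $percent) {                 // $job yields progress values
        file_put_contents($progressFile, (string)$percent);
    }
    unlink($lockFile);                             // finished; allow the next run
    return true;
}

// Ajax side (what progress.php would echo):
function readProgress(string $progressFile): int {
    return file_exists($progressFile) ? (int)file_get_contents($progressFile) : 0;
}
```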
If you want to be really fancy, write output from the php script to stdout and capture it via a pipe. This would require running the php script using exec() or proc_open() (http://php.net/manual/en/function.proc-open.php) and piping the output to a file (or, if you want to be extra-extra fancy, use Node.js to listen for that data).
There are quite a few ways to accomplish this:
Node.JS
An Ajax query every x seconds
A META/Javascript page reload
An iframe that is routinely reloading with the status in it.
Good luck!
You could use PHP's output buffering (see ob_flush) to flush the contents at certain points in your script and tailor your JavaScript so that it uses the flushed contents. I believe readyState in your AJAX call won't be set to 4 on flushes, so that's where you'll have to handle it yourself (see this article). I think it's a much nicer way than writing to a file and checking the contents of that.
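A minimal sketch of that flushing (whether the browser actually receives the chunks early also depends on web-server and proxy buffering, which this snippet cannot control):

```php
<?php
// Emit a progress value and push it toward the client immediately.
function emitProgress(int $percent): void {
    echo $percent . "\n";
    if (ob_get_level() > 0) {
        ob_flush();   // empty PHP's output buffer down to the next level
    }
    flush();          // ask the web server to send what it has so far
}
```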
on your process.php:
// reopen the session around each update so the session file's lock is not
// held for the whole run (a held lock would make progress.php block)
function setProgress($percent) {
    session_start();
    $_SESSION['progress'] = $percent;
    session_write_close();
}
// 1st task
setProgress(0);
// your code for the first task here ...
// 2nd task
setProgress(10);
// your code for the 2nd task ...
// 3rd task
setProgress(17);
// continue ...
// everything finished?
setProgress(100);
on your progress.php:
session_start();
// simply output
echo isset($_SESSION['progress']) ? $_SESSION['progress'] : 0;
now from your client-side, just make a request to your progress.php, receive the number and give it to your progress bar ...
didn't check that by myself, but hope that it works! :)

php cron job execute javascript as well

I have a cron job running a php script, but there's some html and javascript that I need to execute for the actual script to work.
Converting the javascript to php isn't an option.
Basically, I need it to act as though a person is viewing the page every time the cron job runs.
EDIT:
The script uses javascript from a different site to encrypt some passwords so it is able to log into my account on that site, and the javascript is thousands of lines long. The way the script flows is: send data to the website > get the data it sends back > use the site's javascript to alter the data > set an html form value to the value returned by the javascript function > submit the html form to get the info back to php > send the data to log me in. I know the code is very shoddy, but it's the only way I could think of to do it without having to rewrite, in php, all the javascript they use to encrypt the password.
You can try Node.js to run JavaScript code on the server.
Install your favorite web browser, and then have the cron job run the browser with the url as an argument.
Something like:
/usr/bin/firefox www.example.com/foo.html
You'll probably want to wait a minute or so and then kill the process, or find a better way to determine when it finishes.
Cron jobs always run on the server side only. When there is no client side, how can you expect javascript to work?
Anyway, a solution is: use the cron job to run another php script, which in turn calls the php script you want to run using cURL.
e.g. file1.php - the file you want to execute, where you expect the javascript on that page to work.
file2.php - another file you create... in this file use curl to call file1.php (make sure you provide the full http:// path as you would type it in a browser - you can also pass values the way get/post methods on html forms do). In your cron job, call file2.php.
Make sure curl is available and that no firewall rule is blocking http calls (i.e. port 80 calls to the same server). On most servers both conditions are fulfilled.
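file2.php could be as small as this sketch (the URL is an assumption; note that this only fetches the page's HTML - curl will not execute the javascript in it):

```php
<?php
// file2.php - fetch file1.php over HTTP from the cron job.
function fetchPage(string $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;                                    // string, or false on failure
}

// fetchPage('http://www.example.com/file1.php?foo=bar');
```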
---------- sorry guys - Kristian Antonsen is right - so don't consider this a full answer at the moment. However, I am leaving this up, as someone might get food for thought from it -----

Execute PHP script in background... what is the best solutions for my case?

I am working on an emailer system that I have to plug into a CMS for a company. I am looking for a way to make the email sender script run in the background while the admin navigates or even closes the browser.
I found the PHP function named ignore_user_abort(), which is required to keep the page running even if it has timed out.
Now I have three solutions to "start the script" :
I could use an iframe
I could use an Ajax call which I have previously configured to timeout very early. Eg: using the jQuery framework : $.ajaxSetup({ timeout: 1000 });
I could use a cron job but I would like to avoid this solution since they are "virtual" and are unstable on that server.
Are there any other solutions? I don't really like the iframe solution, but I already use iframes with an Ajax uploader script.
I don't want the admin to hit F5 and start a second instance of it.
The company's users have been told to only log in the CMS using Firefox.
Thank you
You can run a PHP script in the background using exec().
The php docs provide an example of how you can do that:
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        exec($cmd . " > /dev/null &");
    }
}
If I were you, I would write a background daemon to do this. I would save all my emails to a database in the original HTTP call and have a script constantly running in the background that checks every, say, 30 seconds to see if there are any pending emails and then sends them.
This would also allow you to check for duplicates when you add emails to the queue, so you needn't worry about multiple additions.
Edit Given your reason for not using cron jobs (i.e. you don't have shell access to the server) I wouldn't use this approach.
Instead I would have a first AJAX call where the PHP inserted all the emails into a database, and returned a key to the Javascript. I would then have a Javascript call, using that key, where PHP processed emails for 5 seconds, then returned the number of emails remaining to the browser. I would then have Javascript keep repeating this request until there were no emails remaining.
This would have the additional benefit that you could have progress notifications and, if the server crashed for whatever reason, your emails would be recoverable.
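A sketch of the per-call processing step described above (table and column names are assumptions): each AJAX request sends one batch and returns how many emails are left, and the JavaScript repeats the request until that count reaches zero.

```php
<?php
// Send up to $batchSize queued emails for $jobId, return the remaining count.
function processBatch(PDO $db, int $jobId, int $batchSize = 25): int {
    $stmt = $db->prepare(
        "SELECT id, recipient FROM emails WHERE job_id = ? AND sent = 0 LIMIT ?"
    );
    $stmt->bindValue(1, $jobId, PDO::PARAM_INT);
    $stmt->bindValue(2, $batchSize, PDO::PARAM_INT);
    $stmt->execute();
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        // mail($row['recipient'], ...); // real send goes here
        $db->prepare("UPDATE emails SET sent = 1 WHERE id = ?")->execute([$row['id']]);
    }
    $left = $db->prepare("SELECT COUNT(*) FROM emails WHERE job_id = ? AND sent = 0");
    $left->execute([$jobId]);
    return (int)$left->fetchColumn();   // the javascript repeats until this is 0
}
```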
Just because you close the browser, the script started by php should not stop executing. So as I see it, you have a couple of ways to do this:
Send a POST to the server telling it you want to do some action, and the script executes until it finishes. Reloading the page will warn the browser about that you will need to resend data etc... You could do the POST asynchronously with javascript to prevent this and to be able to do other things while the script is executing. One thing to watch out for here is that the script execution time might be limited.
Send a POST to a script that in turn starts a background system process doing the same thing. But now you will not be able to find out the progress unless you save the state somewhere, in a database or something
Cron (you already seem to have explored this option)

How to create a http response for a request associated with a different PHP session?

Is it possible to create a http response for requests associated with a different PHP session? If so, how to do that?
I'm creating a script language to make it easier for PHP developers to handle phone interactions. My application receives phone calls and then activates the user scripts associated with those calls.
Scripts are processed in real time. Since I don't know the number of commands in the scripts, I have to create and send individual REST responses for each command until somebody issues a hangup.
Is there any way to do that without having to stop the current function, send the response, and then resume the script the next time the phone server sends me a request?
Ideally, I would love to remain in the current PHP function sending responses for each http request without having to stop at each time... would curl -- or anything else -- help me with that?
Thanks in advance,
Leo
I hope I got right what you're trying to do.
Well, if you have the desired other session's id, just use the session_id() function to set which session is actually used, for example:
#Assuming a url like www.example.com/?session_id=123456
session_start();                 # resume the current session first
$current_session_id = session_id();
$desired_session_id = $_GET['session_id'];
session_write_close();           # release the current session before switching
session_id($desired_session_id); # that's where we switch to the desired session
session_start();                 # load its data
var_dump($_SESSION);             # will dump the session with id 123456
session_write_close();
session_id($current_session_id); # switching back to the previous session id
session_start();
This should work; I believe I used something like this in the past, but double-check me on that.
You can use multi-curl to send multiple requests, but it seems like you need to send the commands in order. Another option would be to use forking, which creates a copy of the current process. You can check whether you are the "parent" or "child" process; the "parent" process then monitors the "child" processes. I've done this with up to 50 child processes running at once off of 1 parent process.
http://php.net/manual/en/function.pcntl-fork.php
If you just want to send the command and don't care about the output, just use the exec function. In the example below, PHP won't wait for a response or for the script to complete.
exec("php send_command.php > /dev/null &");

On form submit background function php running

I am looking for a way to start a function on form submit that would not leave the browser window waiting for the result.
Example:
User fills in the form and presses submit; the data from the form goes via javascript to the database, and a php function that will take several seconds starts. I don't want the user to be left waiting for the end of that function; I would like to be able to take him to another page and leave the function doing its thing server side.
Any thoughts?
Thanks
Thanks for all the replies...
I got the ajax part. But I cannot call ajax and have the browser move to another page.
This is what I wanted.
-User fills form and submits
-Result from the form passed to database
-long annoying process function starts
-user carries on visiting the rest of the site, independent of the status on the "long annoying process function"
By the way and before someone suggests it. No, it cannot be done by cron job
Use AJAX to call the php script, and at the top of the script turn on ignore_user_abort:
ignore_user_abort(true);
That way, if they navigate away from the page, the script will continue running in the background. You can also use
set_time_limit(0);
to remove the execution time limit, useful if you know your script will take a while to complete.
The most common method is:
exec("$COMMAND > /dev/null 2>&1 &");
Ah, ok, well you're essentially asking: does PHP support threading? The general answer is no... however...
there are some tricks you can perform to mimic this behaviour, one of which is highlighted above and involves forking out to a separate process on the server. This can be achieved in a number of ways, including the
exec()
method. You also may want to look here:
PHP threading
I have also seen people try to force a flush of the output buffer halfway through the script, attempting to force the response back to the client. I don't know how successful this approach is, but maybe someone else will have some information on that one.
This is exactly what AJAX (shorthand for Asynchronous JavaScript + XML) is for:
AJAX Information
It allows you to write client-side code that sends asynchronous requests to your server, so the user's browser is not interrupted by an entire page request.
There is a lot of information relating to AJAX out there on the web, so take a deep breath and get googling!
Sounds like you want to use some of the features AJAX (Asynchronous Javascript and XML - google) have to offer.
Basically, you would have a page with content. When a user clicks a button, javascript would be used to POST data to the server and begin processing. Simultaneously, that javascript might load a page from the server and then display it (eg, load data, and then replace the contents of a DIV with that new page.)
This kind of thing is the premise behind AJAX, which you see everywhere when you have a web page doing multiple things simultaneously.
Worth noting: This doesn't mean that the script is running "in the background on the server." Your web browser is still maintaining a connection with the web server - which means that the code is running in the "background" on the client's side. And by "background" we really mean "processing the HTTP request in parallel with other HTTP requests to give the feel of a 'background' running process"
