Problem
I have a form that, when submitted, runs basic code to process the submitted information and insert it into a database for display on a notification website. In addition, I have a list of people who have signed up to receive these notifications via email and SMS message. This list is trivial at the moment (only about 150 subscribers), but it's enough that it takes upwards of a minute to cycle through the entire table and send out 150+ emails. (The emails are sent individually, as requested by the system administrators of our email server because of mass email policies.)
During this time, the individual who posted the alert will sit on the last page of the form for almost a minute without any positive reinforcement that their notification is being posted. This leads to other potential problems, all of which have possible solutions that I feel are less than ideal.
First, the poster might think the server is lagging and click the 'Submit' button again, causing the script to start over or run twice. I could solve this by using JavaScript to disable the button and replace its text with something like 'Processing...'; however, this is less than ideal because the user would still be stuck on the page for the length of the script execution. (Also, if JavaScript is disabled, this problem persists.)
Second, the poster might close the tab or the browser prematurely after submitting the form. The script will keep running on the server until it tries to write back to the browser, but if the user then browses to any page within our domain while the script is still running, the browser hangs loading the page until the script has ended. (This only happens when a tab or window of the browser is closed, not the entire browser application.) Still, this is less than ideal.
(Possible) Solution
I've decided I want to break out the "email" part of the script into a separate file I can call after the notification has been posted. I originally thought of putting this on the confirmation page after the notification has been successfully posted. However, the user will not know this script is running and any anomalies will not be apparent to them; this script cannot fail.
But, what if I can run this script as a background process? So, my question is this: How can I execute a PHP script to trigger as a background service and run completely independent of what the user has done at the form level?
EDIT: This cannot be cron'ed. It must run the instant the form is submitted. These are high-priority notifications. In addition, the system administrators running our servers disallow crons from running any more frequently than 5 minutes.
Doing some experimentation with exec and shell_exec, I have uncovered a solution that works perfectly! I chose shell_exec so I can log every notification process that happens (or doesn't). (shell_exec returns the output as a string, which was easier than using exec, assigning the output to a variable, and then opening a file to write to.)
I'm using the following line to invoke the email script:
shell_exec("/path/to/php /path/to/send_notifications.php '".$post_id."' 'alert' >> /path/to/alert_log/paging.log &");
It is important to notice the & at the end of the command (as pointed out by @netcoder). In UNIX, this runs the command as a background process.
The extra values surrounded in single quotes after the path to the script are passed in as $_SERVER['argv'] variables that I can read within my script.
The email script then appends to my log file via the >> redirection, producing output like this:
[2011-01-07 11:01:26] Alert Notifications Sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 38.71 seconds)
[2011-01-07 11:01:34] CRITICAL ERROR: Alert Notifications NOT sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 23.12 seconds)
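For reference, a minimal sketch of what the receiving end might look like; the real send_notifications.php isn't shown here, so its structure below is an assumption:
<?php
// send_notifications.php -- invoked as: php send_notifications.php '<post_id>' 'alert'
$post_id = $argv[1]; // first quoted value from the command line
$type    = $argv[2]; // second quoted value ('alert')
$start   = microtime(true);
// ... look up the subscribers and send each email individually here ...
printf("[%s] Alert Notifications Sent for %s (SCRIPT: %.2f seconds)\n",
    date('Y-m-d H:i:s'), $post_id, microtime(true) - $start);
Because the caller redirects stdout with >>, the printf line lands in paging.log in the format shown above.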
On Linux/Unix servers, you can execute a job in the background by using proc_open:
$descriptorspec = array(
array('pipe', 'r'), // stdin
array('file', 'myfile.txt', 'a'), // stdout
array('pipe', 'w'), // stderr
);
$proc = proc_open('php email_script.php &', $descriptorspec, $pipes);
The & being the important bit here. The script will continue even if the original script has ended.
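A hedged cleanup note: because the shell backgrounds the command, proc_close() should return almost immediately while the php process keeps running; you may also want to close the pipe handles you were given:
fclose($pipes[0]); // stdin pipe
fclose($pipes[2]); // stderr pipe
proc_close($proc); // waits only for the shell, which exits right after backgrounding php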
Of all the answers, none considered the ridiculously easy fastcgi_finish_request function, which, when called, flushes all remaining output to the browser and closes the FastCGI session and the HTTP connection, while letting the script keep running in the background.
Example:
<?php
header('Content-Type: application/json');
echo json_encode(['ok' => true]);
fastcgi_finish_request(); // The user is now disconnected from the script
// Do stuff with received data
Note: Due to a wontfix quirk, calling flush() after fastcgi_finish_request() will cause it to exit without warning or error.
You may wish to call ignore_user_abort(true) beforehand to suppress this behavior, or simply avoid calling flush() after you've intentionally closed the connection :)
$connected = true;
// Stuff...
fastcgi_finish_request();
$connected = false;
// ...
if ($connected) {
flush();
}
Or
ignore_user_abort(true);
fastcgi_finish_request();
// Accidental flush()es won't do harm (even if you really shouldn't be calling flush() if you know you've disconnected from the user)
flush();
PHP exec("php script.php") can do it.
From the Manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So if you redirect the output to a log file (which is a good idea anyway), your calling script will not hang and your email script will run in the background.
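For instance, a minimal sketch of that pattern (paths here are illustrative):
exec("php /path/to/email_script.php >> /path/to/email.log 2>&1 &");
// exec() returns immediately; output accumulates in the log while the mailer keeps running in the background.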
And why not make an HTTP request to the script and ignore the response?
http://php.net/manual/en/function.httprequest-send.php
If you make a request to the script you need to run, your webserver will execute it in the background, and you can (in your main script) show a message telling the user that the script is running.
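A hedged sketch of the fire-and-forget request with curl (the URL is illustrative, and the target script should call ignore_user_abort(true) so the early disconnect doesn't kill it):
$ch = curl_init('http://example.com/send_notifications.php?post_id=123');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the (ignored) response
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);          // required for sub-second timeouts on some systems
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100);      // stop waiting for the response almost immediately
curl_exec($ch); // the resulting timeout "error" is expected and ignored here
curl_close($ch);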
The simplest way to run a PHP script in the background is
php script.php >/dev/null &
The script will run in the background, and the user will also reach the action page faster.
How about this?
Your PHP script that holds the form saves a flag or some value into a database or file.
A second PHP script polls for this value periodically and if it's been set, it triggers the Email script in a synchronous manner.
This second PHP script should be set to run as a cron.
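A minimal sketch of that split, assuming a hypothetical notifications table with a processed flag (the credentials and the send_all_emails helper are illustrative):
<?php
// poller.php -- run via cron
$db = new PDO('mysql:host=localhost;dbname=alerts', 'user', 'pass');
foreach ($db->query('SELECT id, post_id FROM notifications WHERE processed = 0') as $row) {
    send_all_emails($row['post_id']); // hypothetical helper that does the actual sending
    $db->exec('UPDATE notifications SET processed = 1 WHERE id = ' . (int) $row['id']);
}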
As far as I know, you cannot do this in an easy way (see fork, exec, etc., which don't work under Windows). Maybe you can reverse the approach: have the browser post the form in the background via Ajax, so that while the post is still working there is no wait time.
This can help even if you have some long processing to do.
About sending mail, it's always suggested to use a spooler: maybe a local and quick SMTP server that accepts your requests and then spools them to the real MTA, or put everything in a DB and then use a cron that spools the queue.
The cron may be on another machine, calling the spooler as an external URL:
* * * * * wget -O /dev/null http://www.example.com/spooler.php
Background cron job sounds like a good idea for this.
You'll need ssh access to the machine to run the script as a cron.
$ php scriptname.php to run it.
If you can access the server over ssh and can run your own scripts, you can make a simple FIFO server using PHP (although you may have to recompile PHP with pcntl/posix support for fork).
The server can be written in anything really, you probably can easily do it in python.
Or the simplest solution would be sending an HttpRequest and not reading the return data, but the server might destroy the script before it finishes processing.
Example server :
<?php
define('FIFO_PATH', '/home/user/input.queue');
define('FORK_COUNT', 10);
if(file_exists(FIFO_PATH)) {
die(FIFO_PATH . ' exists, please delete it and try again.' . "\n");
}
if(!file_exists(FIFO_PATH) && !posix_mkfifo(FIFO_PATH, 0666)){
die('Couldn\'t create the listening fifo.' . "\n");
}
$pids = array();
$fp = fopen(FIFO_PATH, 'r+');
for($i = 0; $i < FORK_COUNT; ++$i) {
$pids[$i] = pcntl_fork();
if(!$pids[$i]) {
echo "process(" . posix_getpid() . ", id=$i)\n";
while(true) {
$line = chop(fgets($fp));
if($line == 'quit' || $line === false) break;
echo "processing (" . posix_getpid() . ", id=$i) :: $line\n";
// $data = json_decode($line);
// processData($data);
}
exit();
}
}
fclose($fp);
foreach($pids as $pid){
pcntl_waitpid($pid, $status);
}
unlink(FIFO_PATH);
?>
Example client :
<?php
define('FIFO_PATH', '/home/user/input.queue');
if(!file_exists(FIFO_PATH)) {
die(FIFO_PATH . ' doesn\'t exist, please make sure the fifo server is running.' . "\n");
}
function postToQueue($data) {
$fp = fopen(FIFO_PATH, 'w+');
stream_set_blocking($fp, false); //don't block
$data = json_encode($data) . "\n";
if(fwrite($fp, $data) != strlen($data)) {
echo "Couldn't the server might be dead or there's a bug somewhere\n";
}
fclose($fp);
}
$i = 1000;
while(--$i) {
postToQueue(array('xx'=>21, 'yy' => array(1,2,3)));
}
?>
If you're on Windows, research proc_open or popen...
But if we're on the same server "Linux" running cpanel then this is the right approach:
#!/usr/bin/php
<?php
$pid = trim(shell_exec("nohup nice php -f 'path/to/your/script.php' > /dev/null 2>&1 & echo $!"));
// Poll until the background process exits; "ps -p $pid -o pid=" prints nothing once it's gone
while (exec("ps -p $pid -o pid=")) {
    // you can also have a streamer here, like fprintf or fgets
    sleep(1);
}
?>
Don't use fork() or curl if you doubt you can handle them; it's just like abusing your server.
Lastly, in the script.php file which is called above, take note of this and make sure you write:
<?php
ignore_user_abort(TRUE); // keep running even if the client disconnects
set_time_limit(0);       // no execution time limit
ob_start();              // really optional, but this is pure PHP
// code to be run in the background goes here
echo str_repeat(" ", 1500); // padding some browsers need before rendering early output (for progress bars or loading images)
ob_flush(); flush();        // these two push the output to the client, if you need any
sleep(2); // standard limit
?>
For a background worker, I think you should try this technique. It lets you call as many pages as you like: all pages will run at once, independently, without waiting for each page's response, i.e. asynchronously.
form_action_page.php
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
// post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
// post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
// Call as many pages as you like; all will run at once, independently, without waiting for each page's response.
// Your form DB insertion or other code goes here; do whatever you want. The calls above work as background jobs, so this line is reached before the pages above have responded.
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*/
function post_async($url,$params)
{
$post_string = $params;
$parts = parse_url($url);
$fp = fsockopen($parts['host'],
    isset($parts['port']) ? $parts['port'] : 80,
    $errno, $errstr, 30);
if (!$fp) {
    return; // connection failed; log $errno/$errstr in the real world
}
$out = "GET ".$parts['path']."?$post_string"." HTTP/1.1\r\n"; // you can use POST instead of GET if you like
$out .= "Host: ".$parts['host']."\r\n";
$out .= "Content-Type: application/x-www-form-urlencoded\r\n";
$out .= "Content-Length: ".strlen($post_string)."\r\n";
$out .= "Connection: Close\r\n\r\n";
fwrite($fp, $out);
fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"]; // case 1 output: testValue
// here do your background operations it will not halt main page
?>
P.S.: If you want to send URL parameters in a loop, then follow this answer: https://stackoverflow.com/a/41225209/6295712
Assuming you are running on a *nix platform, use cron and the php executable.
EDIT:
There are quite a number of questions asking for "running php without cron" on SO already. Here's one:
Schedule scripts without using CRON
That said, the exec() answer above sounds very promising :)
In my case I have 3 params, one of them being a string (mensaje):
exec("C:\wamp\bin\php\php5.5.12\php.exe C:/test/N/trunk/api/v1/Process.php $idTest2 $idTest3 \"$mensaje\" >> c:/log.log &");
In my Process.php I have this code:
if (!isset($argv[1]) || !isset($argv[2]) || !isset($argv[3]))
{
die("Error.");
}
$idCurso = $argv[1];
$idDestino = $argv[2];
$mensaje = $argv[3];
Use Amphp to execute jobs in parallel & asynchronously.
Install the library
composer require amphp/parallel-functions
Code sample
<?php
require "vendor/autoload.php";
use Amp\Promise;
use Amp\ParallelFunctions;
echo 'started<br>';
$promises[1] = ParallelFunctions\parallel(function (){
// Send Email
})();
$promises[2] = ParallelFunctions\parallel(function (){
// Send SMS
})();
Promise\wait(Promise\all($promises));
echo 'finished';
For your use case, you can do something like below
<?php
use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;
$responses = wait(parallelMap([
'a@example.com',
'b@example.com',
'c@example.com',
], function ($to) {
return send_mail($to);
}));
This works for me. Try this:
exec("php asyn.php > /dev/null 2>/dev/null &");
I have PHP code that does some tasks. Let's say someone executes the code by visiting https://localhost/code.php.
I have an employee who executes the script over curl from a separate server. What is the best way to prevent him from launching the script a second time before the already-running script has actually completed?
TL;DR: I need a way to make the second invocation wait (sleep for a few seconds, or until the first task completes) before it runs.
Any ideas? Thanks.
While a session won't work with cURL, the idea is valid -- you need to set something persistent outside of your script. So, how about writing to a local file, or writing to a database?
$fh = @fopen('lock.txt', 'x'); // 'x' mode fails atomically if the file already exists, closing the race between checking and creating
if ($fh === false) die;
fwrite($fh, 'This file prevents script execution');
fclose($fh);
(... your script code here...)
unlink('lock.txt');
If you know that there is only one user who will hit your server you can simply use session data.
<?php
session_start();
if (($_SESSION["NOT_FINISHED"] ?? false) === true) {
die("Previous job is not finished yet!");
} else {
$_SESSION["NOT_FINISHED"] = true;
// start whatever job need to be done here
...
// when the job is done and finished, let's release our busy flag
unset( $_SESSION["NOT_FINISHED"]);
}
I'm going out on a limb here; I'm trying to direct a long-running script to Artisan. Is it possible for App::call() to return a string value, or maybe even send an email once the long-running script finishes?
I'm trying to look for more info on this, but is it right to assume that if Artisan is running I can redirect the user to something like a waiting page, maybe a looping gif?
Use Queue::push() with an appropriate driver (database, perhaps) to push the long-running job to a queue.
The last thing the long-running job should do is send some indication that it's finished.
Here's some sample code:
Queue::push(function($job) use ($id)
{
Artisan::call('my-command', ['arg1', 'arg2']);
$job->delete();
});
// Then at the end of your my-command script:
$jobModel = LongRunningJob::find($id);
$jobModel->finishedDate = Carbon::now();
$jobModel->save();
Of course you can then poll the database to determine whether the long-running command has finished.
I have a list of data that needs to be processed. The way it works right now is this:
A user clicks a process button.
The PHP code takes the first item that needs to be processed, takes 15-25 secs to process it, moves on to the next item, and so on.
This takes way too long. What I'd like instead is that:
The user clicks the process button.
A PHP script takes the first item and starts to process it.
Simultaneously another instance of the script takes the next item and processes it.
And so on, so around 5-6 of the items are being processed simultaneously and we get 6 items processed in 15-25 secs instead of just one.
Is something like this possible?
I was thinking of using CRON to launch an instance of the script every second. All items that need to be processed are flagged as such in the MySQL database, so whenever an instance is launched through CRON, it simply takes the next item flagged for processing and removes the flag.
Thoughts?
Edit: To clarify something, each 'item' is stored in a MySQL database table as a separate row. Whenever processing starts on an item, it is flagged as being processed in the DB, so each new instance will simply grab the next row that is not being processed and process it. Hence I don't have to supply the items as command line arguments.
Here's one solution, not the greatest, but will work fine on Linux:
Split the processing PHP into separate CLI scripts in which:
The command line inputs include `$id` and `$item`
The script writes its PID to a file in `/tmp/$id.$item.pid`
The script echoes results as XML, or something else that can be read into PHP, to stdout
When finished the script deletes the `/tmp/$id.$item.pid` file
Your master script (presumably on your webserver) would do:
`exec("nohup php myprocessing.php $id $item > /tmp/$id.$item.xml");` for each item
Poll the `/tmp/$id.$item.pid` files until all are deleted (sleep/check poll is enough)
If they are never deleted, kill all the processing scripts and report failure
If successful, read from `/tmp/$id.$item.xml` to format/output to the user
Delete the XML files if you don't want to cache for later use
A backgrounded nohup started application will run independent of the script that started it.
This interested me sufficiently that I decided to write a POC.
test.php
<?php
$dir = realpath(dirname(__FILE__));
$start = time();
// Time in seconds after which we give up and kill everything
$timeout = 25;
// The unique identifier for the request
$id = uniqid();
// Our "items" which would be supplied by the user
$items = array("foo", "bar", "0xdeadbeef");
// We exec a nohup command that is backgrounded which returns immediately
foreach ($items as $item) {
exec("nohup php proc.php $id $item > $dir/proc.$id.$item.out &");
}
echo "<pre>";
// Run until timeout or all processing has finished
while(time() - $start < $timeout)
{
echo (time() - $start), " seconds\n";
clearstatcache(); // Required since PHP will cache for file_exists
$running = array();
foreach($items as $item)
{
// If the pid file still exists the process is still running
if (file_exists("$dir/proc.$id.$item.pid")) {
$running[] = $item;
}
}
if (empty($running)) break;
echo implode(',', $running), " running\n";
flush();
sleep(1);
}
// Clean up if we timeout out
if (!empty($running)) {
clearstatcache();
foreach ($items as $item) {
// Kill process of anything still running (i.e. that has a pid file)
if(file_exists("$dir/proc.$id.$item.pid")
&& $pid = file_get_contents("$dir/proc.$id.$item.pid")) {
posix_kill($pid, 9);
unlink("$dir/proc.$id.$item.pid");
// Would want to log this in the real world
echo "Failed to process: ", $item, " pid ", $pid, "\n";
}
// delete the useless data
unlink("$dir/proc.$id.$item.out");
}
} else {
echo "Successfully processed all items in ", time() - $start, " seconds.\n";
foreach ($items as $item) {
// Grab the processed data and delete the file
echo(file_get_contents("$dir/proc.$id.$item.out"));
unlink("$dir/proc.$id.$item.out");
}
}
echo "</pre>";
?>
proc.php
<?php
$dir = realpath(dirname(__FILE__));
$id = $argv[1];
$item = $argv[2];
// Write out our pid file
file_put_contents("$dir/proc.$id.$item.pid", posix_getpid());
for($i=0;$i<80;++$i)
{
echo $item,':', $i, "\n";
usleep(250000);
}
// Remove our pid file to say we're done processing
unlink("proc.$id.$item.pid");
?>
Put test.php and proc.php in the same folder on your server, load test.php, and enjoy.
You will of course need nohup (unix) and PHP cli to get this to work.
Lots of fun, I may find a use for it later.
Use an external work queue like Beanstalkd, which your PHP script writes a bunch of jobs to. You then have as many worker processes as you like pulling jobs from beanstalkd and processing them as fast as possible; you can spin up as many workers as you have memory/CPU for. Your job body should contain as little information as possible, maybe just some IDs which you hit the DB with. beanstalkd has a slew of client APIs and itself has a very basic API; think memcached.
We use beanstalkd to process all of our background jobs, and I love it. It's easy to use and very fast.
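A hedged sketch of what that might look like with the Pheanstalk client library (v3-style API; the host, tube name, and payload are illustrative):
<?php
require 'vendor/autoload.php';
use Pheanstalk\Pheanstalk;
// Producer: enqueue the smallest useful payload (just an ID to hit the DB with)
$queue = new Pheanstalk('127.0.0.1');
$queue->useTube('process-items')->put(json_encode(['item_id' => 42]));
// Worker (a separate long-running CLI process; run 5-6 copies for parallelism)
$worker = new Pheanstalk('127.0.0.1');
$worker->watch('process-items');
while ($job = $worker->reserve()) {
    $data = json_decode($job->getData(), true);
    // ... load the row for $data['item_id'] and do the 15-25s of processing ...
    $worker->delete($job); // remove the job once it has been processed
}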
There is no multithreading in PHP; however, you can use fork.
php.net:pcntl-fork
Or you could execute a system() command and start another process which is multithreaded.
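A minimal sketch of the fork approach (requires the pcntl extension and a non-Windows CLI; process_item is a hypothetical stand-in for the real 15-25 second job):
<?php
function process_item($item) { sleep(rand(15, 25)); } // hypothetical stand-in for the real work
$items = array('a', 'b', 'c', 'd', 'e', 'f');
$pids  = array();
foreach ($items as $item) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        process_item($item); // child: handle one item...
        exit(0);             // ...then exit so it doesn't keep looping
    }
    $pids[] = $pid; // parent: remember the child and keep forking
}
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status); // reap the children to avoid zombies
}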
Can you implement threading in JavaScript on the client side? It seems to me I've seen a JavaScript library (from Google, perhaps?) that implements it. Google it and I'm sure you'll find something. I've never done it, but I know it's possible. Anyway, your client-side JavaScript could activate (via Ajax) a PHP script once for each item in separate threads. That might be easier than trying to do it all on the server side.
-don
If you are running a high traffic PHP server you are INSANE if you do not use Alternative PHP Cache: http://php.net/manual/en/book.apc.php . You do not have to make code modifications to run APC.
Another useful technique that can work along with APC is using the Smarty template system which allows you to cache output so that pages do not have to be rebuilt.
To solve this problem, I've used two different products; Gearman and RabbitMQ.
The benefit of putting your jobs into some sort of queuing software like Gearman or Rabbit is that you have multiple machines, they can all participate in processing items off the queue(s).
Gearman is easier to set up, so I'd suggest poking around with it a bit first. If you find you need something more heavy-duty with queue robustness, look into RabbitMQ.
http://www.danga.com/gearman/
http://pear.php.net/package/Net_Gearman (PEAR library)
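A hedged sketch using the PECL gearman extension (the host, port, function name, and payload are illustrative):
<?php
// Producer: fire the job into the background and return to the user immediately
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('process_item', json_encode(array('id' => 42)));
// Worker (a separate CLI script; run several copies for parallelism)
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('process_item', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ... do the 15-25 seconds of processing for $data['id'] here ...
});
while ($worker->work());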
You can use pcntl_fork() and family to fork a process - however you may need something like IPC to communicate back to the parent process that the child process (the one you fork'd) is finished.
You could have them write to shared memory, like via memcache or a DB.
You could also have the child process write the completed data to a file, that the parent process keeps checking - as each child process completes the file is created/written to/updated, and parent process can grab it, one at a time, and them throw them back to the callee/client.
The parent's job is to control the queue, to make sure the same data isn't processed twice and also to sanity check the children (better kill that runaway process and start over...etc)
Something else to keep in mind: on Windows platforms you are going to be severely limited; I don't even think you have access to pcntl_ unless you compiled PHP with support for it.
Also, can you cache the data once it's been processed, or is it unique data every time? That would surely speed things up...?