Problem
I have a form that, when submitted, will run basic code to process the information submitted and insert it into a database for display on a notification website. In addition, I have a list of people who have signed up to receive these notifications via email and SMS message. This list is trivial at the moment (only about 150 people), but it's enough that it takes upwards of a minute to cycle through the entire table of subscribers and send out 150+ emails. (The emails are being sent individually, as requested by the system administrators of our email server because of mass email policies.)
During this time, the individual who posted the alert will sit on the last page of the form for almost a minute without any positive reinforcement that their notification is being posted. This leads to other potential problems, all of which have possible solutions that I feel are less than ideal.
First, the poster might think the server is lagging and click the 'Submit' button again, causing the script to start over or run twice. I could solve this by using JavaScript to disable the button and replace the text with something like 'Processing...'; however, this is less than ideal because the user will still be stuck on the page for the length of the script execution. (Also, if JavaScript is disabled, this problem still exists.)
Second, the poster might close the tab or the browser prematurely after submitting the form. The script will keep running on the server until it tries to write back to the browser; however, if the user then browses to any page within our domain (while the script is still running), the browser hangs loading the page until the script has ended. (This only happens when a tab or window of the browser is closed and not the entire browser application.) Still, this is less than ideal.
(Possible) Solution
I've decided I want to break out the "email" part of the script into a separate file I can call after the notification has been posted. I originally thought of putting this on the confirmation page after the notification has been successfully posted. However, the user will not know this script is running and any anomalies will not be apparent to them; this script cannot fail.
But, what if I can run this script as a background process? So, my question is this: How can I execute a PHP script to trigger as a background service and run completely independent of what the user has done at the form level?
EDIT: This cannot be cron'ed. It must run the instant the form is submitted. These are high-priority notifications. In addition, the system administrators running our servers disallow crons from running any more frequently than 5 minutes.
Doing some experimentation with exec and shell_exec, I have uncovered a solution that works perfectly! I chose to use shell_exec so I can log every notification process that happens (or doesn't). (shell_exec returns a string, and this was easier than using exec, assigning the output to a variable, and then opening a file to write to.)
I'm using the following line to invoke the email script:
shell_exec("/path/to/php /path/to/send_notifications.php '".$post_id."' 'alert' >> /path/to/alert_log/paging.log &");
It is important to notice the & at the end of the command (as pointed out by @netcoder). In UNIX, this runs the process in the background.
The extra variables surrounded in single quotes after the path to the script are set as $_SERVER['argv'] variables that I can call within my script.
The email script then writes to my log file via the >> redirect, producing something like this:
[2011-01-07 11:01:26] Alert Notifications Sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 38.71 seconds)
[2011-01-07 11:01:34] CRITICAL ERROR: Alert Notifications NOT sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 23.12 seconds)
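For reference, here is a minimal sketch of how the called script might pick up those arguments and write the log lines above. The send_alerts() helper is hypothetical; only the $argv handling and the output format come from the setup described:

<?php
// send_notifications.php -- invoked via shell_exec() from the form handler
$start   = microtime(true);
$post_id = isset($argv[1]) ? $argv[1] : null; // first quoted argument
$type    = isset($argv[2]) ? $argv[2] : null; // second quoted argument ('alert')

if ($post_id === null || $type === null) {
    die('[' . date('Y-m-d H:i:s') . "] CRITICAL ERROR: missing arguments\n");
}

$ok = send_alerts($post_id, $type); // hypothetical helper that loops over subscribers
$elapsed = number_format(microtime(true) - $start, 2);

// Anything echoed here lands in paging.log via the >> redirect
echo '[' . date('Y-m-d H:i:s') . '] '
    . ($ok ? 'Alert Notifications Sent' : 'CRITICAL ERROR: Alert Notifications NOT sent')
    . " for http://alerts.illinoisstate.edu/$post_id (SCRIPT: $elapsed seconds)\n";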
On Linux/Unix servers, you can execute a job in the background by using proc_open:
$descriptorspec = array(
array('pipe', 'r'), // stdin
array('file', 'myfile.txt', 'a'), // stdout
array('pipe', 'w'), // stderr
);
$proc = proc_open('php email_script.php &', $descriptorspec, $pipes);
The & is the important bit here. The script will continue even after the original script has ended.
Of all the answers, none considered the ridiculously easy fastcgi_finish_request function, which, when called, flushes all remaining output to the browser and closes the FastCGI session and the HTTP connection, while letting the script run in the background.
Example:
<?php
header('Content-Type: application/json');
echo json_encode(['ok' => true]);
fastcgi_finish_request(); // The user is now disconnected from the script
// Do stuff with received data
Note: due to a wontfix quirk, calling flush() after fastcgi_finish_request() will cause it to exit without warning/error.
You may wish to call ignore_user_abort(true) beforehand to suppress this behavior, or simply avoid calling flush() after you've intentionally closed the connection :)
$connected = true;
// Stuff...
fastcgi_finish_request();
$connected = false;
// ...
if ($connected) {
flush();
}
Or
ignore_user_abort(true);
fastcgi_finish_request();
// Accidental flush()es won't do harm (even if you really shouldn't be calling flush() if you know you've disconnected from the user)
flush();
PHP exec("php script.php") can do it.
From the Manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So if you redirect the output to a log file (which is a good idea anyway), your calling script will not hang and your email script will run in the background.
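For example (the paths and log file name here are placeholders, not from the original answer):

// Redirect stdout and stderr to a log and background with &, so exec() returns immediately
exec("php /path/to/email_script.php >> /path/to/email.log 2>&1 &");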
And why not make an HTTP request to the script and ignore the response?
http://php.net/manual/en/function.httprequest-send.php
If you make a request to the script you need to call, your webserver will run it in the background, and you can (in your main script) show a message telling the user that the script is running.
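A rough fire-and-forget sketch of that idea using cURL instead of the HttpRequest extension (the URL is a placeholder; the called script should set ignore_user_abort(true) so it survives the client disconnecting early):

$ch = curl_init('http://example.com/send_notifications.php?post_id=2049');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't print the (ignored) response
curl_setopt($ch, CURLOPT_NOSIGNAL, true);       // required for sub-second timeouts
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);      // give up almost immediately
curl_exec($ch);                                 // fire the request and move on
curl_close($ch);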
The simplest way to run a PHP script in the background is
php script.php >/dev/null &
The script will run in the background and the page will also reach the action page faster.
How about this?
Your PHP script that holds the form saves a flag or some value into a database or file.
A second PHP script polls for this value periodically and if it's been set, it triggers the Email script in a synchronous manner.
This second PHP script should be set to run as a cron.
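A hypothetical sketch of that second, cron-driven script (the table and column names are invented for illustration):

// poller.php -- run from cron; processes any notifications flagged as pending
$pdo = new PDO('mysql:host=localhost;dbname=alerts', 'user', 'pass');

$rows = $pdo->query("SELECT id FROM notifications WHERE emailed = 0")->fetchAll(PDO::FETCH_ASSOC);
foreach ($rows as $row) {
    send_alert_emails($row['id']); // hypothetical helper: loops over subscribers, sends mail
    $stmt = $pdo->prepare("UPDATE notifications SET emailed = 1 WHERE id = ?");
    $stmt->execute(array($row['id']));
}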
As far as I know, you cannot do this in an easy way (fork, exec, etc. don't work under Windows), but maybe you can reverse the approach: have the browser post the form in the background via AJAX, so the user has no wait time while the post is still working.
This can help even if you have some long processing to do.
About sending mail, it's always suggested to use a spooler: maybe a local and quick SMTP server that accepts your requests and spools them to the real MTA, or put everything in a DB and then use a cron job that spools the queue.
The cron may be on another machine calling the spooler as external url:
* * * * * wget -O /dev/null http://www.example.com/spooler.php
Background cron job sounds like a good idea for this.
You'll need ssh access to the machine to run the script as a cron.
Use $ php scriptname.php to run it.
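For example, a crontab entry along these lines (paths are placeholders):

# Run every 5 minutes, appending output to a log
*/5 * * * * /usr/bin/php /path/to/scriptname.php >> /path/to/cron.log 2>&1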
If you can access the server over SSH and can run your own scripts, you can make a simple FIFO server using PHP (although you will have to compile PHP with PCNTL and POSIX support for forking).
The server can be written in anything, really; you could probably easily do it in Python.
Or the simplest solution would be sending an HttpRequest and not reading the return data, but the server might destroy the script before it finishes processing.
Example server:
<?php
define('FIFO_PATH', '/home/user/input.queue');
define('FORK_COUNT', 10);

if (file_exists(FIFO_PATH)) {
    die(FIFO_PATH . " exists, please delete it and try again.\n");
}
if (!posix_mkfifo(FIFO_PATH, 0666)) {
    die("Couldn't create the listening fifo.\n");
}

$pids = array();
$fp = fopen(FIFO_PATH, 'r+');
for ($i = 0; $i < FORK_COUNT; ++$i) {
    $pids[$i] = pcntl_fork();
    if (!$pids[$i]) {
        // Child process: read lines from the fifo until told to quit
        echo "process(" . posix_getpid() . ", id=$i)\n";
        while (true) {
            $line = chop(fgets($fp));
            if ($line == 'quit' || $line === false) break;
            echo "processing (" . posix_getpid() . ", id=$i) :: $line\n";
            // $data = json_decode($line);
            // processData($data);
        }
        exit();
    }
}
fclose($fp);

// Parent process: wait for all children to finish
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
unlink(FIFO_PATH);
?>
Example client:
<?php
define('FIFO_PATH', '/home/user/input.queue');

if (!file_exists(FIFO_PATH)) {
    die(FIFO_PATH . " doesn't exist, please make sure the fifo server is running.\n");
}

function postToQueue($data) {
    $fp = fopen(FIFO_PATH, 'w+');
    stream_set_blocking($fp, false); // don't block
    $data = json_encode($data) . "\n";
    if (fwrite($fp, $data) != strlen($data)) {
        echo "Couldn't write; the server might be dead or there's a bug somewhere\n";
    }
    fclose($fp);
}

$i = 1000;
while (--$i) {
    postToQueue(array('xx' => 21, 'yy' => array(1, 2, 3)));
}
?>
If you're on Windows, research proc_open or popen...
But if you're on a Linux server running cPanel, then this is the right approach:
#!/usr/bin/php
<?php
$pid = shell_exec("nohup nice php -f 'path/to/your/script.php' > /dev/null 2>&1 & echo $!");

while (exec("ps $pid")) {
    // you can also have a streamer here, like fprintf or fgets
}
?>
Don't use fork() or curl if you doubt you can handle them; it's just like abusing your server.
Lastly, in the script.php file which is called above, take note of this and make sure you write:
<?php
ignore_user_abort(TRUE); // keep running even if the user disconnects
set_time_limit(0);       // no execution time limit

ob_start();
// code to be run in the background goes here
ob_flush(); flush();
// these two do the output process, if you need any output

str_repeat(" ", 1500); // pad the output buffer (for progress bars or loading images)
sleep(2); // standard limit
?>
For a background worker, I think you should try this technique. It lets you call as many pages as you like; all pages will run at once, independently, without waiting for each page's response (i.e., asynchronously).
form_action_page.php
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
// post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
// post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
// Call as many pages as you like; all pages will run at once, independently,
// without waiting for each page's response (asynchronously).
// Your form DB insertion or other code goes here; the calls above work as
// background jobs, and this line will be hit before the lines above respond.
/*
* Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
*/
function post_async($url,$params)
{
$post_string = $params;
$parts = parse_url($url);
$fp = fsockopen($parts['host'],
    isset($parts['port']) ? $parts['port'] : 80,
    $errno, $errstr, 30);
if (!$fp) {
    return; // connection failed; nothing to fire
}
$out = "GET ".$parts['path']."?$post_string"." HTTP/1.1\r\n"; //you can use POST instead of GET if you like
$out .= "Host: ".$parts['host']."\r\n";
$out .= "Content-Type: application/x-www-form-urlencoded\r\n";
$out .= "Content-Length: ".strlen($post_string)."\r\n";
$out .= "Connection: Close\r\n\r\n";
fwrite($fp, $out);
fclose($fp);
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"];//case1 Output > testValue
// here do your background operations it will not halt main page
?>
P.S.: if you want to send URL parameters in a loop, then follow this answer: https://stackoverflow.com/a/41225209/6295712
Assuming you are running on a *nix platform, use cron and the php executable.
EDIT:
There are quite a number of questions asking for "running php without cron" on SO already. Here's one:
Schedule scripts without using CRON
That said, the exec() answer above sounds very promising :)
In my case I have 3 params, one of which is a string (mensaje):
exec("C:\wamp\bin\php\php5.5.12\php.exe C:/test/N/trunk/api/v1/Process.php $idTest2 $idTest3 \"$mensaje\" >> c:/log.log &");
In my Process.php I have this code:
if (!isset($argv[1]) || !isset($argv[2]) || !isset($argv[3]))
{
die("Error.");
}
$idCurso = $argv[1];
$idDestino = $argv[2];
$mensaje = $argv[3];
Use Amphp to execute jobs in parallel & asynchronously.
Install the library
composer require amphp/parallel-functions
Code sample
<?php
require "vendor/autoload.php";
use Amp\Promise;
use Amp\ParallelFunctions;
echo 'started<br>';
$promises[1] = ParallelFunctions\parallel(function (){
// Send Email
})();
$promises[2] = ParallelFunctions\parallel(function (){
// Send SMS
})();
Promise\wait(Promise\all($promises));
echo 'finished';
For your use case, you can do something like the below:
<?php
use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;
$responses = wait(parallelMap([
'a@example.com',
'b@example.com',
'c@example.com',
], function ($to) {
return send_mail($to);
}));
This works for me. Try this:
exec("php asyn.php > /dev/null 2>/dev/null &");
I have a LOT (almost 300) of old SVN repositories to migrate to Git using svn2git.
After considering Golang and Python, I finally decided that the easiest way is to use PHP. Might be a questionable decision, but it seemed easy.
So, after 15 minutes, I had a script that is more or less running OK in tests. An ugly script, but it is a one-timer.
The problem is that the process takes a lot of time: even for simple, almost empty repos it can take 30 seconds or even a minute, and on big ones even 10 minutes. So before taking it into production, I would like to have some feedback mechanism so I can actually see what is going on.
As of now, the script outputs the command feedback like so:
$cmd = "cd ".$GITrepoPath." && svn2git svn://127.0.0.1/". $repoName . " --username " .$SVNusername ." --authors authors.txt --notags --nobranches --notrunk";
$output = shell_exec($cmd);
echo "<pre>$output</pre>";
...but this is only after each repo has finished processing, not like the real command execution where I can see the steps.
The only question I found that might be close to what I need was here, but honestly I did not understand much from the answer.
I know it is just a one-timer script, but the use case had me interested in how to actually achieve that (and if it is possible).
I am on a Win7 local machine, but would also like to know for *nix if possible.
shell_exec waits until the process closes. You have to create the process and read from it while it runs, the same as a console would. exec() also blocks until the command finishes, so to stream the output line by line you can use popen() instead:
$cmd = ''; // your command here

// popen() gives us a pipe we can read from while the command is still running
$handle = popen($cmd . ' 2>&1', 'r');
if ($handle === false) {
    die("Could not start the process.\n");
}
while (!feof($handle)) {
    $line = fgets($handle);
    if ($line !== false) {
        echo $line; // each new output line, as it arrives
        flush();
    }
}
$result = pclose($handle); // exit code of the command
I suggest instead running a script or program in the background that runs the command and then updates a record in a database; you could then use AJAX or whatever to poll the server for record changes. This provides a nice experience for the user.
The column in the database table could be named something like "finished"; once that boolean is true, you know it's complete, and the output can be stored in the database.
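A hypothetical sketch of the polling endpoint that the AJAX call would hit (the table and column names are invented):

// status.php -- returns the job state for the AJAX poller
$pdo = new PDO('mysql:host=localhost;dbname=migration', 'user', 'pass');
$stmt = $pdo->prepare("SELECT finished, output FROM jobs WHERE id = ?");
$stmt->execute(array($_GET['id']));

header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));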
I need a function that executes by itself in PHP without the help of cron. I have come up with the following code that works well for me, but as it is a never-ending loop, will it cause any problem to my server or script? If so, could you give me some suggestions or alternatives, please? Thanks.
$interval = 60; //minutes
set_time_limit(0);

while (1) {
    $now = time();
    #do the routine job, trigger a php function and what not.
    sleep($interval * 60 - (time() - $now));
}
We have used an infinite loop in a live system environment to basically wait for incoming SMS and then process it. We found out that doing it this way makes the server resource-intensive over time, and we had to restart the server in order to free up memory.
Another issue we encountered is that when you execute a script with an infinite loop in your browser, even if you hit the stop button it will continue to run unless you restart Apache.
while (1){ //infinite loop
// write code to insert text to a file
// The file size will still continue to grow
//even when you click 'stop' in your browser.
}
The solution is to run the PHP script as a daemon on the command line. Here's how:
nohup php myscript.php &
the & puts your process in the background.
Not only we found this method to be less memory intensive but you can also kill it without restarting apache by running the following command :
kill processid
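To make that kill step easier, you can record the PID at launch time (a small sketch; the file names are placeholders):

nohup php myscript.php > myscript.log 2>&1 &
echo $! > myscript.pid         # $! is the PID of the backgrounded process
# later, to stop it:
kill $(cat myscript.pid)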
Edit: As Dagon pointed out, this is not really the true way of running PHP as a 'Daemon' but using the nohup command can be considered as the poor man's way of running a process as a daemon.
You can use the time_sleep_until() function. It returns TRUE or FALSE.
$interval = 60; //minutes
set_time_limit(0);
$next = time() + $interval * 60; // first wake-up time

while (1) {
    #do the routine job, trigger a php function and what not.
    // the loop pauses until the specific time it was set to sleep to,
    // then loops again once it finishes sleeping.
    time_sleep_until($next);
    $next += $interval * 60;
}
There are many ways to create a daemon in PHP, and there have been for a very long time.
Just running something in the background isn't good. If it tries to print something and the console is closed, for example, the program dies.
One method I have used on Linux is pcntl_fork() in a php-cli script, which basically splits your script into two PIDs. Have the parent process kill itself, and have the child process fork itself again. Again have the parent process kill itself. The child process will now be completely divorced from the terminal and can happily hang out in the background doing whatever you want it to do.
$i = 0;
do{
$pid = pcntl_fork();
if( $pid == -1 ){
die( "Could not fork, exiting.\n" );
}else if ( $pid != 0 ){
// We are the parent
die( "Level $i forking worked, exiting.\n" );
}else{
// We are the child.
++$i;
}
}while( $i < 2 );
// This is the daemon child, do your thing here.
Unfortunately, this model has no way to restart itself if it crashes or if the server is rebooted. (This can be resolved through creativity, but...)
To get the robustness of respawning, try an Upstart script (if you are on Ubuntu). Here is a tutorial, but I have not yet tried this method.
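For reference, a minimal Upstart job might look something like this (the job name and paths are invented):

# /etc/init/my-worker.conf
description "PHP background worker"

start on runlevel [2345]
stop on runlevel [016]

respawn                 # restart the daemon if it crashes
exec /usr/bin/php /path/to/worker.php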
while(1) means an infinite loop. If you want to break out of it, you should use break with a condition.
E.g.:
while (1) { //infinite loop
    $now = time();
    #do the routine job, trigger a php function and what not.
    sleep($interval * 60 - (time() - $now));
    if (condition) break; //it will break when the condition is true
}
I was just wondering if it's possible to detect the current execution timing for a running script. I am creating an application to ping some computers on the network, and as this is being done from a Linux machine, the pinging behavior differs from Windows.
On a Linux machine, if the computer is off, then the server will hang on the primary message after issuing the ping command and not produce any more output. It will just hang (in my experience with Linux pinging).
So I have this script:
$Computer_Array = array(
    "Managers" => "192.168.0.5",
    "Domain Controller" => "192.168.0.1",
    "Proxy Controller" => "192.168.0.214"
);
foreach ($Computer_Array AS $Addresses){
    exec('ping '.$Addresses, $Output);
}
Later on this will be used to display statistics. Now, the problem is that since the managers' computer may be either on or off, issuing the ping command can just hang. So I'm wondering if there is a method to capture the microtime() of the currently executing function and, if it exceeds a threshold, move on to the next element. I would rather keep this to core PHP, but if such a solution can only be done via AJAX or another language, then I would have to ask the developer whether it's alright to integrate an external method.
The ping command allows you to specify how long it will wait before giving up:
ping -c 5 -t 1 127.0.0.2
This will return after one second, regardless of how many pings have been sent. The exact command-line arguments vary between platforms (on Linux the deadline flag is -w rather than -t, which there sets the TTL).
Alternatively, if you can use pcntl, look into pcntl_alarm(); it will deliver a SIGALRM signal to your application after a certain amount of time that can be caught.
Lastly, and I haven't tested this myself, you could try using proc_open() and use stream_select() on one of the pipes; if nothing has happened on the pipe after a certain time you can then kill off the process.
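A rough, untested sketch of that last idea (the ping target and the 2-second threshold are arbitrary examples):

$spec = array(1 => array('pipe', 'w')); // capture the command's stdout
$proc = proc_open('ping -c 5 192.168.0.5', $spec, $pipes);

$read = array($pipes[1]);
$write = $except = null;

// Wait up to 2 seconds for the first output to appear on the pipe
if (stream_select($read, $write, $except, 2) === 0) {
    proc_terminate($proc); // nothing happened in time: kill off the process
    echo "Host timed out\n";
} else {
    echo fread($pipes[1], 4096); // first chunk of ping output
}
fclose($pipes[1]);
proc_close($proc);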
If you want to do this with PHP, or run into a similar issue, here's an example using code from php execute a background process
The PHP script would need write permissions to the output files. This concept would essentially work for anything, from a ping to another PHP script.
function isRunning($pid){
    try {
        $result = shell_exec(sprintf("ps %d", $pid));
        if (count(preg_split("/\n/", $result)) > 2){
            return true;
        }
    } catch(Exception $e){}
    return false;
}
$cmd = "ping 127.0.0.1";
$outputfile = "output";
$pidfile = "pid";
$start = microtime(true);
// Don't let it run longer than $threshold seconds
$threshold = 2;
// Ping and get pid
exec(sprintf("%s > %s 2>&1 & echo $! > %s", $cmd, $outputfile, $pidfile));
$pid = `tail -n 1 $pidfile`;
// Let the process run until you want to stop it
while (isRunning($pid)){
    // Check output here...
    if ((microtime(true)-$start) > $threshold){
        $o = `kill $pid`;
        die("Timed out.");
    }
}
$end = microtime(true);
$time = $end - $start;
echo "Finished in $time seconds\n";