Running PHP script in background - php

I am testing a PHP script that was developed on an OS X system and is now running on Debian, where it behaves differently.
To reproduce it I wrote two scripts: parent.php and child.php:
parent.php:
#!/usr/bin/php
<?php
echo "parent started...\n";
shell_exec(__DIR__ . '/child.php &2>/dev/null &');
echo "parent finished.\n";
child.php:
#!/usr/bin/php
<?php
echo "child started...\n";
sleep(5);
echo "child finished.\n";
Running parent.php on OS X, I immediately get back the two output lines (parent started, parent finished). On Debian I get the "parent started..." line, then a delay of 5 seconds, and then "parent finished.". Running "./child.php &2>/dev/null &" in the shell gives me back the prompt immediately, as expected. Any ideas how I can fix this?

This is because of the &2> part. It may not be supported on all systems, nor in every shell (bash, sh, ksh, etc.).
Try this,
exec("/bin/bash -c '/usr/bin/php /path/to/child.php 2> /dev/null' &");
If you want to suppress all the output use this,
exec("/bin/bash -c '/usr/bin/php /path/to/child.php &> /dev/null ' &");
Just tested, exec("/usr/bin/php /path/to/child.php > /dev/null 2>&1 &") should work too.
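Putting that tested redirection back into the question's parent.php gives a minimal sketch (same file layout as in the question):
#!/usr/bin/php
<?php
echo "parent started...\n";
// stdout goes to /dev/null, stderr is sent to stdout (2>&1), and the
// trailing & backgrounds the child, so shell_exec() returns immediately.
shell_exec(__DIR__ . '/child.php > /dev/null 2>&1 &');
echo "parent finished.\n";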

Try exec() or system() instead of shell_exec(); shell_exec() may not behave the same way on different operating systems.

Related

running php exec command in background

I am attempting to launch sar and have it run forever via a php script. But for whatever reason it never actually launches. I have tried the following:
exec('sar -u 1 > /home/foo/foo.txt &');
exec('sar -o /home/foo/foo -u 1 > /dev/null 2>&1 &');
However it never launches sar. If I just use:
exec('sar -u 1')
It works, but it just hangs the PHP script. My understanding is that if a program is started with the exec function, then in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream.
I will assume you're running this on a *nix platform. To get PHP to run something in the background and not wait for the process to finish, I would recommend two things: first, use nohup, and second, redirect the output of the command to /dev/null (trash).
Example:
<?php
exec('nohup sar -u 1 > /dev/null 2>/dev/null &');
nohup means we do not send the "hang up" signal (which kills the process) when the terminal running the command closes.
> /dev/null 2>/dev/null & redirects the "normal" and "error" outputs to the blackhole /dev/null location. This allows PHP to not have to wait for the outputs of the command being called.
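To keep the command's output instead of discarding it, the same pattern works with a log file in place of /dev/null; a sketch reusing the asker's own path:
<?php
// nohup detaches from the terminal, stdout goes to the asker's log
// file, stderr is discarded, and & backgrounds the process so exec()
// returns immediately.
exec('nohup sar -u 1 > /home/foo/foo.txt 2> /dev/null &');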
On another note, if you are using PHP just to call a shell command, you may want to consider other options like Ubuntu's Upstart with no PHP component, if you are using Ubuntu, that is.

Exec to launch php as background process

I am struggling with getting a php file to run in the background with PHP's exec(). As a first test, I tried :
exec("ls -l > logfile.txt 2> errfile.txt &");
That works fine. logfile.txt gets filled with a directory listing.
Per instructions in the php documentation, since the exec kicks off a process that runs in the background, standard out (and standard error) are redirected to a file.
Now, I try
exec("/usr/bin/php -f /path/to/my.php > logfile.txt 2> errorfile.txt &");
It appears nothing happens.
Here are test files that I'm trying:
alpha.php
<?php
$version="a";
// Go do something we do not need to wait for.
exec("/usr/bin/php -f /path/to/beta.php > logfile.txt 2> errorfile.txt &");
?>
<html>
<head><title>Test</title></head>
<body>
<p>This is Alpha version <?php echo $version; ?></p>
</body>
</html>
beta.php
<?php
if (!($fp = fopen('/home/johnst12/public_html/workshops/admin/betadata.txt', 'w'))) { exit;}
fprintf($fp, "Proof that Beta executed.");
fclose($fp);
?>
If I run beta.php directly, it works fine. Betadata.txt gets the message.
If I run alpha.php to launch beta.php, betadata.txt is not created. logfile.txt and errorfile.txt remain empty (expected).
I am sure that the path to php, and the path to my php file are correct.
Googling for clarification has not been fruitful. A couple of common themes seem to be (a) running out of resources? (b) lack of permission on the target php file? Out of resources seems unlikely. The permission on the script is global read 644 (rw-r--r--). I tried adding execute (755) just in case it would help. It made no difference.
PHP version 5.3.21
Linux/Apache system.
safe_mode Off
What am I missing? Thanks.
First of all: have you verified that /usr/bin/php is the correct path to PHP?
PHP doesn't like running like that; it's something to do with stdin. Try it with nohup:
exec("nohup /usr/bin/php -f /path/to/beta.php > logfile.txt 2> errorfile.txt &");
With -f, anything else that looks like a flag will go to PHP, so if you wanted to pass a "-x" option to your script then you'd have to run
/usr/bin/php -f /path/to/beta.php -- -x
Without -f, options before the filename go to PHP and options after it go to the script.
/usr/bin/php /path/to/beta.php -x
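As a hypothetical illustration of that rule, a beta.php that looks for the -x flag in its own $argv (the flag and variable name are invented):
<?php
// Invoked as: /usr/bin/php -f /path/to/beta.php -- -x
// The "--" stops PHP from consuming -x, so it reaches the script.
var_dump($argv);                        // e.g. ["/path/to/beta.php", "-x"]
$verbose = in_array('-x', $argv, true);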
I assume you've already looked at the two files in case they have output or errors?
A few other things to check:
Delete the two files. Are they recreated each time this code runs?
exec("nohup /usr/bin/php -v > logfile.txt &");
should output version information to that log file.
exec("/usr/bin/php -f /path/to/beta.php > logfile.txt 2> errorfile.txt");
should run the script properly (but not in the background).

Php Exec timeout

I have the following exec() command with an & sign at the end so the script runs in the background. However, the script is not running in the background. It's timing out in the browser after exactly 5.6 minutes. Also, if I close the browser, the script doesn't keep running.
exec("/usr/local/bin/php -q /home/user/somefile.php &")
If I run the script via the command line, it does not time out. My question is: how do I prevent the timeout? How do I run the script in the background using exec so it's not browser dependent? What am I doing wrong, and what should I look at?
The exec() function handles the output of the executed program, so I suggest you redirect the output to /dev/null (a virtual writable file that automatically discards everything written to it).
Try to run:
exec("/usr/local/bin/php -q /home/gooffers/somefile.php > /dev/null 2>&1 &");
Note: 2>&1 redirects the error output to standard output, and > /dev/null redirects standard output to that virtual file.
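The order of those two redirections matters: the shell applies them left to right, so 2>&1 must come after > /dev/null for stderr to reach the null device too. A sketch contrasting the two (the command name is a stand-in):
<?php
// Correct: stdout -> /dev/null first, then stderr -> (the new) stdout.
exec('some_command > /dev/null 2>&1 &');
// Wrong order: stderr is duplicated onto the original stdout (the pipe
// back to PHP) before stdout is redirected, so errors still reach PHP
// and exec() keeps waiting.
// exec('some_command 2>&1 > /dev/null &');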
If you still have difficulties, you can create a script that does nothing but execute other scripts. exec() stays attached to the process while it is doing its task, but is released when the task is finished. If the executed script just executes another one, the task is very quick and exec() is released just as quickly.
Let's see an implementation. Create a exec.php that contains :
<?php
if (count($argv) == 1)
{
    die('You must give a script to exec...');
}

// Drop the wrapper's own name; what remains is the target script
// plus its arguments.
array_shift($argv);

$cmd = '/usr/local/bin/php -q';
foreach ($argv as $arg)
{
    $cmd .= " " . escapeshellarg($arg);
}

// Background the real job; this wrapper exits almost instantly,
// so the exec() that called it is released right away.
exec("{$cmd} > /dev/null 2>&1 &");
?>
Now, run the following command :
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php > /dev/null 2>&1 &");
If you have arguments, you can give them too :
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php x y z > /dev/null 2>&1 &");
You'll need to use shell_exec() instead:
shell_exec("/usr/local/bin/php -q /home/gooffers/somefile.php &");
That being said, if you have shell access, why don't you install this as a cronjob? I'm not sure why a PHP script is invoking another to run like this.

How to execute a php script from another?

I want to execute 3 PHP scripts from my PHP file without waiting for them to finish. In other words, the 3 PHP files need to be executed all at once (in parallel) instead of one by one (sequentially).
The 3 scripts are in the same folder as my main PHP file (script).
If you do not want to wait for them to finish, run them with either
exec('php script.php &> /dev/null &');
shell_exec('php script.php &> /dev/null &');
system('php script.php &> /dev/null &');
`php script.php &> /dev/null &`
Any of those should accomplish the job, depending on your PHP configuration. Although they are different functions, their behaviour should be similar, since all output is being redirected to /dev/null and the process is immediately detached.
I use the first solution in a production environment where a client launches a bash SMS-sending script which can take up to 10 minutes to finish; it has never failed.
More info in: http://php.net/exec · http://php.net/shell_exec · http://php.net/system
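Applying any of those to the question's three scripts, a sketch (the file names are hypothetical, and the portable > /dev/null 2>&1 form is used instead of bash's &>):
<?php
// Launch all three scripts at once; each is backgrounded with its
// output discarded, so the loop itself finishes almost immediately.
foreach (['one.php', 'two.php', 'three.php'] as $script) {
    exec('php ' . escapeshellarg(__DIR__ . '/' . $script) . ' > /dev/null 2>&1 &');
}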
How about using exec("php yourscript.php")?
Do consider using a queuing system to store your PHP script names, and a worker to fetch jobs from the queue and do the execution, e.g. beanstalkd.
You need to run them as detached jobs, and it is not really easy - or portable. The usual solution is to use nohup or exec the scripts with stdout and stderr redirected to /dev/null (or NUL in Windows), but this often has issues.
If possible, make the three scripts available as scripts on the web server, and access them through asynchronous cURL functions. This also has the advantage that you can test the scripts through the browser and get their output.
Other ways include using popen(), or if under Linux, the at or batch utility.
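A sketch of the asynchronous cURL idea, using curl_multi to run the three requests in parallel (the URLs are hypothetical):
<?php
// Hypothetical URLs under which the three scripts are exposed.
$urls = [
    'http://localhost/script1.php',
    'http://localhost/script2.php',
    'http://localhost/script3.php',
];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture output
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all requests concurrently until every one has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // sleep until there is network activity
} while ($running > 0);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch); // each script's output, for testing
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);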
taken from http://board.phpbuilder.com/showthread.php?10351142-How-can-I-exec%28%29-in-a-non-blocking-fashion:
In order to execute a command and have it not hang your PHP script while it runs, the program you run must not output back to PHP. To do this, redirect both stdout and stderr to /dev/null, then background it.
> /dev/null 2>&1 &
In order to execute a command and have it spawned off as another process that is not dependent on the Apache thread to keep running (it will not die if somebody cancels the page), run this:
exec('bash -c "exec nohup setsid your_command > /dev/null 2>&1 &"');
For windows http://www.php.net/manual/en/function.exec.php:
function execInBackground($path, $exe, $args = "") {
    if (file_exists($path . $exe)) {
        chdir($path);
        if (substr(php_uname(), 0, 7) == "Windows") {
            // "start" returns immediately; popen()/pclose() avoid waiting.
            pclose(popen("start \"bla\" \"" . $exe . "\" " . escapeshellarg($args), "r"));
        } else {
            // Discard output and background the process so exec() returns.
            exec("./" . $exe . " " . escapeshellarg($args) . " > /dev/null &");
        }
    }
}
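A hypothetical call, with the path and executable name invented for illustration:
<?php
// Runs /opt/tools/worker in the background, passing one argument.
execInBackground('/opt/tools/', 'worker', 'job42');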
To the shell_exec call, add the /B parameter; this allows you to run several executables at once.
See my answer to this question: PHP on a windows machine; Start process in background. It's the same.
shell_exec('start /B "C:\Path\to\program.exe"');
The /B parameter is key here. I tried to find the topic for you again, but I can't seem to find it anymore. This works for me.
I hope this solves the problem for you.

php exec command (or similar) to not wait for result

I have a command I want to run, but I do not want PHP to sit and wait for the result.
<?php
echo "Starting Script";
exec('run_baby_run');
echo "Thanks, Script is running in background";
?>
Is it possible to have PHP not wait for the result, i.e. just kick it off and move along to the next command?
I can't find anything, and I'm not sure it's even possible. The best I could find was someone making a cron job to start in a minute.
From the documentation:
In order to execute a command and have it not hang your PHP script while it runs, the program you run must not output back to PHP. To do this, redirect both stdout and stderr to /dev/null, then background it.
> /dev/null 2>&1 &
In order to execute a command and have it spawned off as another process that is not dependent on the Apache thread to keep running (it will not die if somebody cancels the page), run this:
exec('bash -c "exec nohup setsid your_command > /dev/null 2>&1 &"');
You can run the command in the background by adding an & at the end of it, as:
exec('run_baby_run &');
But doing this alone will hang your script because:
If a program is started with exec function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So you can redirect the stdout of the command to a file, if you want to see it later or to /dev/null if you want to discard it as:
exec('run_baby_run > /dev/null &');
This uses wget to notify a URL of something without waiting.
$command = 'wget -qO- http://test.com/data=data';
exec('nohup ' . $command . ' >> /dev/null 2>&1 & echo $!', $pid);
This uses ls to update a log without waiting.
$command = 'ls -la > content.log';
exec('nohup ' . $command . ' >> /dev/null 2>&1 & echo $!', $pid);
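The echo $! at the end of those commands prints the background job's PID, which exec() captures into the second argument. A sketch of using it to check on the job later (the /proc lookup is Linux-specific, and the command is just a stand-in):
<?php
exec('nohup sleep 30 > /dev/null 2>&1 & echo $!', $output);
$pid = (int) $output[0]; // PID printed by "echo $!"

// On Linux, the /proc entry exists for as long as the process runs.
if (file_exists("/proc/$pid")) {
    echo "still running (pid $pid)\n";
}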
I know this question has been answered, but the answers I found here didn't work for my scenario (or for Windows).
I am using a Windows 10 laptop with PHP 7.2 in XAMPP v3.2.4.
$command = 'php Cron.php send_email "'. $id .'"';
if (substr(php_uname(), 0, 7) == "Windows")
{
    // windows
    pclose(popen("start /B " . $command . " 1> temp/update_log 2>&1 &", "r"));
}
else
{
    // linux
    shell_exec($command . " > /dev/null 2>&1 &");
}
This worked perfectly for me.
I hope it will help someone on Windows. Cheers.
There are two possible ways to implement it.
The easiest way is to redirect the output directly to /dev/null:
exec("run_baby_run > /dev/null 2>&1 &");
But in case you have any other operations to be performed, you may consider ignore_user_abort.
In this case the script will keep running even after you close the connection.
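A minimal sketch of the ignore_user_abort route (fastcgi_finish_request() only exists under PHP-FPM, hence the guard):
<?php
ignore_user_abort(true); // keep running if the client disconnects
set_time_limit(0);       // lift PHP's execution time limit

echo "Started, you can close the page now.\n";

// Under PHP-FPM, hand the response back to the client immediately.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// ... long-running work continues here ...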
"exec nohup setsid your_command"
the nohup allows your_command to continue even though the process that launched may terminate first. If it does, the the SIGNUP signal will be sent to your_command causing it to terminate (unless it catches that signal and ignores it).
On Windows, you may use the COM object:
if (class_exists('COM')) {
    $shell = new COM('WScript.Shell');
    // Run(command, window style, wait): 1 shows a normal window,
    // false tells it not to wait for the command to finish.
    $shell->Run($cmd, 1, false);
}
else {
    // Redirect all output so PHP does not wait for the command.
    exec('nohup ' . $cmd . ' > /dev/null 2>&1 &');
}
