How to make a non-blocking php exec call? - php

I need to echo text to a named pipe (FIFO) in Linux. Even though I'm running the command in the background with '&' and redirecting all output to /dev/null, the shell_exec call always blocks.
There are tons of answers to pretty much exactly this question all over the internet, and they all basically point to the following php manual section:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
And sure enough, when I try the non-blocking approach (backgrounding and redirecting to /dev/null) with other commands like sleep, php successfully executes without hanging. But when echoing to the FIFO, php hangs, even though running the same command in bash produces no visible output and immediately returns to the shell.
In bash, I can run:
bash$ { echo yay > fifo & } &> /dev/null
bash$ cat fifo
yay
[1]+ Done echo yay > fifo
but when running the following php file with php echo.php:
<?php
shell_exec("{ echo yay > fifo & } &> /dev/null");
?>
it hangs, unless I first open fifo for reading.
So my question is: why does this block, while sleep doesn't? In addition, I want to know what is happening behind the scenes. When I put the '&' in the php call, even though the shell_exec call blocks, the echo call clearly doesn't block whatever shell session php invoked it in, because when I CTRL+C out of php I can read 'yay' from the FIFO (if I don't background the echo command, the FIFO contains no text after CTRL+C). This suggests that php is waiting on the pid of the echo command before moving to the next instruction. Is this true?

I've been trying something similar and in the end came up with this solution:
/**
 * Runs a shell command on the server under the current PHP user; in CLI mode that is the user you are logged in with.
 * If the command is run in the background, the method returns the location of the temp file that captures the output.
 * In that case you will have to remove the temporary file manually.
 */
static public function command($cmd, $show_output = true, $escape_command = false, $run_in_background = false)
{
    if ($escape_command) {
        $cmd = escapeshellcmd($cmd);
    }
    $f = trim(`mktemp`);
    passthru($cmd . ($show_output ? " | tee $f" : " > $f") . ($run_in_background ? ' &' : ''));
    return $run_in_background ? $f : trim(`cat $f ; rm -rf $f`);
}
The trick is to write the output to a temporary file and return its contents when the command has finished (blocking behavior), or just return the file path (non-blocking behavior). Also, I'm using passthru rather than shell_exec, because shell_exec's blocking behavior makes interactive sessions impossible.
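A minimal usage sketch, assuming the method above sits in a hypothetical Shell helper class; the class name and the command being run are placeholders, not part of the answer:
<?php
// Hypothetical wrapper class; only command() itself comes from the snippet above.
$logFile = Shell::command('php long_job.php', false, false, true); // non-blocking: returns the temp file path immediately

// ... do other work, then collect and clean up the captured output later:
echo file_get_contents($logFile);
unlink($logFile);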

Related

Trying to run a shell script from php in the background

I have a PHP script which accesses a database table containing information about some large files that need to be downloaded. I intend to run this PHP script as a cron job on an hourly basis, and each time it runs it should do the following things:
Check if there are files that need to be downloaded
If there are, execute a shell script which issues a wget command and starts downloading the file in the background, and when it is done runs a second php script that updates the db tables to record completion of the download; also get back the process id of this shell script for later use
Check if there are files currently being downloaded; if there are, check whether their process id is still active, and if not, adjust the table so we know that an error occurred somewhere in the download
The shell script works as expected: if I run it from the console, everything works fine, and I also get back the process id of the shell script in my php file. My problem is that when the originating php file exits, the shell script it initiated stops as well.
Here's the code I use in php to start the shell script:
function runProcess($command, &$output = array()) {
    $command = $command . ' > /dev/null 2>&1 & echo $!';
    echo $command . "<BR>";
    return exec($command, $output);
}
/** excerpt from the class that does the processing */
$pid = runProcess("sh ".self::DOWNLOAD_FILE_SHELL." ".DEFAULT_DIR_WHOME." 1.xml ".$this->Parent->XMLPath, $output);
echo $pid;
My question is as follows: How can I force the shell script to continue running, even when the parent process (the php script) exits?
If you're running in a UNIX-like environment, the simplest way is to add an ampersand (&) at the end of the command line. This tells the shell to run the command without waiting for it to complete. Try this:
$command = $command . ' > /dev/null 2>&1 & echo $! &';
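A self-contained sketch of what that can look like in the asker's runProcess() helper. The script path and arguments are placeholders, and the nohup prefix is an assumption on top of this answer (it is what keeps the backgrounded child alive once the parent PHP process exits):
<?php
// Sketch only: paths are placeholders, nohup is an addition not present in the answer above.
function runProcess($command, array &$output = array()) {
    // nohup ignores SIGHUP so the child survives the parent exiting;
    // `echo $!` prints the PID of the backgrounded command, which exec() returns as its last line.
    $command = 'nohup ' . $command . ' > /dev/null 2>&1 & echo $!';
    return exec($command, $output);
}

$pid = runProcess('sh /path/to/download.sh /path/to/dir 1.xml /path/to/1.xml');
echo $pid; // PID of the detached shell script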

how to use php exec() to run another script in the background, and not wait for the script to finish

I am wanting to execute a large, database intensive script, but do not need to wait for the process to finish. I would simply like to call the script, let it run in the background and then redirect to another page.
EDIT:
I am working on a local Zend community server, on Windows 7.
I have access to remote linux servers where the project also resides, so I can do this on linux or windows.
I have this:
public function createInstanceAction()
{
    // calls a separate php process which creates the instance
    exec('php -f /path/to/file/createInstance.php');
    Mage::getSingleton('adminhtml/session')->addSuccess(Mage::helper('adminhtml')->__('Instance creation process started. This may take up to a few minutes.'));
    $this->_redirect('instances/adminhtml_instances/');
    return;
}
This works perfectly, but the magento application hangs around for the process to finish. It does everything I expect, logging to file from time to time, and I am happy with how it's running. Now all I would like is for this script to start, and for the controller action not to hang around but instead redirect, and that's that. From what I have learnt about exec(), you can do so by changing the way I call exec() above to:
exec('php -f /path/to/file/createInstance.php > /dev/null 2>&1 &');
which I took from here
If I add "> /dev/null 2>&1 &" to the exec call, it doesn't wait around as expected, but it no longer executes the script. Could someone tell me why, and if so, tell me how I can get this to work please?
Could this be a permission related issue?
Thanks
EDIT: I'm assuming it would be an issue to have any output logged to file if I call the exec function with (/dev/null 2>&1 &), as that would cancel the logging. Is that correct?
After taking time to fully understand my own question and the way it could be answered, I have prepared my solution.
Thanks to all for your suggestions, and for excusing my casual, unprepared question.
The answer to the above question depends on a number of things, such as the operating system you are referring to, which php modules you are running and even what webserver you are running. So if I had to ask the question again, the first thing I would do is state what my setup is.
I wanted to achieve this on two environments:
1.) Windows 7 running Zend server community edition.
2.) Linux (my OS is Linux odysseus 2.6.32-5-xen-amd64 #1 SMP Fri Sep 9 22:23:19 UTC 2011 x86_64)
To get this right, I wanted it to work either way when deploying to windows or linux, so I used php to determine what the operating system was.
public function createInstanceAction()
{
    // determines what operating system is being used
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN')
    {
        // This is a windows server.
        // Call a separate php process to run independently from the browser action.
        pclose(popen("start php /path/to/script/script.php", "r"));
    }
    else
    {
        // Assuming it's linux, but in fact it simply means it's not windows.
        // To check for linux specifically use (strtoupper(substr(PHP_OS, 0, 3)) === 'LIN')
        exec('php -f /path/to/file/script.php >/dev/null 2>&1 &');
    }
    // The browser will not hang around for this process to complete, and you can continue with whatever actions you want.
    // My script logs any output so I can capture info as it runs.
}
In short, ask questions once you understand them. There are many ways to achieve the above, and this is just one solution that works for my development and production environments.
Thanks for the help, all.
PHP popen
From the docs (this should help you do other stuff, while that process is working; not sure if closing the current PHP process will kill the opened process):
/* Add redirection so we can get stderr. */
$handle = popen('/path/to/executable 2>&1', 'r');
echo "'$handle'; " . gettype($handle) . "\n";
$read = fread($handle, 2096);
echo $read;
pclose($handle);
Solution 2:
Trick the browser into closing the connection (assuming there is a browser involved):
ob_start();
?><html><!--example html body--></html><?php
$strContents=ob_get_clean();
header("Connection: Close");
header("Content-encoding: none");//doesn't work without this, I don't know why:(
ignore_user_abort(true);
header("Content-type: text/html");
header("Content-Length: ".strlen($strContents));
echo $strContents;
flush();
//at this point a real browser would close the connection and finish rendering;
//crappy http clients like some curl implementations (and not only) would wait for the server to close the connection, then finish rendering/serving results...:(
//TODO: add long running operations here, exec, or whatever you have.
You could write a wrapper-script, say createInstance.sh like
#! /bin/bash
trap "" SIGHUP
php -f "$1" > logfile.txt 2>&1 &
Then you call the script from within PHP:
exec('bash "/path/to/file/createInstance.sh"');
which should detach the new php process almost instantly from the script. If that doesn't help, you might try to use SIGABRT, SIGTERM or SIGINT instead of SIGHUP; I don't know exactly which signal is sent.
I've been able to use:
shell_exec("nohup $command > /dev/null & echo $!")
Where $command is for example:
php script.php --parameter 1
I've noticed some strange behavior with this. For example, running the mysql command line doesn't work; only php scripts seem to work.
Also, running cd /path/to/dir && php nohup $command ... doesn't work either; I had to chdir() within the PHP script and then run the command for it to work.
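A minimal sketch of that chdir() workaround, with placeholder paths that are not from the original answer:
<?php
// Change the working directory from within PHP instead of prefixing the command with `cd`.
chdir('/path/to/dir'); // placeholder: the directory the script expects to run from

$command = 'php script.php --parameter 1';
// Same pattern as above: nohup + & detaches the child, `echo $!` returns its PID.
$pid = shell_exec("nohup $command > /dev/null & echo $!");
echo trim($pid);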
The PHP executable included with Zend Server seems to be what's causing attempts to run a script in the background (using the ampersand & operator in the exec) to fail.
We tested this using our standard PHP executable and it worked fine. It's something to do with the version shipped with Zend Server, though our limited attempts to figure out what was going on have not turned anything up.

Starting a daemon from PHP

For a website, I need to be able to start and stop a daemon process. What I am currently doing is
exec("sudo /etc/init.d/daemonToStart start");
The daemon process is started, but Apache/PHP hangs. Doing a ps aux revealed that sudo itself turned into a zombie process, effectively killing all further progress. Is this normal behavior when trying to start a daemon from PHP?
And yes, Apache has the right to execute the /etc/init.d/daemonToStart command. I altered the /etc/sudoers file to allow it to do so. No, I have not allowed Apache to be able to execute any kind of command, just a limited few to allow the website to work.
Anyway, going back to my question: is there a way to allow PHP to start daemons without creating a zombie process? I ask this because when I do the reverse, stopping an already started daemon, it works just fine.
Try appending > /dev/null 2>&1 & to the command.
So this:
exec("sudo /etc/init.d/daemonToStart > /dev/null 2>&1 &");
Just in case you want to know what it does/why:
> /dev/null - redirect STDOUT to /dev/null (blackhole it, in other words)
2>&1 - redirect STDERR to STDOUT (blackhole it as well)
& - detach the process and run it in the background
I had the same problem.
I agree with DaveRandom: you have to suppress every output (stdout and stderr). But there is no need to launch another process with a trailing '&': the exec() function can then no longer check the return code, and returns ok even if there is an error...
And I prefer to store the output in a temporary file, instead of blackholing it.
Working solution:
$temp = tempnam(sys_get_temp_dir(), 'php');
exec('sudo /etc/init.d/daemonToStart >'.$temp.' 2>&1');
Just read file content after, and delete temporary file:
$output = explode("\n", file_get_contents($temp));
@unlink($temp);
I have never tried starting a daemon from PHP, but I have tried running other shell commands, with much trouble. Here are a few things I have tried in the past:
As per DaveRandom's answer, append > /dev/null 2>&1 & to the end of your command. This redirects errors to standard output. You can then use this output to debug.
Make sure your webserver user's PATH contains all the binaries referenced inside your daemon script. You can check this by calling exec('echo $PATH; whoami;'). This will tell you the user PHP is running under and its current PATH variable.
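A small sketch of that environment check on a Linux host; it just runs the echo $PATH / whoami pair suggested above and prints the result:
<?php
// Prints the PATH the web server user sees and the user PHP is running as.
// Useful when a command works in your own terminal but not from PHP.
$output = array();
exec('echo $PATH; whoami', $output);
echo "PATH: " . $output[0] . "\n";
echo "User: " . $output[1] . "\n";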

php system() shell_exec() hangs the browser [duplicate]

Possible Duplicate:
Asynchronous shell exec in PHP
I need to run a java program in the background.
process.php contains:
shell_exec("php php_cli.php");
php_cli.php contains:
shell_exec("java -jar BiForce.jar settings.ini > log.txt");
I am calling process.php asynchronously using ajax.
When I click the link in the webpage that calls the ajax function (for running process.php), the webpage shows "loading". When I click other links at the same time it does not respond.
The java program takes about 24 hours to finish executing, so the user will not wait until the execution ends.
The problem is that the browser keeps on loading and does not go to other pages when a link is clicked.
I also tried with system(), but the same problem ....
Help will greatly be appreciated.
shell_exec waits for the command to finish, so that's what your script is doing.
If your command doesn't take any time to complete, then your script will not either.
You can call another PHP script from your original one without waiting for it to finish:
$processId = shell_exec(
"nohup " . // Runs a command, ignoring hangup signals.
"nice " . // "Adjusted niceness" :) Read nice --help
"/usr/bin/php -c " . // Path to your PHP executable.
"/path/to/php.ini -f " . // Path to your PHP config.
"/var/www/php_cli.php " . // Path to the script you want to execute.
"action=generate > /process.log " . // Log file.
"& echo $!" // Make sure it returns only the process id.
);
It is then possible to detect whether or not the script is finished by using this command:
exec('ps ' . $processId, $processState);
// exec returns the result of the command - but we need to store the process state.
// The third param is a referenced variable.
// The first line of $processState is the ps header;
// the second line is the process itself, and is only present while it is still running.
if (count($processState) < 2) {
    // Process has ended.
}
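Putting the two snippets above together, a rough polling sketch; the log path and the 5-second interval are placeholders:
<?php
// Start the long-running script detached, capturing its PID (as in the snippet above).
$processId = trim(shell_exec("nohup nice php -f /var/www/php_cli.php > /tmp/process.log & echo $!"));

do {
    sleep(5); // arbitrary polling interval
    $processState = array();
    exec('ps ' . $processId, $processState); // header line, plus one line while the process is alive
} while (count($processState) >= 2);

echo "php_cli.php has finished\n";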
You could call the command in the displayed page instead, appending an & at the end:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
This way the process is launched on the background.
Also, there is no need (unless required by your application) to create a process.php which itself calls php via a shell exec. You could achieve the same functionality via an include of the other file.
As in normal shell scripting you can use the ampersand to background the process:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
See Asynchronous shell exec in PHP .
First, you might want to redesign this concept. I am not sure exactly what these programs do, but clearly this can lead to potential problems...
This is what I suggest you do, instead of starting external processes via PHP:
Your ajax call creates (or reuses) a file in some temporary directory, probably using the user session to generate the file name (see the sketch after this answer)
Some data is written to the file, and the request ends
Your jar is launched separately, and runs indefinitely
At regular intervals, the Java program scans the temporary directory for new files, or checks whether some file has been modified
It parses the file and executes the 24-hour-long process, or adjusts any previous execution if necessary
Along the same lines, you could even use sockets to communicate with that Java program, or any other mechanism.
The advantage of having the Java program running all the time instead of starting a new process is to be able to reuse system resources within the lifetime of the application; for example, if your program is using DB connections, or any data, cache, etc.
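A rough sketch of the PHP side of that design: the ajax endpoint just drops a job file into a watch directory and returns immediately. The directory name and file contents are assumptions, not from the answer above; the long-running Java program would poll the same directory.
<?php
// Hypothetical watch directory polled by the long-running Java program.
$watchDir = sys_get_temp_dir() . '/biforce-jobs';
if (!is_dir($watchDir)) {
    mkdir($watchDir, 0770, true);
}

session_start();
// One job file per session, as suggested above; the contents are whatever the Java side expects.
$jobFile = $watchDir . '/' . session_id() . '.job';
file_put_contents($jobFile, "settings=settings.ini\nrequested_at=" . time() . "\n");

// The request ends here; no external process is started from PHP.
echo "queued";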

PHP on a windows machine; Start process in background

I'm looking for the best way, or really any way, to start a process from php in the background so that I can kill it later in the script.
Right now, I'm using: shell_exec($Command);
The problem with this is it waits for the program to close.
I want something that will have the same effect as nohup when I execute the shell command. This will allow me to run the process in the background, so that later in the script it can be closed. I need to close it because this script will run on a regular basis and the program can't be open when this runs.
I've thought of generating a .bat file to run the command in the background, but even then, how do I kill the process later?
The code I've seen for linux is:
$PID = shell_exec("nohup $Command > /dev/null & echo $!");
// Later on to kill it
exec("kill -KILL $PID");
EDIT: Turns out I don't need to kill the process
shell_exec('start /B "C:\Path\to\program.exe"');
The /B parameter is key here.
I can't seem to find where I found this anymore. But this works for me.
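For completeness, since the question originally asked how to kill the process later: one common approach on Windows (an assumption on my part, not something this answer relies on) is taskkill by image name:
<?php
// /IM matches the executable's image name, /F forces termination.
shell_exec('taskkill /F /IM program.exe');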
Will this function from the PHP Manual help?
function runAsynchronously($path, $arguments) {
    $WshShell = new COM("WScript.Shell");
    $oShellLink = $WshShell->CreateShortcut("temp.lnk");
    $oShellLink->TargetPath = $path;
    $oShellLink->Arguments = $arguments;
    $oShellLink->WorkingDirectory = dirname($path);
    $oShellLink->WindowStyle = 1;
    $oShellLink->Save();
    $oExec = $WshShell->Run("temp.lnk", 7, false);
    unset($WshShell, $oShellLink, $oExec);
    unlink("temp.lnk");
}
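A possible call, assuming a Windows host with the COM extension enabled; the paths are placeholders:
<?php
// Placeholder paths: point these at your actual PHP binary and script.
runAsynchronously('C:\\php\\php.exe', '-f C:\\scripts\\createInstance.php');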
Tried to achieve the same on a Windows 2000 server with PHP 5.2.8.
None of the solutions worked for me. PHP kept waiting for the response.
Found the solution to be:
$cmd = "E:\PHP_folder_path\php.exe E:\some_folder_path\backgroundProcess.php";
pclose(popen("start /B ". $cmd, "a")); // mode = "a" since I had some logs to edit
From the php manual for exec:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
i.e. pipe the output into a file and php won't wait for it:
exec('myprog > output.txt');
From memory, I believe there is a control character that you can prepend (like you do with @) to the exec family of commands that also prevents execution from pausing - can't remember what it is though.
Edit: Found it! On unix, programs executed with & appended will run in the background. Sorry, doesn't help you much.
On my Windows 10 and Windows Server 2012 machines, the only solution that worked reliably within pclose/popen was to invoke powershell's Start-Process command, as in:
pclose(popen('powershell.exe "Start-Process foo.bat -WindowStyle Hidden"','r'));
Or more verbosely if you want to supply arguments and redirect outputs:
pclose(popen(
    'powershell.exe "Start-Process foo.bat'
    . ' -ArgumentList \'bar\',\'bat\''
    . ' -WindowStyle Hidden'
    . ' -RedirectStandardOutput \'.\\console.out\''
    . ' -RedirectStandardError \'.\\console.err\'"',
    'r'
));
