I have a PHP script that accesses a database table containing information about large files that need to be downloaded. I intend to run this script as an hourly cron job, and each time it runs it should do the following:
Check whether there are files that need to be downloaded
If there are, execute a shell script that runs a wget command and starts downloading the file in the background; when the download finishes, the shell script runs a second PHP script that marks the download as complete in the database. The calling PHP script should also get back the process ID of the shell script for later use
Check whether any files are currently being downloaded; if there are, check whether their process IDs are still active, and if not, update the table so we know an error occurred somewhere in the download
The shell script works as expected when I run it from the console, and I also get the shell script's process ID back in my PHP file. My problem is that when the originating PHP script exits, the shell script it started stops too.
Here's the code I use in PHP to start the shell script:
function runProcess($command, &$output = array()) {
    // Discard the command's output, background it, and echo its PID.
    $command = $command . ' > /dev/null 2>&1 & echo $!';
    echo $command . "<BR>";
    return exec($command, $output);
}

/** Excerpt from the class that does the processing. */
$pid = runProcess("sh " . self::DOWNLOAD_FILE_SHELL . " " . DEFAULT_DIR_WHOME . " 1.xml " . $this->Parent->XMLPath, $output);
echo $pid;
My question is as follows: how can I force the shell script to continue running even when the parent process (the PHP script) exits?
If you're running in a UNIX-like environment, the simplest way is to add an ampersand (&) at the end of the command line. This tells the shell to run the command without waiting for it to complete. Try this:
$command = $command . ' > /dev/null 2>&1 & echo $! &';
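For what it's worth, a common variant (a sketch, not part of the answer above) also prefixes the command with nohup, so the child ignores the hangup signal it receives when the parent shell exits, while echo $! still returns the PID:

function runDetached($command) {
    // nohup detaches the child from hangup signals; the redirects free
    // stdout/stderr so the call can return immediately; echo $! prints the PID.
    return (int) trim(shell_exec('nohup ' . $command . ' > /dev/null 2>&1 & echo $!'));
}

// Hypothetical call, mirroring the question's arguments:
$pid = runDetached('sh download.sh /path/to/dir 1.xml http://example.com/file.xml');
echo $pid;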
I need to echo text to a named pipe (FIFO) in Linux. Even though I'm running the command in the background with '&' and redirecting all output to /dev/null, the shell_exec call always blocks.
There are tons of answers to pretty much exactly this question all over the internet, and they all basically point to the following section of the PHP manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
And sure enough, when I try the non-blocking approach (backgrounding and redirecting to /dev/null) with other commands such as sleep, PHP executes without hanging. But when echoing to the FIFO, PHP hangs, even though running the same command in bash produces no visible output and returns to the shell immediately.
In bash, I can run:
bash$ { echo yay > fifo & } &> /dev/null
bash$ cat fifo
yay
[1]+ Done echo yay > fifo
but when I run the following PHP file with php echo.php:
<?php
shell_exec("{ echo yay > fifo & } &> /dev/null");
?>
it hangs unless I first open the FIFO for reading.
So my question is: why does this block when sleep doesn't? I'd also like to know what is happening behind the scenes. When I put the '&' in the PHP call, even though the shell_exec call blocks, the echo clearly doesn't block the shell session PHP invoked it in: when I Ctrl+C out of PHP, I can read 'yay' from the FIFO (whereas if I don't background the echo command, the FIFO contains no text after Ctrl+C). This suggests that PHP is perhaps waiting on the PID of the echo command before moving to the next instruction. Is that true?
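One way to probe that hypothesis (a sketch; it assumes /bin/sh applies redirections left to right, as POSIX shells do): point fd 1 at /dev/null before pointing it at the FIFO. If shell_exec returns immediately, the hang comes from the blocked child still holding the inherited stdout pipe while it waits in open(), not from PHP waiting on the child's PID:

<?php
// fd 1 goes to /dev/null first, then to the FIFO; by the time the open()
// on the FIFO blocks, no descriptor in the child points back at the pipe
// that shell_exec reads from, so shell_exec can see EOF and return.
shell_exec("echo yay > /dev/null 2>&1 > fifo &");
?>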
I've been trying something similar and in the end came up with this solution:
/**
 * Runs a shell command on the server as the current PHP user
 * (in CLI mode that is the user you are logged in as).
 * If the command is run in the background, the method returns the location
 * of the temp file that captures the output; in that case you will have to
 * remove the temporary file manually.
 */
public static function command($cmd, $show_output = true, $escape_command = false, $run_in_background = false)
{
    if ($escape_command) {
        $cmd = escapeshellcmd($cmd);
    }

    // Create a temp file to capture the command's output.
    $f = trim(`mktemp`);

    // tee shows output live while also capturing it; plain redirection only captures.
    passthru($cmd . ($show_output ? " | tee $f" : " > $f") . ($run_in_background ? ' &' : ''));

    // Blocking: return the captured output and clean up the temp file.
    // Non-blocking: return the temp file path.
    return $run_in_background ? $f : trim(`cat $f ; rm -rf $f`);
}
The trick is to write the output to a temporary file and either return its contents once the command has finished (blocking behavior) or just return the file path (non-blocking behavior). I'm also using passthru rather than shell_exec, because shell_exec buffers all output until the command exits, which makes interactive sessions impossible.
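A usage sketch (the Shell class name is hypothetical; the method above would live in whatever utility class you keep it in):

// Blocking: returns the command's output once it has finished.
$listing = Shell::command('ls -la', false);

// Non-blocking: returns the path of the temp file capturing the output;
// the caller has to delete that file afterwards.
$logfile = Shell::command('ffmpeg -i input.mpg output.m4v', false, false, true);
echo "output is being written to $logfile";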
I have the following setup: on Mac OS X, Apache invokes a PHP script, which uses the system function to call a Matlab script, which in turn makes its own system invocation to run some shell commands.
However, it seems that no commands actually run (I tried a simple echo into a file), and when I try to capture the command's output using the [status, cmdout] = system() signature, status and cmdout both end up empty.
This doesn't happen if I run the Matlab script manually from the command line (the echo and all other system calls run as normal).
Thanks in advance!
EDIT:
PHP code:
$system_call_string_2 = "matlab -nosplash -nodesktop -r 'run ../users/".$user."/projects/".$project."/processing.m' > /dev/null &";
system($system_call_string_2);
processing.m calls a function from a file called data_file_load_online.m:
system(['grep "CGHv1_Ca_\|CGH_Ca_" ' file_dir '/' file_name ' > ' file_dir '/CGH_rows.xls']);
To be clear, I've already checked that this code is reached, verified the path is correct, substituted the command with a simple echo into a file, and run the Matlab script manually to make sure it works that way.
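One thing worth trying as a debugging aid (a sketch; the log path is an example, not from the original setup): capture Matlab's output instead of discarding it, since processes launched by Apache usually run with a different PATH, HOME, and user than an interactive shell:

$system_call_string_2 = "matlab -nosplash -nodesktop -r 'run ../users/".$user."/projects/".$project."/processing.m' > /tmp/matlab_run.log 2>&1 &";
system($system_call_string_2);
// /tmp/matlab_run.log should then show startup errors such as
// "matlab: command not found" or license/display problems.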
I'm working on a web application that needs to send a lot of HTTP requests and update a table, which blocks PHP from executing anything else. So I thought I might have to write a separate PHP script and run it from my main application. I tried exec, but the program still waits until the script has finished:
exec('php do_job.php');
I even tried redirecting the output to a file, as PHP.net suggests:
Note: If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
$result = exec('php do_job.php > output.txt &',$output);
But still no success... Further down the same page I came across this:
$command = 'php do_job.php';
$shell = new COM("WScript.Shell");
$shell->run($command, 0, false);
Still no success... Lastly I tried:
pclose(popen("start /B ". $command, "r"));
What am I doing wrong here?
I'm developing my app on localhost (XAMPP on Windows); later I'll release it on a Linux host. My last resort would be to run the script via cron jobs. Is this the only way?
By just adding & at the end of the command, you can run any command in the background. Here I'm redirecting output to /dev/null to avoid the hang (on Linux):
exec('php do_job.php > /dev/null &');
If you want to see the output of the command, you can redirect it to a file:
exec('php do_job.php > full_path_to_file &');
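Since the question mentions developing on Windows (XAMPP) before deploying to Linux, here is a hedged cross-platform sketch that combines this answer with the start /B approach already tried in the question:

function runInBackground($cmd) {
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
        // "start /B" launches the command without waiting or opening a new window.
        pclose(popen('start /B ' . $cmd, 'r'));
    } else {
        // Redirect output and background the process so exec() returns at once.
        exec($cmd . ' > /dev/null 2>&1 &');
    }
}

runInBackground('php do_job.php');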
Possible Duplicate:
Asynchronous shell exec in PHP
I need to run a Java program in the background.
process.php contains
shell_exec("php php_cli.php")
php_cli.php contains
shell_exec("java -jar BiForce.jar settings.ini > log.txt");
I am calling process.php asynchronously using AJAX.
When I click the link on the web page that triggers the AJAX call (to run process.php), the page shows "loading", and when I click other links at the same time it does not respond.
The Java program takes about 24 hours to finish executing, so the user will not wait until the execution ends.
The problem is that the browser keeps loading and does not navigate to other pages when a link is clicked.
I also tried system(), but had the same problem...
Help will be greatly appreciated.
shell_exec waits for the command to finish, so that's what your script is doing.
If your command doesn't take any time to run, your script won't either.
You can call another PHP script from your original one without waiting for it to finish:
$processId = shell_exec(
    "nohup " .                          // Run the command, ignoring hangup signals.
    "nice " .                           // "Adjusted niceness" :) Read nice --help.
    "/usr/bin/php " .                   // Path to your PHP executable.
    "-c /path/to/php.ini " .            // Path to your PHP config.
    "-f /var/www/php_cli.php " .        // Path to the script you want to execute.
    "action=generate > /process.log " . // Pass an argument and log the output.
    "& echo $!"                         // Background it and return only the process ID.
);
It is then possible to detect whether the script has finished by using this command:
exec('ps ' . $processId, $processState);
// exec() returns only the last line of output, so the full output is
// captured in $processState via the referenced second parameter.
// The first element of $processState is the ps column header;
// the second element is the process line itself, present only while
// the process is still running.
if (count($processState) < 2) {
    // Process has ended.
}
You could run the command from the page being displayed, appending an & at the end:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
This way the process is launched in the background.
Also, there is no need (unless your application requires it) to create a process.php which itself calls PHP via shell_exec. You could achieve the same functionality with an include of the other file.
As in normal shell scripting, you can use the ampersand to background the process:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
See Asynchronous shell exec in PHP.
First, you might want to redesign this concept. I am not sure exactly what these programs do, but clearly this can lead to potential problems...
This is what I suggest you do, instead of starting external processes via PHP:
Your AJAX call creates (or reuses) a file in some temporary directory (probably using the user session to generate the file name)
some data is written to the file, and the request ends
Your jar is launched separately and runs indefinitely
At regular intervals, the Java program scans the temporary directory for new files, or for files that have been modified
then it parses them and executes the 24-hour process, or adjusts any previous execution if necessary
Along the same lines, you could even use sockets to communicate with the Java program, or any other mechanism.
The advantage of having the Java program running all the time, instead of starting a new process each time, is that it can reuse system resources within the lifetime of the application, for example DB connections, cached data, etc.
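The PHP side of that design could look like this sketch (the queue directory and the JSON job format are assumptions, not part of the answer):

// The AJAX endpoint only records the job; the long-running Java program
// picks the file up on its next scan of the queue directory.
$queueDir = '/tmp/biforce-queue'; // assumed location
if (!is_dir($queueDir)) {
    mkdir($queueDir, 0777, true);
}
$job = array(
    'user'      => 'some-user-id', // placeholder, e.g. taken from the session
    'settings'  => 'settings.ini',
    'submitted' => time(),
);
file_put_contents($queueDir . '/' . uniqid('job_', true) . '.json', json_encode($job));
// The request ends here; PHP never starts an external process.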
I am trying to manage a queue of files waiting to be processed by ffmpeg. A page run by cron walks through a database of files waiting to be processed, builds the commands, and sends them to the command line using exec().
However, when the PHP page is run from the command line or cron, the exec() call completes fine, but control never returns to the page to continue updating the database and running other functions.
Example:
<?php
$cmd = "ffmpeg inpupt.mpg output.m4v";
exec($cmd . ' 2>&1', $output, $return);
//Page continues...but not executed
$update = mysql_query("UPDATE.....");
?>
When this page is run from the command line, the command is executed via exec(), but then the rest of the page is not. I think the problem may be that I am running a command with exec() in a page that is itself run from the command line.
Is it possible to run a PHP page in full from the command line which includes exec()?
Or is there a better way of doing this?
Thank you.
I wrote an article about Running a Background Process from PHP on Linux some time ago:
<?php system( 'sh test.sh >/dev/null &' ); ?>
Notice the & operator at the end. This starts a process that returns control to the shell immediately AND CONTINUES TO RUN in the background.
More examples:
<!--
Saving standard output to a file: very important when your process runs
in the background, as this is the only way it can report errors or success.
-->
<?php system( 'sh test.sh >test-out.txt &' ); ?>
<!--
Saving standard output and standard error to separate files: same as above,
but most programs log errors to standard error, so it's better to capture both.
-->
<?php system( 'sh test.sh >test-out.txt 2>test-err.txt &' ); ?>
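Combining the article's redirects with the PID capture used in earlier answers (a sketch), so the background job can be monitored or killed later:

<?php
$pid = (int) trim(shell_exec('sh test.sh >test-out.txt 2>test-err.txt & echo $!'));
echo "started background job with PID $pid";
?>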
Have you tried using CURL instead?
I'm not sure, but that's probably due to the shell constraints of cron processes. If it works as a web page, then use it as a web page: set up a cron job that calls wget wherever_your_page_is, so it is invoked via your web server and should mimic your tests.