This question already has answers here:
php execute a background process
(21 answers)
Closed 9 years ago.
If I run this Unix command directly in the shell:
$ sleep 100 &
sleep runs in the background as expected, and I can continue working in the command line.
But trying the same thing with shell_exec() and PHP, I get different results.
<?php
$sleep = $argv[1];
$shell = "sleep " . $sleep . " &";
shell_exec($shell);
?>
When executing php sleep.php 100, the command line hangs and won't accept any more commands until sleep finishes. I am not sure whether this is a nuance I am missing with shell_exec() / $argv in PHP, or with the Unix shell.
Thanks.
The shell_exec function is trying to capture the output of the command, which it can't finish doing while the command still holds the output pipe open. In fact, if you look at the PHP source code, shell_exec is implemented with a popen C call and then reads until end-of-file; because the backgrounded sleep inherits the pipe as its standard output, EOF never arrives until sleep exits, and the final pclose additionally waits on the shell child. The net effect is that the call doesn't return until the background process has finished.
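A minimal sketch of the usual fix, assuming the goal is only to detach the command: redirect its output so the pipe shell_exec() reads from closes immediately.

```php
<?php
// sleep.php — detached version of the question's script.
// Redirecting stdout and stderr means the backgrounded sleep no longer
// holds shell_exec()'s pipe open, so the call returns immediately.
$sleep = (int) ($argv[1] ?? 1);   // the cast also guards against shell injection
shell_exec("sleep $sleep > /dev/null 2>&1 &");
echo "detached\n";
```

Run as `php sleep.php 100`; the prompt comes back at once while sleep keeps running in the background.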
This question already has answers here:
Killing processes opened with popen()?
(3 answers)
Closed 1 year ago.
Is there a way to find the PID of the application I'm running with popen?
I know it's possible with proc_open, but I'm unlikely to be able to change my app's structure.
Alternatively, how can I stop a process that was opened with popen and is still running?
When encoding a stream with ffmpeg, I sometimes need to stop it.
function popen:
https://www.php.net/manual/en/function.popen.php
function pclose:
https://www.php.net/manual/en/function.pclose.php
Note that pclose will return -1 if it could not close the process (otherwise it returns the command's termination status, and it waits for the process rather than killing it).
If pclose cannot stop the process, use a system command to kill it instead, obtaining the PID of the child that popen forked:
Killing processes opened with popen()?
The very last answer there may work; otherwise use proc_open:
https://www.php.net/manual/en/function.proc-open.php
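A sketch of that last-answer trick, keeping popen(): have the shell print its own PID first, then exec the real command so the child keeps that PID, which you can kill later. (`sleep 100` stands in here for the ffmpeg command.)

```php
<?php
// Keep popen(), but learn the child's PID: the shell echoes its own
// PID ($$), then exec replaces the shell with the real command, which
// therefore keeps that same PID.
$handle = popen('echo $$; exec sleep 100', 'r');
$pid    = (int) fgets($handle);    // the first line written is the PID
// ... later, when the stream should stop:
shell_exec("kill $pid");           // SIGTERM the encoder
pclose($handle);                   // now returns promptly
```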
This question already has answers here:
Running PHP script from command line as background process
(4 answers)
Closed 2 years ago.
From a PHP script I need to launch another PHP script in the background.
I expected this to work:
shell_exec("php mySecondScript.php &");
Execution of the main script hangs and the second script is not even started. Of course, if I remove the '&' the script is executed, but synchronously. Any reason for this?
You aren't redirecting the command's output.
shell_exec("php mySecondScript.php > /dev/null &");
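A sketch of the complete one-liner, also discarding stderr so any errors from the second script don't clutter the parent's terminal (mySecondScript.php is the question's file name):

```php
<?php
// Fire-and-forget: with stdout discarded and the command backgrounded,
// shell_exec() has nothing left to read and returns immediately while
// the second script keeps running; stderr is discarded too.
shell_exec('php mySecondScript.php > /dev/null 2>&1 &');
```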
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Make PHP wait for Matlab script to finish executing
Okay, starting from php execute a background process,
running a background process works great. The problem is, I need the return code of that process as well. The obvious solution to me is:
$cmd = "($cmd > $outputfile 2>&1 || echo $? > $returnfile) & echo $! > $pidfile";
exec($cmd);
When I run the generated command on the command line, it goes to the background and the files are filled in as expected. The problem is that when PHP's exec() runs it, the command doesn't go to the background (at least, exec doesn't return until the command finishes). I tried variations with nohup and wait $pid, but still no solution.
Any thoughts?
This is tricky: you could potentially fork the process to do something else, leaving the original process in place.
http://php.net/manual/en/function.pcntl-fork.php
However, if this is a web application, there's no built-in way to retrieve the return code or STDOUT back into the parent process since it's technically async (your request-response cycle will likely end before a result can be produced).
You could store the return code and / or STDOUT to files to check later, though.
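A sketch of that fork approach, assuming the pcntl extension is available (CLI only; `sleep 1` and the /tmp path are illustrative stand-ins for the question's $cmd and $returnfile): the child runs the command and records its exit code in a file the parent can poll later.

```php
<?php
// Fork: the parent continues immediately; the child runs the command
// and writes its exit code where the parent can find it later.
$returnfile = '/tmp/job.ret';          // illustrative path
$pid = pcntl_fork();
if ($pid === -1) {
    exit("fork failed\n");
} elseif ($pid === 0) {
    exec('sleep 1', $output, $code);   // child: run the job
    file_put_contents($returnfile, $code);
    exit(0);
}
echo "parent continues; child pid $pid\n";
```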
This question already has answers here:
php exec command (or similar) to not wait for result
(7 answers)
Closed 7 years ago.
I have a PHP script that queries a database for a list of jobs to be done and fires off other PHP scripts based on what it finds in the database (basically a process queue).
Some of the scripts that the queue runner script executes may take 30 seconds or so to finish running (generating PDFs, resizing images, etc).
The problem is that shell_exec() in the queue runner script calls the processing scripts, but then doesn't wait for them to finish, resulting in the queue not being completed.
Queue runner script:
#!/usr/bin/php
<?php
// Loop through database and find jobs to be done
shell_exec(sprintf("/root/scripts/%s.php", $row['jobName']));
?>
Job script:
#!/usr/bin/php
<?php
shell_exec("/usr/bin/htmldoc -t pdf --webpage test.html > test.pdf");
// Update database to mark job as completed
?>
Running the job script directly from the command line works and the PDF is created.
Any ideas on how to fix this? Or a better way to run a process queue?
Try this:
shell_exec("nohup /usr/bin/htmldoc -t pdf --webpage test.html > test.pdf 2>&1 &");
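If the goal is instead to detach each job at the queue-runner level, the same nohup pattern works there too. A sketch (`$jobs` is stubbed in here for the rows fetched from the database; the /root/scripts path is from the question):

```php
#!/usr/bin/php
<?php
// Launch every queued job detached: nohup shields it from hangups,
// output goes to /dev/null, and '&' backgrounds it so shell_exec()
// returns at once instead of serializing the jobs.
$jobs = [['jobName' => 'makePdf']];    // illustrative rows
foreach ($jobs as $row) {
    $script = sprintf('/root/scripts/%s.php', $row['jobName']);
    shell_exec(sprintf('nohup php %s > /dev/null 2>&1 &',
                       escapeshellarg($script)));
}
```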
I'm currently running these lines in a PHP script:
foreach ($list as $route) {
    exec('php-cgi ./nextbus-route_stop_predictions.php route=' . $route['route_id']);
}
nextbus-route_stop_predictions.php takes about 15 seconds to run, and exec() is run about 12 times. I was wondering if PHP is able to run these commands without needing to wait for the output of the previous exec(). Basically, I want to run them asynchronously, as multiple processes.
Update
For anyone still looking for the answer, I used nohup and piped the output to /dev/null
pcntl_fork() may be able to help if you're on *nix.
The pcntl_fork() function creates a child process that differs from the parent process only in its PID and PPID. Please see your system's fork(2) man page for specific details as to how fork works on your system.
If you are running PHP as an Apache module, you can do the trick mentioned here:
http://joseph.randomnetworks.com/archives/2005/10/21/fake-fork-in-php/
exec("php-cgi ./nextbus-route_stop_predictions.php route=" . $route['route_id'] . " 2>/dev/null >&- <&- >/dev/null &");
Basically redirecting stderr and stdout, and sending the process into the background.
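Put together for the question's loop, a sketch (the script path and `$route['route_id']` are from the question; `$list` is stubbed for illustration):

```php
<?php
// Launch all route queries concurrently: each command is backgrounded
// with its output discarded, so the loop completes in milliseconds and
// the ~12 php-cgi processes run in parallel.
$list = [['route_id' => '22'], ['route_id' => '38']];   // illustrative rows
foreach ($list as $route) {
    exec('php-cgi ./nextbus-route_stop_predictions.php route='
         . escapeshellarg($route['route_id'])
         . ' > /dev/null 2>&1 &');
}
```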