PHP - exec() hangs the web application until the command finishes [duplicate]

I've got a PHP script that needs to invoke a shell script but doesn't care at all about the output. The shell script makes a number of SOAP calls and is slow to complete, so I don't want to slow down the PHP request while it waits for a reply. In fact, the PHP request should be able to exit without terminating the shell process.
I've looked into the various exec(), shell_exec(), pcntl_fork(), etc. functions, but none of them seem to offer exactly what I want. (Or, if they do, it's not clear to me how.) Any suggestions?

If it "doesn't care about the output", couldn't exec on the script be called with & to background the process?
EDIT - incorporating what @AdamTheHut commented on this post, you can add this to a call to exec:
" > /dev/null 2>/dev/null &"
That will redirect both stdout (the first >) and stderr (2>) to /dev/null and run the command in the background.
There are other ways to do the same thing, but this is the simplest to read.
An alternative to the above double-redirect:
" &> /dev/null &"
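Either form can be tried straight from a shell. A minimal sketch, with sleep standing in for the slow script, shows why the calling process is not held up:

```shell
# Simulate the slow task: both output streams go to /dev/null
# and the trailing & backgrounds it.
start=$(date +%s)
sleep 2 > /dev/null 2>/dev/null &
end=$(date +%s)
elapsed=$((end - start))
# The launching shell continues immediately instead of waiting 2 seconds.
echo "elapsed: ${elapsed}s"
```

The same redirection string is what gets appended to the command passed to PHP's exec().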

I used at for this, as it really starts an independent process.
<?php
`echo "the command"|at now`;
?>

To all Windows users: I found a good way to run an asynchronous PHP script (actually it works with almost anything).
It's based on the popen() and pclose() functions and works well on both Windows and Unix.
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        exec($cmd . " > /dev/null &");
    }
}
Original code from: http://php.net/manual/en/function.exec.php#86329

On Linux you can do the following:
$cmd = 'nohup nice -n 10 php -f php/file.php > log/file.log & printf "%u" $!';
$pid = shell_exec($cmd);
This will execute the command at the command prompt and then just return its PID, which you can check (greater than 0 means it worked).
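The PID-capture trick can be tried directly in a shell (a sketch, with sleep standing in for the PHP job):

```shell
# nohup detaches the job from the terminal's HUP signal;
# $! holds the PID of the most recently backgrounded process.
pid=$(nohup sleep 1 > /dev/null 2>&1 & printf "%u" $!)
# A PID greater than zero means the process was launched.
if [ "$pid" -gt 0 ]; then
    echo "launched pid $pid"
fi
```

This is exactly the string that shell_exec() returns in the PHP snippet above.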
This question is similar: Does PHP have threading?

php-execute-a-background-process has some good suggestions. I think mine is pretty good, but I'm biased :)

In Linux, you can start a command as a new independent process by appending an ampersand at the end of the command:
mycommand -someparam somevalue &
In Windows, you can use the "start" DOS command
start mycommand -someparam somevalue

The right way(!) to do it is to:
fork()
setsid()
execve()
fork() forks, setsid() tells the current process to become a session leader (no controlling parent), and execve() replaces the calling process with the called one, so the parent can quit without affecting the child.
$pid = pcntl_fork();
if ($pid == 0) {
    posix_setsid();
    pcntl_exec($cmd, $args, $_ENV);
    // child becomes the standalone detached process
}
// parent's stuff
exit();
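If the pcntl extension is unavailable, the same fork/setsid detachment can be approximated from the shell with the setsid(1) utility (a sketch; the util-linux setsid command is assumed to be installed, and sleep stands in for the real job):

```shell
# setsid runs the command in a new session, so it keeps running
# after the parent exits; the redirections cut the remaining ties.
setsid sleep 1 < /dev/null > /dev/null 2>&1 &
bg_pid=$!
echo "parent can exit now; child pid $bg_pid"
```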

I used this...
/**
 * Asynchronously execute/include a PHP file. Does not record the output of the file anywhere.
 * Relies on the PHP_PATH config constant.
 *
 * @param string $filename file to execute
 * @param string $options (optional) arguments to pass to the file via the command line
 */
function asyncInclude($filename, $options = '') {
    exec(PHP_PATH . " -f {$filename} {$options} >> /dev/null &");
}
(where PHP_PATH is a const defined like define('PHP_PATH', '/opt/bin/php5') or similar)
It passes in arguments via the command line. To read them in PHP, see argv.

I also found Symfony Process Component useful for this.
use Symfony\Component\Process\Process;
$process = new Process('ls -lsa');
// ... run process in background
$process->start();
// ... do other things
// ... if you need to wait
$process->wait();
// ... do things after the process has finished
See how it works in its GitHub repo.

The only way that I found that truly worked for me was:
shell_exec('./myscript.php | at now & disown')

You can also run the PHP script as daemon or cronjob: #!/usr/bin/php -q

Use a named fifo.
#!/bin/sh
mkfifo trigger
while true; do
    read < trigger
    long_running_task
done
Then whenever you want to start the long-running task, simply write a newline (non-blocking) to the trigger file.
As long as your input is smaller than PIPE_BUF and it's a single write() operation, you can write arguments into the fifo and have them show up as $REPLY in the script.
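A minimal round-trip with such a trigger fifo looks like this (a sketch; the paths are illustrative, and a single read stands in for the worker loop above):

```shell
tmp=$(mktemp -d)
mkfifo "$tmp/trigger"
# Reader: one iteration of the worker loop, run in the background.
( read REPLY < "$tmp/trigger"; echo "task arg: $REPLY" > "$tmp/out" ) &
# Writer: a single small write is atomic as long as it stays under PIPE_BUF.
echo "hello" > "$tmp/trigger"
wait
result=$(cat "$tmp/out")
echo "$result"
rm -r "$tmp"
```

The open for writing blocks until the reader has the fifo open, so the two sides synchronize without any polling.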

Without using a queue, you can use proc_open() like this:
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w") // here CuraEngine logs all the info to stderr
);
$command = 'ping stackoverflow.com';
$process = proc_open($command, $descriptorspec, $pipes);

I cannot use > /dev/null 2>/dev/null & on Windows, so I use proc_open instead. I run PHP 7.4.23 on Windows 11.
This is my code:
function run_php_async($value, $is_windows)
{
    if ($is_windows)
    {
        $command = 'php -q ' . $value . ' ';
        echo 'COMMAND ' . $command . "\r\n";
        proc_open($command, [], $pipe);
    }
    else
    {
        $command = 'php -q ' . $value . ' > /dev/null 2>/dev/null &';
        echo 'COMMAND ' . $command . "\r\n";
        shell_exec($command);
    }
}

$tasks = array();
$tasks[] = 'f1.php';
$tasks[] = 'f2.php';
$tasks[] = 'f3.php';
$tasks[] = 'f4.php';
$tasks[] = 'f5.php';
$tasks[] = 'f6.php';

$is_windows = true;
foreach ($tasks as $key => $value)
{
    run_php_async($value, $is_windows);
    echo 'STARTED AT ' . date('H:i:s') . "\r\n";
}
In each file to be executed, I put a delay like this:
<?php
sleep(mt_rand(1, 10));
file_put_contents(__FILE__.".txt", time());
All files are executed asynchronously.

Related

PHP: How to send ssh command and not wait until it is executed?

I have following function:
public function update($id) {
    $data = $this->client_model->get_client_by_id($id);
    $sshport = $data['ssh_port'];
    $sshcommand = '/var/script/config.sh';
    $this->sshcommand($sshport, $sshcommand);
    $this->session->set_flashdata('msg', 'Config has been sent');
    redirect(base_url('admin/edit/' . $id));
}
The sshcommand function looks like this:
private function sshcommand($port, $command) {
    $remotecommand = 'ssh -q root@localhost -o "StrictHostKeyChecking=no" -p' . $port . ' "' . $command . '" 2> /dev/null';
    $connection = ssh2_connect('controller.server.example.org', 22);
    ssh2_auth_pubkey_file($connection, 'root', '/root/.ssh/id_rsa.pub', '/root/.ssh/id_rsa');
    ssh2_exec($connection, $remotecommand);
}
My problem is that the update function waits until /var/script/config.sh has finished.
But in some cases that takes very long, so I just want to send the command and let it work in the background.
I tried changing it to /var/script/config.sh | at now + 1 minute but it's the same result.
Any Ideas?
Try using & with your command:
$sshcommand = '/var/script/config.sh &';
bash man page says:
If a command is terminated by the control operator &, the shell executes the command in the background in a subshell. The shell does not wait for the command to finish, and the return status is 0.
Ensure that the shell used by the ssh user supports that.
Alternatively you can try with nohup:
$sshcommand = 'nohup /var/script/config.sh';
This may also be shell-dependent. bash works; I'm not sure about plain sh, though.
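The quoted bash behavior is easy to verify locally (sleep standing in for config.sh):

```shell
# & makes the shell return immediately with status 0,
# which is why the ssh session no longer blocks.
sleep 2 &
status=$?
echo "status: $status (shell returned immediately)"
```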

PHP executes bash script, redirect output to database

I have a queue of tasks, each of which can/will take a while. A task is run using a bash script and a couple of parameters. When the user fires a task, he should be able to monitor its status (and thus its output). For this, the output should be stored in a database so the user interface can fetch the current output.
How can I best run fetch.sh so each line of output is inserted into the database, and not only when the script finishes?
Would something like this work?
shell_exec("./script.sh | while read -r line ; do
sqlite "insert into history (str) values ("$1");
done &> /dev/null &")
Perhaps something like this would work for you:
$db = new SQLite3('mydb');
if (($fp = popen("./script.sh", "r"))) {
    while (!feof($fp)) {
        $line = fread($fp, 1024);
        $db->exec("insert into history (str) values ('" . $db->escapeString($line) . "')");
    }
    pclose($fp); // a handle opened with popen() must be closed with pclose()
}
UPDATE
As it turns out, there is a relatively easy way to run this in the background.
Say our script is called logger.php; in that case we could do:
nohup php -q logger.php > script.log 2>&1 &

running shell_exec in php causes web server to hang

I am running the following code. It takes a text file, splits it into parts (named with a '_part' suffix) and then calls the same script with a flag to process the files, uploading the content to a Drupal system.
What happens is that the script runs and finishes the work, all invoked scripts finish too, and I can see the results. But each time after I run it, the web server stops responding. Is there anything basic that I am missing or doing wrong?
if (isset($argv[3])) {
    $isSplit = $argv[3] == 'true' ? true : false;
}
if ($isSplit) {
    $fileSplitter = new CSVFileParts($fileName);
    $parts = $fileSplitter->split_file();
    echo 'Split file into ' . $parts . ' parts' . PHP_EOL;
    for ($part = 0; $part < $parts; $part++) {
        echo shell_exec('php Service.php u ./partial_files/' . basename($fileName) . '.part_' . $part . ' false > /dev/null 2>/dev/null &');
    }
} else {
    $log->lwrite('uploading ' . $argv[2]);
    $drupalUploader = new DrupalUploader($fileName, $log);
    $drupalUploader->upload();
}
shell_exec — Execute command via shell and return the complete output as a string
shell_exec() expects the file handle to be open, but you redirect everything to /dev/null and detach the process.
Since you plan to detach the process and discard all the output, you should use exec() and escapeshellcmd() instead.
see: http://www.php.net/manual/en/function.exec.php

PHP exec() return value for background process (linux)

Using PHP on Linux, I'd like to determine whether a shell command run using exec() was successfully executed. I'm using the return_var parameter to check for a successful return value of 0. This works fine until I need to do the same thing for a process that has to run in the background. For example, in the following command $result returns 0:
exec('badcommand > /dev/null 2>&1 &', $output, $result);
I have put the redirect in there on purpose, I do not want to capture any output. I just want to know that the command was executed successfully. Is that possible to do?
Thanks, Brian
My guess is that what you are trying to do is not directly possible. By backgrounding the process, you are letting your PHP script continue (and potentially exit) before a result exists.
A workaround is to have a second PHP (or Bash, etc.) script that just does the command execution and writes the result to a temp file.
The main script would be something like:
$resultFile = '/tmp/result001';
touch($resultFile);
exec('php command_runner.php ' . escapeshellarg($resultFile) . ' > /dev/null 2>&1 &');

// do other stuff...

// Sometime later, when you want to check the result...
while (!strlen(file_get_contents($resultFile))) {
    sleep(5);
}
$result = intval(file_get_contents($resultFile));
unlink($resultFile);
And the command_runner.php would look like:
$outputFile = $argv[1]; // $argv[0] is the script name; the first argument is $argv[1]
exec('badcommand > /dev/null 2>&1', $output, $result);
file_put_contents($outputFile, $result);
It's not pretty, and there is certainly room for adding robustness and handling concurrent executions, but the general idea should work.
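The same result-file handshake can be sketched in plain shell (names are illustrative; false stands in for the failing command):

```shell
result_file=$(mktemp)
# Run the command in the background; write its exit status
# to the result file when it finishes.
( false > /dev/null 2>&1; echo $? > "$result_file" ) &
# ... later: collect the status (here wait substitutes for polling).
wait
result=$(cat "$result_file")
echo "exit status: $result"
rm -f "$result_file"
```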
Not with exec() alone. When you send a process to the background, exec() returns 0 and PHP continues execution; there's no way to retrieve the final result.
pcntl_fork() however will fork your application, so you can run exec() in the child process and leave it waiting until it finishes. Then exit() with the status the exec call returned.
In the parent process you can access that return code with pcntl_waitpid()
Just my 2 cents: how about using the || and && bash operators?
exec('ls && touch /tmp/res_ok || touch /tmp/res_bad');
And then check for file existence.
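The flag-file idea in isolation (a sketch; true stands in for the real command, and the paths are illustrative):

```shell
tmp=$(mktemp -d)
# Success path creates res_ok; failure path creates res_bad.
( true && touch "$tmp/res_ok" || touch "$tmp/res_bad" ) &
wait
[ -f "$tmp/res_ok" ] && result=ok || result=bad
echo "result: $result"
rm -r "$tmp"
```

PHP can then poll for the flag files with file_exists() instead of trying to read the exit code of a backgrounded process.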

Asynchronous shell exec in PHP

