running shell_exec in php causes web server to hang - php

I am running the following code. It takes a text file, splits it into parts with a '_part' suffix, and then calls the same script with a flag to process those files, uploading the content to a Drupal system.
What happens is that the script runs and finishes its work, all the invoked scripts finish too, and I can see the results. But each time after I run it, the web server stops responding. Is there anything basic that I am missing or doing wrong?
$isSplit = false;
if (isset($argv[3])) {
    $isSplit = $argv[3] == 'true';
}
if ($isSplit) {
    $fileSplitter = new CSVFileParts($fileName);
    $parts = $fileSplitter->split_file();
    echo 'Split file into '.$parts.' parts'.PHP_EOL;
    for ($part = 0; $part < $parts; $part++) {
        echo shell_exec('php Service.php u ./partial_files/'.basename($fileName).'.part_'.$part.' false > /dev/null 2>/dev/null &');
    }
} else {
    $log->lwrite('uploading '.$argv[2]);
    $drupalUploader = new DrupalUploader($fileName, $log);
    $drupalUploader->upload();
}

shell_exec — Execute command via shell and return the complete output as a string
shell_exec waits to read the child process's output, but you redirect everything to /dev/null and detach the process, so there is nothing for it to read.
Since you plan to detach the process and discard all of its output, you should use exec() together with escapeshellcmd() instead.
see: http://www.php.net/manual/en/function.exec.php
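A minimal sketch of that suggestion (the Service.php invocation mirrors the question; the detached() helper is hypothetical, and escapeshellarg() is used here to quote the single path argument):

```php
<?php
// Hypothetical helper: wrap a command so exec() detaches it.
// shell_exec() would block trying to read the child's stdout;
// exec() with full redirection plus '&' returns immediately.
function detached(string $cmd): string {
    return $cmd . ' > /dev/null 2>&1 &';
}

// Usage, mirroring the question's per-part invocation:
$part = 0; // illustrative part index
exec(detached('php Service.php u '
    . escapeshellarg('./partial_files/data.csv.part_' . $part) . ' false'));
```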

Related

PHP - exec() hang the web application until the command finished [duplicate]

I've got a PHP script that needs to invoke a shell script but doesn't care at all about the output. The shell script makes a number of SOAP calls and is slow to complete, so I don't want to slow down the PHP request while it waits for a reply. In fact, the PHP request should be able to exit without terminating the shell process.
I've looked into the various exec(), shell_exec(), pcntl_fork(), etc. functions, but none of them seem to offer exactly what I want. (Or, if they do, it's not clear to me how.) Any suggestions?
If it "doesn't care about the output", couldn't the exec to the script be called with the & to background the process?
EDIT - incorporating what @AdamTheHut commented to this post, you can add this to a call to exec:
" > /dev/null 2>/dev/null &"
That will redirect both stdout (the first >) and stderr (2>) to /dev/null and run the command in the background.
There are other ways to do the same thing, but this is the simplest to read.
An alternative to the above double-redirect:
" &> /dev/null &"
I used at for this, as it really starts an independent process.
<?php
`echo "the command"|at now`;
?>
To all Windows users: I found a good way to run an asynchronous PHP script (actually it works with almost everything).
It's based on the popen() and pclose() functions, and works well on both Windows and Unix.
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        exec($cmd . " > /dev/null &");
    }
}
Original code from: http://php.net/manual/en/function.exec.php#86329
On linux you can do the following:
$cmd = 'nohup nice -n 10 php -f php/file.php > log/file.log & printf "%u" $!';
$pid = shell_exec($cmd);
This will execute the command at the command prompt and then just return the PID, which you can check (> 0) to ensure it worked.
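A quick way to check that pattern from a plain shell ('sleep 1' stands in for the php invocation):

```shell
# Launch a detached job and capture its PID, as in the answer above.
# nohup's own messages are silenced by the redirection.
pid=$(nohup sleep 1 > /dev/null 2>&1 & printf "%u" $!)
echo "started pid $pid"
```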
This question is similar: Does PHP have threading?
php-execute-a-background-process has some good suggestions. I think mine is pretty good, but I'm biased :)
On Linux, you can start a command as an independent background process by appending an ampersand to the end of the command
mycommand -someparam somevalue &
In Windows, you can use the "start" DOS command
start mycommand -someparam somevalue
The right way(!) to do it is to
fork()
setsid()
execve()
fork creates the child, setsid makes the current process a session leader (no controlling parent), and execve replaces the calling process with the called one, so the parent can quit without affecting the child.
$pid = pcntl_fork();
if ($pid == 0) {
    posix_setsid();
    pcntl_exec($cmd, $args, $_ENV);
    // child becomes the standalone detached process
}
// parent's stuff
exit();
I used this...
/**
 * Asynchronously execute/include a PHP file. Does not record the output of the file anywhere.
 * Relies on the PHP_PATH config constant.
 *
 * @param string $filename file to execute
 * @param string $options (optional) arguments to pass to the file via the command line
 */
function asyncInclude($filename, $options = '') {
    exec(PHP_PATH . " -f {$filename} {$options} >> /dev/null &");
}
(where PHP_PATH is a const defined like define('PHP_PATH', '/opt/bin/php5') or similar)
It passes in arguments via the command line. To read them in PHP, see argv.
I also found Symfony Process Component useful for this.
use Symfony\Component\Process\Process;
$process = new Process('ls -lsa');
// ... run process in background
$process->start();
// ... do other things
// ... if you need to wait
$process->wait();
// ... do things after the process has finished
See how it works in its GitHub repo.
The only way that I found that truly worked for me was:
shell_exec('./myscript.php | at now & disown')
You can also run the PHP script as a daemon or cron job: #!/usr/bin/php -q
Use a named fifo.
#!/bin/sh
mkfifo trigger
while true; do
    read < trigger
    long_running_task
done
Then whenever you want to start the long-running task, simply write a newline (non-blocking) to the trigger file.
As long as your input is smaller than PIPE_BUF and it's a single write() operation, you can write arguments into the fifo and have them show up as $REPLY in the script.
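A runnable miniature of the fifo trigger (file names are illustrative):

```shell
# One background "worker" does a single read, then runs the task
# with whatever arrived in $REPLY.
fifo=/tmp/trigger.$$
out=/tmp/out.$$
mkfifo "$fifo"
( read REPLY < "$fifo"; echo "task got: $REPLY" > "$out" ) &
echo "arg1" > "$fifo"    # a single write() well under PIPE_BUF
wait
result=$(cat "$out")
rm -f "$fifo" "$out"
echo "$result"
```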
Without using a queue, you can use proc_open() like this:
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w") // here CuraEngine logs all the info to stderr
);
$command = 'ping stackoverflow.com';
$process = proc_open($command, $descriptorspec, $pipes);
I cannot use > /dev/null 2>/dev/null & on Windows, so I use proc_open instead. I run PHP 7.4.23 on Windows 11.
This is my code.
function run_php_async($value, $is_windows)
{
    if ($is_windows) {
        $command = 'php -q ' . $value . ' ';
        echo 'COMMAND ' . $command . "\r\n";
        proc_open($command, [], $pipes);
    } else {
        $command = 'php -q ' . $value . ' > /dev/null 2>/dev/null &';
        echo 'COMMAND ' . $command . "\r\n";
        shell_exec($command);
    }
}
$tasks = array();
$tasks[] = 'f1.php';
$tasks[] = 'f2.php';
$tasks[] = 'f3.php';
$tasks[] = 'f4.php';
$tasks[] = 'f5.php';
$tasks[] = 'f6.php';
$is_windows = true;
foreach ($tasks as $key => $value) {
    run_php_async($value, $is_windows);
    echo 'STARTED AT ' . date('H:i:s') . "\r\n";
}
In each file to be executed, I put a delay like this:
<?php
sleep(mt_rand(1, 10));
file_put_contents(__FILE__.".txt", time());
All files are executed asynchronously.

How can I script a 'shutdown -r 1' and return an exit status?

I am writing a program that will at some point call a shell script. I need this shell script (bash, or if necessary PHP 4+ will work) to be called by the program, and return an exit status that I can relay before the 1 minute is reached and the system reboots.
Here's an idea of what I mean, best as I can describe:
Program calls 'reboot' script
Reboot script runs 'shutdown -r 1' and then exits with a status of 0
Program echoes out the exit status
Server reboots
I can get everything to work except the exit status: no matter what I try, the program never exits its loop waiting for an exit status, so it never returns anything, but the reboot still occurs. This program runs other scripts that return exit statuses, so I need this one to do the same to maintain functionality.
Any help is appreciated!
EDIT - The program that calls the reboot script is a PHP script that runs in a loop. When certain events happen, the program runs certain scripts and echoes out the exit status. All of them work but this one - it never returns an exit status.
Scripts are being called using system($cmd) where $cmd is './scriptname.sh'
Assuming you're opening the process using proc_open, then calling proc_get_status should return an array that has the exit code in it.
You could create a bash script that backgrounds the shutdown process:
#!/bin/bash
shutdown -r 1 &
exit 0
This returns control to the parent shell, which receives "0" as the exit code.
Unfortunately, you can't rely on PHP's system() and exec() functions to retrieve the proper return value, but with a nice little workaround in BASH, it's possible to parse the exit code really effectively:
function runthis($command) {
    $output = array();
    $retcode = -1;
    $command .= " 2>&1; echo $?";
    exec($command, $output, $retcode);
    $retcode = intval(array_pop($output));
    return $retcode;
}
if (runthis("shutdown -r 1") !== 0) echo "Command failed!\n";
Let me break down what the code is doing:
$command .= " 2>&1; echo $?"; - extend the command so we pipe stderr into stdout, then run echo $?
echo $? - this special bash parameter expands to the last executed command's exit code.
exec($command, $output, $retcode); - execute the command. ($retcode is just a placeholder here since the returned data isn't trustworthy; we'll overwrite it later.) The command's output is written to $output as an array; every element represents an individual row.
$retcode = intval(array_pop($output)); - parse the last row as an integer. (Since the last command is echo $?, it will always be the actual exit code.)
And that's all you need! Although it's really crude code, and prone to errors if not used correctly, it's perfect for executing simpler tasks, and it will always give you the proper exit code.
For more professional (and programmatic) approach, you have to dig yourself into PHP's pnctl, posix, stream functions, and also Linux pipe handling.
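The same trick is easy to verify directly in the shell: append 'echo $?' and take the last line of output as the exit code.

```shell
# A command that prints something and then fails with status 3;
# 'echo $?' appends the exit code as the final output line.
output=$( sh -c 'echo hi; exit 3' 2>&1; echo $? )
code=$(printf '%s\n' "$output" | tail -n 1)
echo "exit code: $code"
```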

How do you interrupt a process after starting it with exec()?

If I start a process with exec(), how can I later terminate that process, with say pressing/sending the "q" key. Right now, when I execute the process, PHP will hang until it finishes and returns.
function PsExec($commandJob) {
    $command = $commandJob . ' > /dev/null 2>&1 & echo $!';
    exec($command, $op);
    $pid = (int)$op[0];
    if ($pid > 0) return $pid;
    return false;
}
later on...
exec("kill -9 $pid", $output);
If you want your script to exit regardless of whether the exec is done, you can redirect the output to another file:
exec("php dothis.php >> file.log");
I hope that helps you.
You could use pcntl_fork and pcntl_exec to launch your program and posix_kill to terminate it.
If I start a process with exec(), how can I later terminate that process, with say pressing/sending the "q" key.
Instead of using exec, you could look at using proc_open, which requires that you pass in an array specifying three streams -- one for stdin, one for stdout and one for stderr.
This will let you easily feed input into the program while processing output, without blocking just waiting for it to execute. You can later use proc_terminate to viciously murder it, if needed.
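A hedged sketch of that approach ('cat' stands in for the real interactive program):

```php
<?php
// Run a command with proc_open, write to its stdin, then shut it down.
$spec = [
    0 => ['pipe', 'r'],  // child's stdin
    1 => ['pipe', 'w'],  // child's stdout
    2 => ['pipe', 'w'],  // child's stderr
];
$proc = proc_open('cat', $spec, $pipes);
fwrite($pipes[0], "q\n");   // send the "q" keypress
fclose($pipes[0]);          // EOF lets cat exit on its own
$out = stream_get_contents($pipes[1]);
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($proc);          // or proc_terminate($proc) to kill it
echo $out;
```

With a program that does not exit on EOF, proc_terminate() is the equivalent of the kill seen earlier.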

Continually running PHP script using Bash

I have a long running PHP script that has a memory leak which causes it to fail part way through. The script uses a 3rd party library and I haven't been able to find the source of the leak.
What I would like to do is create a bash script that continually runs the PHP script, processing 1000 records at a time, until the script returns an exit code saying it has finished processing all records. I figure this should help me get around the memory leak because the script would run for 1000 records, exit, and then a new process would be started for another 1000 records.
I'm not that familiar with Bash. Is this possible to do? How do I get the output from the PHP script?
In pseudocode, I was thinking of something along the lines of:
do:
code = exec('.../script.php')
# PHP script would print 0 if all records are processed or 1 if there is more to do
while (code != 0)
$? gives you the exit code of a program in bash
You can do something like
while /bin/true; do
    php script.php
    if [ $? != 0 ]; then
        echo "Error!";
        exit 1;
    fi
done
You can probably even do:
while php script.php; do
    echo "script returned success"
done
Do you have to use bash? You could probably do this with PHP:
while (true) {
    $output = exec('php otherscript.php', $out, $ret);
}
The $ret variable will contain the exit code of the script.
Use a simple until loop to automatically test the exit status of the PHP script.
#!/bin/sh
until script.php
do
    :
done
The colon is merely a null operator, since you don't actually want to do anything else in the loop.
until will execute the command script.php repeatedly until it returns zero (i.e. success). If the script returned 0 to indicate "not done" instead of 1, you could use while instead of until.
The output of the PHP script would go to standard out and standard error, so you could wrap the invocation of the shell script with some I/O re-direction to stash the output in a file. For example, if the script is called loop.sh, you'd just run:
./loop.sh > output.txt
though of course you can control the output file directly in the PHP script; you just have to remember to open the file for append.
You might want to ask a separate question about how to debug the PHP memory leak though :-)
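A self-contained way to see the until loop in action (a countdown variable stands in for "records left to process"):

```shell
# The worker "processes a batch" per call and succeeds only when done.
left=3
work() {
    left=$((left - 1))    # pretend to process 1000 records
    [ "$left" -eq 0 ]     # exit 0 only when everything is processed
}
runs=0
until work; do
    runs=$((runs + 1))    # loop body runs once per failed attempt
done
echo "extra runs: $runs"
```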
You can write:
#!/bin/bash
/usr/bin/php prg.php # run the script.
while [ $? != 0 ]; do # if ret val is non-zero => err occurred. So rerun.
/usr/bin/php prg.php
done
Implemented solution in PHP instead as:
do {
    $code = 1;
    $output = array();
    $file = realpath(dirname(__FILE__)) . "/script.php";
    exec("/usr/bin/php {$file}", $output, $code);
    $error = false;
    foreach ($output as $line) {
        if (stripos($line, 'error') !== false) {
            $error = true;
        }
        echo $line . "\n";
    }
} while ($code != 0 && !$error);

PHP exec() return value for background process (linux)

Using PHP on Linux, I'd like to determine whether a shell command run using exec() was successfully executed. I'm using the return_var parameter to check for a successful return value of 0. This works fine until I need to do the same thing for a process that has to run in the background. For example, in the following command $result returns 0:
exec('badcommand > /dev/null 2>&1 &', $output, $result);
I have put the redirect in there on purpose, I do not want to capture any output. I just want to know that the command was executed successfully. Is that possible to do?
Thanks, Brian
My guess is that what you are trying to do is not directly possible. By backgrounding the process, you are letting your PHP script continue (and potentially exit) before a result exists.
A work around is to have a second PHP (or Bash/etc) script that just does the command execution and writes the result to a temp file.
The main script would be something like:
$resultFile = '/tmp/result001';
touch($resultFile);
exec('php command_runner.php ' . escapeshellarg($resultFile) . ' > /dev/null 2>&1 &');
// do other stuff...
// Sometime later when you want to check the result...
while (!strlen(file_get_contents($resultFile))) {
    sleep(5);
}
$result = intval(file_get_contents($resultFile));
unlink($resultFile);
And the command_runner.php would look like:
$outputFile = $argv[1]; // $argv[0] is the script name itself
exec('badcommand > /dev/null 2>&1', $output, $result);
file_put_contents($outputFile, $result);
It's not pretty, and there is certainly room for adding robustness and handling concurrent executions, but the general idea should work.
Not with the exec() method. When you send a process to the background, it will return 0 to the exec call and PHP will continue execution; there's no way to retrieve the final result.
pcntl_fork() however will fork your application, so you can run exec() in the child process and leave it waiting until it finishes. Then exit() with the status the exec call returned.
In the parent process you can access that return code with pcntl_waitpid()
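A minimal sketch of that fork-and-wait pattern (requires the pcntl extension, CLI only; 'true' stands in for badcommand):

```php
<?php
$pid = pcntl_fork();
if ($pid === 0) {
    // child: run the command synchronously and exit with its status
    exec('true', $out, $code);
    exit($code);
}
// parent: block until the child finishes, then decode its exit status
pcntl_waitpid($pid, $status);
$exitCode = pcntl_wexitstatus($status);
echo "child exited with $exitCode\n";
```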
Just my 2 cents: how about using the || or && bash operators?
exec('ls && touch /tmp/res_ok || touch /tmp/res_bad');
And then check for file existence.
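The same marker-file idea, runnable end to end (paths are illustrative):

```shell
# Run the command in the background, marking success or failure with
# a file, then check which marker exists.
ok=/tmp/res_ok.$$; bad=/tmp/res_bad.$$
rm -f "$ok" "$bad"
( true && touch "$ok" || touch "$bad" ) &
wait
if [ -f "$ok" ]; then result=ok; else result=bad; fi
rm -f "$ok" "$bad"
echo "$result"
```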
