I'm trying to run my server from a PHP script as a background process, but it hangs the PHP script anyway. I call it like this:
$exec_result = exec('./myapp option1 option2 &> /dev/null &');
I tried the suggestions from "PHP hanging while exec() bash script", like adding "set -m && " or "shopt -u checkjobs && ", but that doesn't help. I also tried calling, via exec(), my C++ utility that runs a command in the background (it basically just calls std::system with " &" appended), but that didn't help either. Using "nohup" doesn't change anything. The problem isn't my server, either: the same thing happens when I call the "sleep" command.
Calling exactly the same command from bash runs the process in the background as expected. Honestly, I'm confused and frustrated. What am I doing wrong? Does PHP need some kind of permission to run a background task? I'm fairly new to Linux.
I'm on Debian 10 with PHP 7.3, if it matters.
I've managed to fix it, but I have no idea why the new solution works while the old one doesn't. Maybe it has something to do with how exec() parses the command? Both lines behave identically in bash, so I'm blaming PHP for this.
So, I've replaced
$exec_result = exec('./myapp option1 option2 &> /dev/null &');
with
$exec_result = exec('./myapp option1 option2 > /dev/null 2>&1 &');
and that did it. I've checked it back and forth multiple times: the second line works consistently while the first one fails every time.
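A likely explanation, assuming PHP's exec() hands the command line to /bin/sh (which on Debian is dash, not bash): "&>" is a bash extension, and a POSIX shell parses "cmd &> file &" as "cmd &" followed by "> file &", so the backgrounded command keeps the stdout pipe that exec() is waiting on. The portable form redirects before backgrounding, in any shell:

```shell
# Portable redirection + background: no output pipe is left open, so the
# invoking process (here this script; in PHP, exec()) gets control back at once.
start=$(date +%s)
sh -c 'sleep 3 > /dev/null 2>&1 &'
end=$(date +%s)
echo "elapsed: $((end - start))s"   # well under the 3s the child sleeps
```

Since `> /dev/null 2>&1` is understood identically by sh and bash, it is the safer spelling for anything passed through exec().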
OK, so I'm on Ubuntu with LAMPP. I want to execute a command in the background that will take at least 10-15 minutes to run, and I want to execute it from a PHP script through the web interface, not the CLI.
First Instance:
When I do this, the PHP script successfully runs the command in the background:
command1 &> /dev/null &
Second Instance:
But when I do this:
command1 &> /dev/null && command2 &
then the command does not run in the background; the PHP script pauses until this command finishes. I want command2 to execute after command1 has completed, so that I (or my PHP script) can know that command1 has run. But it should be in the background, otherwise my PHP script doesn't finish on time.
When I run the second instance from the command line, it runs in the background, but when I run it from the PHP script it does not.
I am using PHP's exec function to execute those commands.
<?php
exec('command1 &> /dev/null && command2 &',$out,$ret);
?>
Any help would be appreciated.
Try this:
<?php
exec('(command1 && command2) > /dev/null &', $out, $ret);
What it's doing is launching your commands in a subshell, so that command1 runs first and command2 only runs after command1 completes successfully; it then redirects all the output to /dev/null and runs the whole thing in the background.
If you want to run command2 regardless of the exit code of command1 use ; instead of &&.
The computer is doing what you're instructing it to do. The && list operator tells the shell to run the second command only if the first command completes successfully. Instead, you should run two independent commands; if there really is a dependency between the two commands, then you may need to reconsider your approach.
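The subshell grouping from the answer above can be sketched with ordinary echo commands standing in for command1 and command2 (the names in the thread are placeholders):

```shell
# The parentheses run both commands in one subshell; the "&&" sequencing and
# the redirection apply to the group, and the single trailing "&" backgrounds
# the whole group rather than just the last command.
tmp=$(mktemp)
( echo step1 && echo step2 ) > "$tmp" 2>&1 &
wait            # only for this demo; the PHP script would return immediately
cat "$tmp"      # step1, then step2
rm -f "$tmp"
```

Without the parentheses, "command1 && command2 &" backgrounds only command2, while command1 still runs in the foreground and keeps exec() waiting.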
I would like to run a PHP script in the background, as it takes a long time and I don't want to wait for it.
This question relates to:
php execute a background process
I tried to make this work on my Linux CentOS machine, but unfortunately it doesn't seem to be working.
This is my code:
$cmd = 'php '.realpath($_SERVER["DOCUMENT_ROOT"]).'/test/sample.php';
exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outputfile, $pidfile));
exec(sprintf("%s > %s 2>&1 & echo $!", $cmd, $outputfile), $pidArr);
sleep(2);
print_r($outputfile);
print_r($pidfile);
I included the sleep(2) to make sure the sample.php script has finished. sample.php only contains echo 'Hello world';. I tried both exec options, but neither of them worked; the script above doesn't show any output.
I ran the same $cmd on the Linux command line and it showed me the output from the sample.php script.
I would like to get this running first, but I would also like to pass variables to the sample.php script.
Your help is very appreciated.
You can use set_time_limit(0) to allow the script to run indefinitely, and ignore_user_abort() to keep the script running in the background even if the browser or tab has been closed. That way, if you redirect your browser somewhere else, the script will keep running in the background.
This approach to running PHP scripts in the background seems a bit awkward. And depending on whether you pass additional parameters based on user input, it could also create a scenario where an attacker injects additional commands.
For use cases like this you could use cron, if you don't need a response immediately, or a job manager like Gearman, where a Gearman server manages the communication between the web request and the background job.
I made some adjustments:
$cmd = 'php '.realpath($_SERVER["DOCUMENT_ROOT"]).'/test/sample.php';
$outputfile = 'ouput.txt';
$pidfile = 'pid.txt';
exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outputfile, $pidfile));
exec(sprintf("%s > %s 2>&1 & echo $!", $cmd, $outputfile), $pidArr);
sleep(2);
print_r(file_get_contents($outputfile));
print_r(file_get_contents($pidfile));
I'm not sure if you defined the variables before. I tested it on Windows and it works.
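The command line the sprintf calls above are building can be checked on its own, with a stand-in command in place of the php invocation (the two file names are just the capture targets from the question):

```shell
outputfile=$(mktemp)
pidfile=$(mktemp)
# Background the command, capture stdout+stderr in one file, and record the
# background job's PID in the other via "$!".
sh -c 'echo "Hello world"' > "$outputfile" 2>&1 & echo $! > "$pidfile"
wait                   # demo only; exec() would already have returned here
cat "$outputfile"      # Hello world
cat "$pidfile"         # the numeric PID of the background job
rm -f "$outputfile" "$pidfile"
```

The key detail is that `echo $!` runs in the foreground after the `&`, so the PID file is written immediately while the job itself keeps running.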
I've never used system commands, so I need a little help running a PHP background process:
Can I run an exec() command on a shared server environment?
Is it possible not to wait for it to finish in the file I am running it from?
Can I use parameters to pass to this file?
It depends on the server configuration.
Maybe if you use screen.
You can use parameters as you would run the command from command line.
http://php.net/manual/en/function.exec.php
1. Try it and find out.
2. Use /dev/null.
3. exec('php -f '.ROOT_PATH.'/index.php '.$this->param.' '.$param.' '.$param.' "'.preg_replace('/"/', '\"', serialize($array))."\" > /dev/null &");
I am using phpseclib to SSH to my server and run a Python script. The Python script is an infinite loop, so it runs until you stop it. When I execute python script.py over SSH with phpseclib, it works, but the page just loads forever. It does this because phpseclib doesn't consider the line that runs the infinite-loop script to be "done", so it hangs on that line. I have tried using exit and die after that line, but of course that didn't work, because it hangs on the line before, the one that executes the command. Does anyone have any ideas on how I can fix this without modifying the Python file? Thanks.
Assuming the command will be run by a shell, you could have it execute this to start it:
nohup python myscript.py > /dev/null 2>&1 &
If you put an & at the end of any shell command it will run in the background and return immediately; that's all you really need.
Something else you could also do:
$ssh->setTimeout(1);
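The nohup line above can be checked locally, with sleep standing in for the long-running Python script:

```shell
# nohup + redirection + "&": the launching shell returns immediately, and the
# child keeps running detached, so an SSH exec call can complete right away.
nohup sleep 5 > /dev/null 2>&1 &
pid=$!
kill -0 "$pid" && echo "still running"   # the child outlives the launch line
kill "$pid"                              # clean up for the demo
```

With the output redirected, nohup also skips creating its default nohup.out file, so nothing is left holding the SSH channel open.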
I need to run a command in PHP like this:
exec('dosomething > saveit.txt');
Except I don't want PHP to wait for it to complete. I also don't want to throw away the output, and I don't want to use nohup because I'm already using that for something else in the same directory.
I also tried pclose(popen('dosomething > saveit.txt', 'r')); and that didn't work; it still waited.
Add an ampersand to the end of the command, so:
exec('dosomething > saveit.txt &');
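A sketch of that behavior, with a slow compound command standing in for dosomething: the caller gets control back immediately, and the output still lands in the file once the job finishes:

```shell
out=$(mktemp)
( sleep 1; echo "result" ) > "$out" &   # "&" backgrounds it; "> file" keeps the output
# ... the caller is free to do other work here ...
wait                                    # demo only: let the background job finish
cat "$out"                              # result
rm -f "$out"
```

This keeps the output (unlike > /dev/null) and avoids nohup entirely; because stdout goes to the file rather than back to exec(), PHP has no pipe to wait on.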
In the documentation of exec() there is an interesting comment that says:
It took me quite some time to figure out the line I am going to post next. If you want to execute a command in the background without having the script wait for the result, you can do the following:
<?php
passthru("/usr/bin/php /path/to/script.php ".$argv_parameter." >> /path/to/log_file.log 2>&1 &");
?>