Is there a way to end this background process executed by this code?
$process = exec('nohup java -jar /selenium-server-standalone-3.8.0.jar > /dev/null &');
I am simply running a jar file which is a server. I need to stop this server after some code is executed. Is there a way to end it? Something like
end($process);
die($process);
To launch your background process, use a command like this:
exec('nohup java -jar "/selenium-server-standalone-3.8.0.jar" > /dev/null 2>&1 & echo $!', $output);
$pid = (int) $output[0];
This redirects stdout to /dev/null and stderr to stdout so that the process can actually continue in the background. The trailing echo $! prints the process ID of the new background process; exec() captures that output into $output, and the PID is saved in $pid.
Then, later, you can stop it with exec('kill '.$pid);
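Putting it together, a minimal sketch of the whole launch/stop cycle might look like this (the jar path is the one from the question; adjust as needed):
<?php
// Start the server detached from PHP and remember its PID.
exec('nohup java -jar "/selenium-server-standalone-3.8.0.jar" > /dev/null 2>&1 & echo $!', $output);
$pid = (int) $output[0];

// ... run the code that needs the Selenium server here ...

// Stop the server once we are done with it.
exec('kill ' . $pid);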
Related
I'm running a PHP socket server. I normally launch the program through nohup, as root, and it runs properly. My problem is running the program via the exec() function in PHP: when I run the command this way the program runs correctly, but its output is not printed to nohup.out.
My command over SSH:
nohup php my_path/example.php & # this works
My command from PHP:
exec('nohup php my_path/example.php >/dev/null 2>&1 &', $output); # does not update nohup.out
Please guide me...
From PHP docs on exec:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
From man nohup:
If standard input is a terminal, redirect it from /dev/null. If standard output is a terminal, append output to 'nohup.out' if possible, '$HOME/nohup.out' otherwise. If standard error is a terminal, redirect it to standard output. To save output to FILE, use 'nohup COMMAND > FILE'.
To satisfy both - redirect manually to nohup.out:
exec('nohup php my_path/example.php >>nohup.out 2>&1 &', $output);
I call a Python script from PHP, but I run the script in the background because I don't want to wait for it to finish, so I use > /dev/null 2>/dev/null & with shell_exec.
This is the code where I call it:
shell_exec("C:\\Python27\\python C:\\xampp\\htdocs\\testing.py > /dev/null 2>/dev/null &");
In the Python script I have a simple file write:
fileName = "temp.txt"
target = open(fileName, 'w')
for x in range(1, 59):
    target.write("test" + str(x) + "\n")
target.close()
When I call the script from PHP with shell_exec("C:\\Python27\\python C:\\xampp\\htdocs\\testing.py"); it works, but the webpage waits for the script to finish and I don't want that, so I called it with > /dev/null 2>/dev/null &, but now the code doesn't run at all.
Any ideas why the code doesn't work when I try to run it in the background?
Thanks!
Later edit:
Changed /dev/null to NUL and it's still not working.
Now I call shell_exec("C:\\Python27\\python C:\\xampp\\htdocs\\testing.py > NUL 2> NUL &"); and it works, but the page still waits for the script to finish.
The answer is:
pclose(popen("start /B ". $cmd, "r"));
This starts the program without opening a cmd window, and PHP will not wait for the program to finish.
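For reference, here is a small sketch of that approach applied to the Python call from this question (the python and script paths are the ones given above; adjust them for your setup):
<?php
// Paths assumed from the question.
$cmd = "C:\\Python27\\python C:\\xampp\\htdocs\\testing.py";
// start /B launches the command detached, so pclose() returns immediately
// and the page does not wait for testing.py to finish.
pclose(popen("start /B " . $cmd, "r"));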
I am attempting to launch sar and have it run forever via a php script. But for whatever reason it never actually launches. I have tried the following:
exec('sar -u 1 > /home/foo/foo.txt &');
exec('sar -o /home/foo/foo -u 1 > /dev/null 2>&1 &');
However it never launches sar. If I just use:
exec('sar -u 1')
It works, but it just hangs the PHP script. My understanding is that if a program is started with the exec function, then in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream.
I will assume you're running this on a *nix platform. To get PHP to run something in the background and not wait for the process to finish, I would recommend two things: first, use nohup; second, redirect the output of the command to /dev/null (the trash).
Example:
<?php
exec('nohup sar -u 1 > /dev/null 2>/dev/null &');
nohup means the process is not sent the "hang up" signal (which kills the process) when the terminal that ran the command closes.
> /dev/null 2>/dev/null & redirects the "normal" and "error" outputs to the black hole that is /dev/null. This means PHP does not have to wait for the output of the command being called.
On another note, if you are using PHP just to call a shell command, you may want to consider other options like Ubuntu's Upstart with no PHP component--if you are using Ubuntu that is.
I have the following exec() command with an & sign at the end so the script runs in the background. However, the script is not running in the background: it times out in the browser after exactly 5.6 minutes, and if I close the browser the script doesn't keep running.
exec("/usr/local/bin/php -q /home/user/somefile.php &");
If I run the script via the command line, it does not time out. My question is: how do I prevent the timeout? How do I run the script in the background using exec so it's not browser dependent? What am I doing wrong and what should I look at?
The exec() function handles the output of the program you execute, so I suggest you redirect that output to /dev/null (a virtual writable file that silently discards everything written to it).
Try running:
exec("/usr/local/bin/php -q /home/gooffers/somefile.php > /dev/null 2>&1 &");
Note: 2>&1 redirects error output to standard output, and > /dev/null redirects standard output to that virtual file.
If you still have difficulties, you can create a script whose only job is to execute other scripts. exec() waits on the process while it is doing its task, but returns as soon as the task is finished; if the executed script just launches another one, that task is very quick and exec() is released just as quickly.
Let's see an implementation. Create an exec.php that contains:
<?php
if (count($argv) == 1)
{
    die('You must give a script to exec...');
}

// Drop this script's own name, keep only the arguments.
array_shift($argv);

$cmd = '/usr/local/bin/php -q';
foreach ($argv as $arg)
{
    $cmd .= " " . escapeshellarg($arg);
}

// Background the real job so exec.php itself returns immediately.
exec("{$cmd} > /dev/null 2>&1 &");
?>
Now, run the following command:
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php > /dev/null 2>&1 &");
If you have arguments, you can give them too :
exec("/usr/local/bin/php -q exec.php /home/gooffers/somefile.php x y z > /dev/null 2>&1 &");
You'll need to use shell_exec() instead:
shell_exec("/usr/local/bin/php -q /home/gooffers/somefile.php &");
That being said, if you have shell access, why don't you install this as a cronjob? I'm not sure why a PHP script is invoking another to run like this.
How do I execute a PHP script from another?
I want to execute 3 PHP scripts from my PHP file without waiting for the 3 scripts to finish. In other words, the 3 PHP files need to be executed all at once (in parallel) instead of one by one (sequentially).
The 3 scripts are in the same folder as my main PHP file (script).
If you do not want to wait for them to finish, run them with any of the following:
exec('php script.php &> /dev/null &');
shell_exec('php script.php &> /dev/null &');
system('php script.php &> /dev/null &');
`php script.php &> /dev/null &`
Any of those should accomplish the job, depending on your PHP configuration. Although they are different functions, their behaviour should be similar, since all output is being redirected to /dev/null and the process is immediately detached.
I use the first solution in a production environment where a client launches a bash SMS-sending script which can take up to 10 minutes to finish; it has never failed.
More info in: http://php.net/exec · http://php.net/shell_exec · http://php.net/system
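For example, firing off the three scripts in parallel could look roughly like this (the script names are placeholders, and the files are assumed to sit next to the calling script):
<?php
// Each exec() returns immediately because output is discarded and the job is backgrounded.
foreach (array('script1.php', 'script2.php', 'script3.php') as $script) {
    exec('php ' . escapeshellarg(__DIR__ . '/' . $script) . ' > /dev/null 2>&1 &');
}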
How about using exec("php yourscript.php")?
Do consider using a queuing system to store your PHP script names, with a worker that fetches jobs from the queue and does the execution, e.g. beanstalkd.
You need to run them as detached jobs, and it is not really easy - or portable. The usual solution is to use nohup or exec the scripts with stdout and stderr redirected to /dev/null (or NUL in Windows), but this often has issues.
If possible, make the three scripts available as scripts on the web server and access them through asynchronous cURL functions. This also has the advantage of letting you test the scripts through the browser and giving you access to the scripts' output.
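A rough sketch of that cURL route, assuming the three scripts are exposed at placeholder URLs on the local web server (curl_multi processes the requests concurrently and gives you each script's output back):
<?php
$urls = array('http://localhost/job1.php', 'http://localhost/job2.php', 'http://localhost/job3.php');
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
// The requests run in parallel; this loop waits until all of them have finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch);   // each script's output
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);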
Other ways include using popen(), or if under Linux, the at or batch utility.
Taken from http://board.phpbuilder.com/showthread.php?10351142-How-can-I-exec%28%29-in-a-non-blocking-fashion:
In order to execute a command and have it not hang your PHP script while it runs, the program you run must not send output back to PHP. To do this, redirect both stdout and stderr to /dev/null, then background it.
> /dev/null 2>&1 &
In order to execute a command and have it spawned off as another process that is not dependent on the Apache thread to keep running (it will not die if somebody cancels the page), run this:
exec('bash -c "exec nohup setsid your_command > /dev/null 2>&1 &"');
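Wrapped as a tiny helper (the function name is made up; escapeshellarg keeps the inner command intact when it is handed to bash -c):
<?php
// Launch $cmd fully detached from the web server process.
function runDetached($cmd) {
    exec('bash -c ' . escapeshellarg('exec nohup setsid ' . $cmd . ' > /dev/null 2>&1 &'));
}

runDetached('sleep 60');   // placeholder command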
For Windows, from http://www.php.net/manual/en/function.exec.php:
function execInBackground($path, $exe, $args = "") {
    global $conf;
    if (file_exists($path . $exe)) {
        chdir($path);
        if (substr(php_uname(), 0, 7) == "Windows") {
            // start launches the program detached, so popen/pclose return immediately.
            pclose(popen("start \"bla\" \"" . $exe . "\" " . escapeshellarg($args), "r"));
        } else {
            exec("./" . $exe . " " . escapeshellarg($args) . " > /dev/null &");
        }
    }
}
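Usage would then look something like this (paths and program names are placeholders):
execInBackground("C:\\tools\\", "worker.exe", "--once");   // Windows
execInBackground("/opt/tools/", "worker", "--once");       // *nix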
In the shell_exec() call, add the /B parameter to start; this allows you to run several executables at once.
See my answer at this question: PHP on a windows machine; Start process in background
It's the same.
shell_exec('start /B "C:\Path\to\program.exe"');
The /B parameter is key here. I tried to find the topic for you again, but I can't seem to find it anymore. This works for me.
I hope this solves the problem for you.