I'm trying to terminate a shell script using PHP code.
I have created a shell script foo.sh which calls a PHP file
#! /bin/bash
cd /var/www/html/test
php test.php
The following is test.php:
<?php
exec('exit 0');
?>
For some reason the shell script is not exiting.
exit(0) in PHP terminates the PHP process itself, not the parent process (if any); your exec('exit 0') only exits the short-lived shell that exec() spawns.
To terminate your bash script, you will need to find its PID via ps and grep, or, alternatively, use killall from PHP:
system('killall foo.sh');
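killall matches by name and kills every process with that name; a more targeted alternative is for the child to signal its parent directly. Below is a minimal shell sketch of the idea (in PHP the equivalent would be posix_kill(posix_getppid(), 15), assuming the posix extension is loaded):

```shell
# Sketch: a wrapper script (standing in for foo.sh) whose child process
# kills the parent via $PPID, so the line after the child never runs.
cat > /tmp/parent_demo.sh <<'EOF'
#!/bin/bash
bash -c 'kill $PPID'   # the child signals the script that launched it
echo "never reached"
EOF
bash /tmp/parent_demo.sh
echo "parent exit status: $?"
```

The parent dies on SIGTERM before reaching its echo, so only the exit-status line is printed.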
I'm running a PHP socket server. I start it with nohup, and it runs properly when launched as root. My problem is launching it via PHP's exec() function: the program runs correctly, but its output is no longer written to nohup.out.
My command over SSH:
nohup php my_path/example.php & # works
My command from PHP:
exec('nohup php my_path/example.php >/dev/null 2>&1 &', $output); # does not update nohup.out
Please guide me.
From PHP docs on exec:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
From man nohup:
If standard input is a terminal, redirect it from /dev/null. If standard output is a terminal, append output to 'nohup.out' if possible, '$HOME/nohup.out' otherwise. If standard error is a terminal, redirect it to standard output. To save output to FILE, use 'nohup COMMAND > FILE'.
To satisfy both, redirect the output manually to nohup.out:
exec('nohup php my_path/example.php >>nohup.out 2>&1 &', $output);
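The key point is that nohup only picks nohup.out automatically when stdout is a terminal; under exec() there is no terminal, so the redirect must be spelled out. A self-contained sketch, using sh -c 'echo …' as a stand-in for the PHP worker:

```shell
# nohup does not choose nohup.out itself here, because stdout is not a
# terminal -- the explicit >>nohup.out append does the job instead
rm -f nohup.out
nohup sh -c 'echo "background output"' >>nohup.out 2>&1 &
wait                      # let the background job finish
cat nohup.out
```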
I have a PHP script that executes an external bash script to make an SSH connection, but even though I am using ssh's move-to-background option (-f) as well as an '&', my PHP script hangs.
The problem line in the PHP script:
system('/usr/local/bin/startProxy 172.16.0.5 9051');
I have also tried :
system('/usr/local/bin/startProxy 172.16.0.5 9051 &');
And the startProxy script is simply :
#!/bin/bash
#
# startProxy <IP_Address> <Proxy_Port>
#
# Starts an ssh proxy connection using -D <port> to remote system
#
ssh -o ConnectTimeout=5 -f -N -D $2 $1 &
Calling the startProxy script from the command line works fine, and the script returns immediately.
When the system() command is run, the called script does run (I can see ssh running via ps), it just never appears to return to the PHP script.
I have run into the same issue when trying to call ssh directly via the system() method as well.
Thanks @Martin,
For future self and others, I had to change the system call to:
system('/usr/local/bin/startProxy 172.16.0.5 9051 2>&1 >/dev/null');
and to change the startProxy script to :
ssh -o ConnectTimeout=5 -f -N -D $2 $1 2>&1 >/dev/null
before PHP returned control to the rest of the script!
Can anyone explain why PHP stops like this if output is not redirected (I presume PHP isn't scanning the command and spotting the redirection part), and also why PHP hangs if I don't include the redirection in the ssh command within startProxy, despite the fact that both the PHP system() call and the ssh command in startProxy were using background options (-f and '&')?
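The hang is not about PHP parsing the command; it is the pipe. system() reads the command's stdout until end-of-file, and a backgrounded child that inherits that pipe (including an ssh put in the background by -f) keeps the write end open after the shell returns. The same effect can be reproduced in pure bash with command substitution, using sleep as a stand-in for ssh:

```shell
start=$(date +%s)
# the subshell exits at once, but the backgrounded sleep inherits the
# captured stdout and keeps the pipe open, so this blocks ~2 seconds
out=$( (sleep 2 &) )
mid=$(date +%s)
# with stdout redirected away, nothing holds the pipe: returns immediately
out=$( (sleep 2 >/dev/null 2>&1 &) )
end=$(date +%s)
echo "blocked for $((mid - start))s, then $((end - mid))s"
```

This is why the redirect was needed inside startProxy as well: the -f'd ssh still inherited the script's stdout, which was PHP's pipe.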
I have a series of shell commands I want to put in a program and execute the program from the command line. I decided to use PHP for this, so currently I am trying to get the most basic shell commands to run.
Save as build.php
<?php
shell_exec('cd ..');
echo "php executed\n";
?>
From the command-line
php build.php
Output
php executed
PHP executes correctly, but I'm still in the same directory. How do I get shell_exec( ... ) to successfully call a shell command?
You need to change the cwd (current working directory) in PHP... any cd command you execute via exec() and its sister functions affects ONLY the shell that the exec() call invokes, plus anything else you run inside that same shell.
<?php
$olddir = getcwd();
chdir('/path/to/new/dir'); //change to new dir
exec('somecommand');
will execute somecommand in /path/to/new/dir. If you do
<?php
exec('cd /path/to/new/dir');
exec('somecommand');
somecommand will be executed in whatever directory you started the PHP script from - the cd you exec'd just one line ago will have ceased to exist and is essentially a null operation.
Note that if you did something like:
<?php
exec('cd /path/to/new/dir ; somecommand');
then your command WOULD be executed in that directory... but again, once that shell exits, the directory change ceases to exist.
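The same rule is easy to verify from the shell itself: every child process gets its own working directory, and a cd made in a child vanishes when that child exits. A minimal sketch:

```shell
# each process has its own working directory; a cd in a subshell
# (a child process, like the shell exec() spawns) dies with it
before=$(pwd)
( cd /tmp )            # subshell: the directory change ends here
after=$(pwd)
[ "$before" = "$after" ] && echo "directory unchanged"
```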
I am trying to invoke an app from PHP:
ob_start();
passthru("(cd /opt/server/TrackServer/release && ./TrackServer& ) && pidof TrackServer");
$pid = ob_get_clean();
var_dump($pid);
exit;
The goal is to run TrackServer within its own path and to get its process ID so I can close it after I run some tests.
When I run the command in terminal:
(cd /opt/server/TrackServer/release && nohup ./TrackServer&) && pidof TrackServer
I get the correct PID back, but in PHP the command stops and doesn't go further. TrackServer is started and running, but I have to kill it from the terminal to unblock the PHP script; after killing the process, the PHP script prints the correct PID of the process I just closed from the terminal.
Why does the command stop?
Is there a way to make it run in PHP the way I'm trying to run it (without forking to a new thread)?
From the passthru manual page: The passthru() function is similar to the exec() function in that it executes a command.
What this means is that you can't execute your command line directly, as this runs several commands and relies on the shell to implement backgrounding and subshells as needed.
Try this instead:
passthru("/bin/bash -c 'cd /opt/server/TrackServer/release && nohup ./TrackServer & pidof TrackServer'");
EDIT:
I found a working solution:
ob_start();
passthru("/bin/bash -c 'cd /opt/server/TrackServer/release && nohup ./TrackServer&' > /dev/null 2>&1 &");
passthru("pidof TrackServer");
$pid = ob_get_clean();
The command was stopping because:
Run multiple exec commands at once (But wait for the last one to finish)
PHP's exec function will always wait for a response from your execution. However, you can send the stdout & stderr of the process to /dev/null (on unix) and have all the scripts executed almost instantly.
I have 5 PHP scripts that need to be executed one after the other. How can I create a single PHP/batch script which executes all the PHP scripts in sequence?
I want to schedule a cron job which runs that file.
#!/path/to/your/php
<?php
include("script1.php");
include("script2.php");
include("script3.php");
//...
?>
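That wrapper can then be scheduled directly from cron; a crontab entry sketch (the paths are placeholders) running it daily at 03:00:

```
# m h dom mon dow   command
0 3 * * * /path/to/your/php /path/to/your/wrapper.php >/dev/null 2>&1
```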
Alternative
#!/bin/bash
/path/to/your/php /path/to/your/script1.php
/path/to/your/php /path/to/your/script2.php
/path/to/your/php /path/to/your/script3.php
# ...
If your scripts need to be accessed via http:
#!/bin/bash
wget --quiet -O /dev/null http://path/to/your/script1.php
wget --quiet -O /dev/null http://path/to/your/script2.php
wget --quiet -O /dev/null http://path/to/your/script3.php
# ...
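If each script depends on the previous one succeeding, the same wrapper style can chain them with &&, which stops at the first non-zero exit status. A sketch using a stub run function in place of the real scripts:

```shell
# && only runs the next command when the previous one exited with 0
run() { echo "running $1"; return "$2"; }
run script1 0 && run script2 1 && run script3 0
echo "chain stopped after the failure of script2"
```

Here script3 is never run, because script2 exited non-zero.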
I did something for testing using the wait command; it seems I just put wait between the calls to each script. As a little test I created databases in PHP scripts, returned some records in a bash script, updated with another PHP script, then ran another bash script to return the updated results, and it seemed to work fine.
From what I have read, as long as the subscript doesn't call another subscript, the master script will wait if the wait command is used between script calls.
Code is as below:
#!/bin/sh
/usr/bin/php test.php
wait
/bin/bash test.sh
wait
/usr/bin/php test2.php
wait
/bin/bash test2.sh
wait
echo all done
Hopefully this executes all the PHP scripts in sequence.
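Note that wait only makes a difference when the preceding commands were backgrounded with &; foreground commands, as in the script above, already run strictly in sequence. A sketch showing where wait does matter:

```shell
#!/bin/sh
# two jobs started in the background run concurrently;
# wait blocks until every background job of this shell has finished
(sleep 1; echo "job A done") &
(sleep 1; echo "job B done") &
wait
echo "all done"
```

Without the wait, "all done" could print before either job finished.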