I have a PHP script that executes an external bash script to make an SSH connection, but even though I am using ssh's move-to-background option (-f) as well as an '&', my PHP script hangs.
Problem line in the PHP script:
system('/usr/local/bin/startProxy 172.16.0.5 9051');
I have also tried:
system('/usr/local/bin/startProxy 172.16.0.5 9051 &');
And the startProxy script is simply:
#!/bin/bash
#
# startProxy <IP_Address> <Proxy_Port>
#
# Starts an ssh proxy connection using -D <port> to remote system
#
ssh -o ConnectTimeout=5 -f -N -D $2 $1 &
Calling the startProxy script from the command line works fine, and the script returns immediately.
When the system() command is run, the called script does run (I can see ssh running via ps); it just never appears to return control to the PHP script.
I have run into the same issue when calling ssh directly via the system() method as well.
Thanks @Martin!
For future self and others, I had to change the system call to:
system('/usr/local/bin/startProxy 172.16.0.5 9051 2>&1 >/dev/null');
and to change the startProxy script to :
ssh -o ConnectTimeout=5 -f -N -D $2 $1 2>&1 >/dev/null
before PHP would continue to the rest of the script!
Can anyone explain why PHP stops like this if output is not redirected (I presume PHP isn't scanning the command and seeing the redirection part), and also why PHP hangs if I don't include the redirection in the ssh command within startProxy, despite the fact that both the PHP system call and the ssh command in startProxy were using background options (-f and '&')?
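For anyone hitting the same thing: system() (like exec() and shell_exec()) does not just wait for the shell to exit, it keeps reading the command's output stream until end-of-file. A backgrounded child such as ssh inherits the open stdout/stderr descriptors, so EOF never arrives and PHP keeps waiting, no matter how many '&' and -f flags were used. Redirecting both streams away from PHP is what breaks that link. A minimal sketch using the same values as above (note the order: >/dev/null 2>&1 sends both streams to /dev/null, whereas 2>&1 >/dev/null leaves stderr pointed at the original stdout):
// Both output streams detached from PHP; the shell backgrounds the job.
system('/usr/local/bin/startProxy 172.16.0.5 9051 >/dev/null 2>&1 &');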
Related
I'm running a PHP socket program. I start it through nohup, and running it this way as root works properly. My problem is running the program via the exec() function in PHP. When I run the command that way, the program runs correctly, but its output is not written to nohup.out.
My command over SSH:
nohup php my_path/example.php & # is working
My command from PHP (running as the web user):
exec('nohup php my_path/example.php >/dev/null 2>&1 &', $output); # does not update nohup.out
Please guide me.
From the PHP docs on exec():
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
From man nohup:
If standard input is a terminal, redirect it from /dev/null. If standard output is a terminal, append output to 'nohup.out' if possible, '$HOME/nohup.out' otherwise. If standard error is a terminal, redirect it to standard output. To save output to FILE, use 'nohup COMMAND > FILE'.
To satisfy both, redirect manually to nohup.out:
exec('nohup php my_path/example.php >>nohup.out 2>&1 &', $output);
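To verify, tail the file while the listener runs (note that with this redirection the file is created in the working directory of the PHP process, not necessarily in $HOME):
tail -f nohup.out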
I have some PHP files. Each of them starts a socket listener or runs an infinite loop, so the script blocks the shell when executed via the php command:
php sock_listener.php ... halts there
php listener2.php ... halts there
...
Currently I use the screen command to start all the listener PHP files every time the machine is rebooted. Is there a way I can start all the listener PHP files in a single shell line, so that I can write a shell script to make it easier to use?
Using screen
Create a detached screen session for the first script:
session='php-test'
screen -S "$session" -d -m -t A php a.php
where -d -m combination causes screen to create a detached session.
Run the rest of the scripts in the same session in separate windows:
screen -S "$session" -X screen -t B php b.php
screen -S "$session" -X screen -t C php c.php
where
-X sends the built-in screen command to the running session;
-t sets the window title.
The session will be available in the output of the screen -ls command:
There is a screen on:
8951.php-test (Detached)
Connect to the session using the -r option, e.g.:
screen -r 8951.php-test
List the windows within the screen session with the Ctrl-a " shortcut, or the windowlist -b command.
Forking Processes to Background
A less convenient way is to send the commands to the background by appending an ampersand to the end of each command:
nohup php a.php 2>a.php.err >a.php.out &
nohup php b.php 2>b.php.err >b.php.out &
nohup php c.php 2>c.php.err >c.php.out &
where
nohup prevents termination of the commands if the user logs out of the shell;
2>a.php.err redirects the standard error to a.php.err file;
>a.php.out redirects the standard output to a.php.out file.
Is there a way I can start all the listener PHP files in single shell line so that I can write a shell script to make it easier to use?
You can put the above-mentioned commands into a shell script file, e.g.:
#!/bin/bash -
# Put the commands here
make it executable:
chmod +x /path/to/script
and call it when you need it:
/path/to/script
Modify the shebang as appropriate.
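Putting the screen commands from above together, the whole launcher might look like this (a sketch using the placeholder script names from the examples):
#!/bin/bash -
# Start all PHP listeners in one detached screen session.
session='php-test'
screen -S "$session" -d -m -t A php a.php
screen -S "$session" -X screen -t B php b.php
screen -S "$session" -X screen -t C php c.php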
Just run them under Circus. Circus will let you define a number of processes and how many instances of each you want to run, and it just keeps them running.
https://circus.readthedocs.io/en/latest/
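A minimal configuration sketch based on the Circus documentation (the watcher names and file names here are just taken from the question, not tested):
[watcher:sock_listener]
cmd = php sock_listener.php
numprocesses = 1

[watcher:listener2]
cmd = php listener2.php
numprocesses = 1
Then start everything with circusd /path/to/circus.ini.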
I have a bash script that takes a parameter and is called from PHP via shell_exec('script.sh parameter'). Basically, my goal is to call a script that is owned by another user that is not apache.
The script.sh file contains the following (right now there are some debugging commands):
#!/bin/bash
whoami >> whoami          # debug: record which user runs the script
echo $1 >> parameter      # debug: record the parameter passed in
while read f; do
    env >> envoutput      # debug: record the environment
    sudo -i -u archivescriptowner /path/to/archivescript.sh -command archive >> output
done < $1
In my /etc/sudoers file , I have the following:
apache ALL=(archivescriptowner) NOPASSWD: /bin/bash -c /path/to/archivescript.sh *
When I run this script manually (via su -s /bin/bash apache) and pass a parameter, it works.
When I run it via my button in PHP, archivescript.sh does not execute:
The whoami file has apache written to it.
The parameter file has the right file written to it.
env shows the following:
Term=xterm
LD_LIBRARY_PATH=/path/to/library
PATH=/sbin/:usr/sbin:/bin:/usr/bin
PWD=/var/www/html
LANG=C
SHLVL=4
=/bin/env
PWD is outputting correctly; that is where my script is right now (it will be moved in the future).
The output file is blank when the script is run by the button click.
I am at a loss as to why this is not working. Any insight would be helpful. Please let me know if I need to give any additional information.
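One educated guess based on how sudo matches commands (not confirmed against the poster's setup): the NOPASSWD rule only applies when the command sudo is asked to run matches the sudoers entry. Here the script invokes /path/to/archivescript.sh directly, while the rule permits /bin/bash -c /path/to/archivescript.sh *, so the rule may never match; sudo then wants a password, and with no TTY under apache it fails silently, which would leave the output file empty. A sketch of a matching pair (drop -i unless the owner's login environment is required, since -i changes what sudo matches against):
# /etc/sudoers: list the script itself, with the arguments it is called with
apache ALL=(archivescriptowner) NOPASSWD: /path/to/archivescript.sh -command archive
# script.sh: invoke it the same way, so the rule matches
sudo -u archivescriptowner /path/to/archivescript.sh -command archive >> output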
I recently published a project that allows PHP to obtain and interact with a real Bash shell. Get it here: https://github.com/merlinthemagic/MTS
After downloading, you would simply use the following code:
$shell = \MTS\Factories::getDevices()->getLocalHost()->getShell('bash', true);
$return1 = $shell->exeCmd('/path/to/archivescript.sh');
echo $return1; //return from your script
I'm using PHP's shell_exec() to call a bash script, and I've identified the line that it is hanging on.
That line uses winexe, and it is this (sensitive values removed, obviously):
result=`${LOCATION}/bin/winexe -U "user%password" //gateway "g:\\folder\\myscript.bat $1 $2"`
If I call this script from the terminal, it works perfectly fine, but if I call it from PHP, the web server hangs and doesn't come back to normal until I kill the processes using:
fuser -k -n tcp 80
Solved via:
https://stackoverflow.com/a/6016750/270302
I basically used proc_open() instead of shell_exec().
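For reference, the general shape of the proc_open() approach looks like this (a minimal sketch; the command string and descriptor layout are illustrative, not the poster's exact code):
$descriptors = array(
    0 => array('pipe', 'r'),   // child's stdin
    1 => array('pipe', 'w'),   // child's stdout
    2 => array('pipe', 'w'),   // child's stderr
);
$proc = proc_open('/path/to/wrapper.sh arg1 arg2', $descriptors, $pipes);
if (is_resource($proc)) {
    fclose($pipes[0]);                       // nothing to send on stdin
    $out = stream_get_contents($pipes[1]);   // read stdout ourselves
    $err = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $code = proc_close($proc);               // reap the child, get exit code
}
The difference from shell_exec() is that you control the descriptors yourself, so you can close them or set them non-blocking instead of waiting on output that never ends.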
I need to write a bash script that wraps a PHP script.
I have some variables that need to be forwarded to the PHP script, and some variables that I need internally for the bash script itself.
The call to the shell script should look like this, but the PHP file can have more params, so it needs to be generic:
bash /tmp/test.sh -c -l /tmp/aaa -php aaa.php -d -p 3 -f 2012-10-23
The -php option is mandatory because it contains the PHP file that needs to be called. The -c and -l options are optional flags used internally by the bash script. Everything after aaa.php are params for the PHP file.
bash /tmp/test.sh -c -l /tmp/aaa -php "aaa.php -d -p 3 -f 2012-10-23" ?
I don't know how you are parsing arguments inside the bash script, but basically everything wrapped in quotes is treated as one argument by the bash script; passed further to the PHP interpreter, it is seen as separate parameters, unless of course you wrap it again.
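For what it's worth, a sketch of what test.sh could look like under the convention in the question (the parsing logic here is an assumption, just one way to do it; with -php acting as a separator you don't even need the extra quotes):
#!/bin/bash
# -c and -l are consumed by the wrapper; -php marks where the
# PHP script and its own parameters begin.
use_c=0
l_path=''
while [ $# -gt 0 ]; do
    case "$1" in
        -c)   use_c=1; shift ;;
        -l)   l_path="$2"; shift 2 ;;
        -php) shift; break ;;       # the rest belongs to PHP
        *)    echo "unknown option: $1" >&2; exit 1 ;;
    esac
done
# "$@" is now: aaa.php -d -p 3 -f 2012-10-23
php "$@"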