Calling bash script from PHP fails, but it works locally

I'm using PHP's shell_exec() to call a bash script, and I've identified the line it hangs on. That line uses winexe; here it is (sensitive values removed, obviously):
result=`${LOCATION}/bin/winexe -U "user%password" //gateway "g:\\folder\\myscript.bat $1 $2"`
If I call this script from the terminal, it works perfectly fine, but if I call it from PHP, the web server hangs and doesn't come back to normal until I kill the processes using:
fuser -k -n tcp 80

Solved via:
https://stackoverflow.com/a/6016750/270302
I basically used proc_open() instead of shell_exec().
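For reference, a minimal sketch of the proc_open() pattern from that answer; the wrapper script path and the stderr log file below are placeholders, not values from the original question:

<?php
// Run the wrapper script with explicit pipes so the call can't block forever.
$descriptors = array(
    0 => array('pipe', 'r'),                     // child's stdin
    1 => array('pipe', 'w'),                     // child's stdout
    2 => array('file', '/tmp/winexe.err', 'a'),  // child's stderr (hypothetical log path)
);
$process = proc_open('/path/to/wrapper.sh arg1 arg2', $descriptors, $pipes);
if (is_resource($process)) {
    fclose($pipes[0]);                           // nothing to send on stdin
    $result = stream_get_contents($pipes[1]);    // read the script's output
    fclose($pipes[1]);
    $exitCode = proc_close($process);            // reap the child
}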

Related

End Process Created by PHP exec in Ubuntu using Apache

I have an Ubuntu VM running in VirtualBox that hosts a server using Apache. The concept of the server is to accept HTTP POST requests, store them in a MySQL database and then execute a Python script with the relevant POST data to be displayed in a Discord channel.
The process itself is working, but each time the PHP script calls the Python script, a new process is created that never actually ends. After a few hours of receiving live data the server runs out of available memory due to the amount of lingering processes. The PHP script has the following exec call as the last line of code:
exec("python3 main.py $DATA");
I would like to come up with a way to actually kill the processes created from this exec command (using user www-data), either in the Python file after the script is executed or automatically with an Apache setting that I probably just do not know about.
When running the following command in a terminal I can see the different processes:
ps -o pid,user,%mem,command ax | sort -b -k3 -r
There are 3 separate processes that show up, 1 referencing the actual python3 exec command as written in PHP:
9903 www-data 0.4 python3 main.py DATADATADATADATADATADATA
Then another process showing the more common -k start commands:
9907 www-data 0.1 /usr/sbin/apache2 -k start
And lastly another process very similar to the PHP exec command:
9902 www-data 0.0 sh -c python3 main.py DATADATADATADATADATADATA
How can I ensure Apache cleans these processes up - OR what do I need to add into my Python or PHP code to appropriately exec a Python script without leaving behind processes?
Didn't realize the exec command in PHP would wait for return output indefinitely. Added this to the end of the string I was using in my exec call: > /dev/null &
i.e.: exec("python3 main.py $DATA > /dev/null &");
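Note that > /dev/null only redirects stdout; if main.py writes to stderr, that descriptor can still keep the pipe to PHP open. A slightly more defensive variant (the 2>&1 and the escapeshellarg() call are my additions, not part of the original answer):

// Redirect both stdout and stderr, then background, so exec() returns immediately.
exec('python3 main.py ' . escapeshellarg($DATA) . ' > /dev/null 2>&1 &');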

PHP script hangs when calling external command from CLI

I have a PHP script that executes an external bash script to make an SSH connection, but even though I am using ssh's move-to-background option (-f) as well as an '&', my PHP script hangs.
Problem line in the PHP script:
system('/usr/local/bin/startProxy 172.16.0.5 9051');
I have also tried:
system('/usr/local/bin/startProxy 172.16.0.5 9051 &');
And the startProxy script is simply:
#!/bin/bash
#
# startProxy <IP_Address> <Proxy_Port>
#
# Starts an ssh proxy connection using -D <port> to remote system
#
ssh -o ConnectTimeout=5 -f -N -D $2 $1 &
Calling the startProxy script from the command line works fine, and the script returns immediately.
When the system() command is run, the called script does run (I can see ssh running via ps), it just never appears to return to the PHP script.
I have run into the same issue when trying to call ssh directly via the system() method as well.
Thanks @Martin.
For future self and others, I had to change the system call to
system('/usr/local/bin/startProxy 172.16.0.5 9051 2>&1 >/dev/null');
and to change the startProxy script to:
ssh -o ConnectTimeout=5 -f -N -D $2 $1 2>&1 >/dev/null
before PHP would return to the rest of the script!
Can anyone explain why PHP hangs like this when output is not redirected (I presume PHP isn't scanning the command and seeing the redirection part), and also why PHP hangs if I don't include the redirection in the ssh command within startProxy, despite the fact that both the PHP system() call and the ssh command in startProxy were using background options (-f and '&')?
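The likely explanation, for what it's worth: system() and exec() read the command's stdout until end-of-file, and a backgrounded child inherits that descriptor, so the pipe never closes even though the foreground shell exits; PHP doesn't parse the & at all, the shell does. A sketch of a fully detached call that hands none of the standard streams back to PHP (the < /dev/null is my addition):

// No stream inherited by the background ssh points back at PHP.
system('/usr/local/bin/startProxy 172.16.0.5 9051 > /dev/null 2>&1 < /dev/null');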

Use PHP over HTTP to run background bash with trap on exit

I'm trying to use PHP to trigger a bash script that should never stop running. It's not just that the command needs to run and I don't need to wait for output; it needs to continue running after PHP is finished. This has worked other times (and the question has been asked already); the difference seems to be that my bash script has a trap for when it's closed.
Here is my bash script:
#!/bin/bash
set -e
WAIT=5
FILE_LOCK="$1"
echo "Daemon started (PID $$)..."
echo "$$" > "$FILE_LOCK"
trap cleanup 0 1 2 3 6 15
cleanup()
{
    echo "Caught signal..."
    rm -rf "$FILE_LOCK"
    exit 1
}
while true; do
    # do things
    sleep "$WAIT"
done
And here is my PHP:
$command = '/path/to/script.sh /tmp/script.lock >> /tmp/script.log 2>&1 &';
$lastLine = exec($command, $output, $returnVal);
I see the script run, the lock file get created, then it exits, and the trap removes the lock file. In my /tmp/script.log I see:
Daemon started (PID 55963)...
Caught signal...
What's odd is that this only happens when running the PHP via Apache. From the command line it keeps running as expected.
The signal being caught by the trap is 0 (bash's EXIT pseudo-signal).
I've tried wrapping my command in a bash environment, like $command = '/bin/bash -c "' . addslashes($command) . '"';, and also tried adding nohup to the beginning. Nothing seems to be working. Is this possible to do for a never-ending script?
Found the problem thanks to @lxg.
My # do things command was giving errors, which caused the script to exit: set -e makes bash exit on the first failing command, which is what fired the EXIT trap. For some reason the errors themselves were suppressed.
When I removed set -e from the beginning of my bash script, I started seeing the errors in my log file. Not sure why they didn't show up before.
The issue was that my bash loop was running PHP commands. Even though my bash user and the Apache user are the same, for some reason they had different $PATHs. This meant that when running on the command line I was using a PHP 7 binary, but when Apache triggered bash commands it was using a PHP 5 binary (even though Apache itself is configured to use PHP 7). So the application errored out, and that is what caused the script to die.
The solution was to explicitly set the PHP binary path in my bash loop.
I was doing this with
BIN_PHP=$(which php)
But on a true command line it would return one value (/path/to/php7/bin/php) vs. a command line initiated by Apache (/path/to/php5/bin/php). Despite Apache running as the same user as my command line, it didn't load the ~/.bashrc which specified my correct PHP path.
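A quick way to confirm this kind of discrepancy from the web context (my own debugging suggestion, not from the original answer):

// Log which php binary the web server's environment resolves.
exec('which php', $out);
error_log('web PATH resolves php to: ' . implode(' ', $out));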

How to schedule a job from a PHP script via exec and 'at'

I'm trying to schedule a one-time job with the 'at' command. The script contains the following code:
$cmd = 'echo "/usr/bin/php '.$script_dir.$script_name.' '.$args.'"|/usr/bin/at "'.$time.'" 2>&1';
exec($cmd, $output , $exit_code);
When I run this command from the script it adds the job to the schedule. I can see this from the line in the logs: job 103 at Thu Sep 3 15:08:00 2015 ($output contains the same text). But then nothing happens at the specified time, as if at ignores the job. And there are no error messages in the logs.
When I run the same command with the same args from the command line on the server, it schedules the job and then runs it at the specified time.
I found out that when I schedule a job via the PHP script, it runs under the apache user. So I tried running the following on the command line on the server:
sudo -u apache echo "/usr/bin/php /var/www/pant/data/www/pant.com/scripts/Run.php firstarg secondarg "|/usr/bin/at "16:00 03.09.2015"
It works correctly too. I checked sudoers and added the apache user with NOPASSWD privileges. The script Run.php has execute rights.
at.deny is empty. at.allow does not exist.
So the question is: why does 'at' not run the command given via the PHP script (exec), but runs the same command from the command line? How can I get it to run?
Thanks to all.
I found the answer by chance at stackexchange.com:
The "problem" is typically PHP is intended to run as module in a webserver. You may need to install the commandline version of php before you can run php scripts from the commandline

PHP exec in background using & is not working

I am using this code on Ubuntu 13.04:
$cmd = "sleep 20 &> /dev/null &";
exec($cmd, $output);
But it actually sits there for 20 seconds and waits :/ Usually using & to send a process to the background works fine, but on this machine PHP just won't do it.
What could be causing this?
Try
<?php
$cmd  = '/bin/sleep';
$args = array('20');
$pid = pcntl_fork();
if ($pid == 0) {
    // Child: start a new session, then replace this process image
    // with the command; the child becomes a standalone detached process.
    posix_setsid();
    pcntl_exec($cmd, $args, $_ENV);
}
echo "DONE\n";
I tested it and it works.
Here you first fork the PHP process and then execute your task.
Or, if the pcntl module is not available, use:
<?php
$cmd = "sleep 20 &> /dev/null &";
exec('/bin/bash -c "' . addslashes($cmd) . '"');
The REASON this doesn't work as expected is that exec() hands the command string to /bin/sh, and on Ubuntu /bin/sh is dash, not bash. dash doesn't understand bash's &> redirection shorthand, so sleep's output is never actually redirected; the backgrounded sleep keeps the stdout pipe to PHP open, and exec() waits for end-of-file on that pipe until sleep finishes.
The same applies to the rest of the redirection: it is the shell that parses it, not exec() itself.
So, you either need to find a way to fork your process (as described above), or run the command through a shell that understands the syntax, as in the /bin/bash -c variant.
My workaround on Ubuntu 13.04 with Apache2 and any version of PHP: libssh2-php. I just used nohup $cmd & inside a local SSH session opened from PHP, and it ran just fine in the background. Of course this requires putting certain security measures in place, such as enabling SSH access for the webserver user (so it has exec-like permissions) and only allowing localhost to log in to the webserver's SSH account.
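A minimal sketch of that approach, assuming the PECL ssh2 extension; the host, account, password and command below are all placeholders:

// Open a local SSH session and launch the job fully detached with nohup,
// so Apache's PHP process never holds the child's pipes.
$conn = ssh2_connect('127.0.0.1', 22);
ssh2_auth_password($conn, 'webexec', 'secret');   // placeholder credentials
$stream = ssh2_exec($conn, 'nohup /path/to/task.sh > /dev/null 2>&1 &');
fclose($stream);                                  // don't wait for any output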
