ssh2_exec process failing before completion (PHP/Bash)

I've got a PHP script that executes a bash script on a remote server which fires a number of processes.
<?php
$connection = ssh2_connect('address1.com', 22);
ssh2_auth_password($connection, 'user', 'pass');
$stream = ssh2_exec($connection, '/root/incoming/process.sh');
?>
The bash script process.sh works fine when executed locally on the remote server, no issues.
#!/bin/bash
wget -O /root/incoming/myfile.mp3 http://address2.com/myfile.mp3;
lame --decode /root/incoming/myfile.mp3 - | /usr/settings/stereo_tool_cmd_64 - - -s /usr/settings/setting.sts | lame -b 128 - /var/www/processed/myfile.mp3
But when I try to execute it remotely using the PHP script it bombs out at various stages of the first process (wget). It doesn't even complete the wget download, stopping at random stages of the transfer.
Is this an issue with PHP ssh2_exec?
Or am I missing something?

Found it after much hunting.
My PHP script doesn't require any feedback from the shell script; I just needed to start it and forget about it.
What solved my problem was the following:
$stream = ssh2_exec($connection, "/root/incoming/process.sh &> /dev/null &");
Hope it helps somebody else.
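For completeness, here is the whole connect-and-detach flow as a minimal sketch; the nohup and the POSIX-portable redirection are my additions in case the remote account's login shell is not bash (the original &> form works under bash):
<?php
// Fire-and-forget: redirect all output and background the job so
// ssh2_exec() does not hold the channel open waiting for output.
$connection = ssh2_connect('address1.com', 22);
ssh2_auth_password($connection, 'user', 'pass');
ssh2_exec($connection, 'nohup /root/incoming/process.sh > /dev/null 2>&1 &');
// No need to read the returned stream; the remote job keeps running
// even after this PHP script exits.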

Add set_time_limit(0); to the beginning of the script. PHP usually defaults to timing out after 30 seconds.
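As a minimal sketch, combined with the question's own code:
<?php
set_time_limit(0); // disable PHP's default 30-second execution limit
$connection = ssh2_connect('address1.com', 22);
ssh2_auth_password($connection, 'user', 'pass');
$stream = ssh2_exec($connection, '/root/incoming/process.sh');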

Related

problem with execute command on multiple servers using php and bash on linux server

I'm setting up a new app and want to execute a script on all my servers (16 of them). What am I doing wrong?
my bash file:
for (( c=1; c<=16; c++ ))
do
    URL="http://s$c.domain.com/api.php?script=$scriptl&a1=$port&a2=$AUTHKEY"
    screen -dmS apprun$c wget -q $URL  # run inside screen so the loop doesn't wait for each wget
done
my api.php:
<?php
if (isset($_GET['script'])) {
    $script = $_GET['script'];
    $port   = $_GET['a1'];
    $cmd    = "cd /home/scripts/ && perl $script $port";
    shell_exec($cmd);
}
I expect all instances to start immediately on every server, but some run perfectly and some never run at all. What can I change to fix this and speed it up?
It is difficult to tell, because there is no error message from the servers which won't run it. Those machines might not have perl installed and therefore cannot interpret the script, or the script to run might be missing. Connect through ssh and try to run it manually; then you should see why it fails.
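Another suspect: shell_exec() in api.php blocks until perl exits, so slow jobs depend on how long the wget request stays open. A sketch of a detached variant; the escapeshellarg() calls and nohup are my additions, not part of the original code:
<?php
// Escape the GET parameters and detach the perl process so the
// HTTP request returns immediately instead of waiting for the script.
if (isset($_GET['script'], $_GET['a1'])) {
    $script = escapeshellarg($_GET['script']);
    $port   = escapeshellarg($_GET['a1']);
    shell_exec("cd /home/scripts/ && nohup perl $script $port > /dev/null 2>&1 &");
}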

PHP exec() won't abort when running 'pdf2swf -Q 10'

I have a PDF file that makes the pdf2swf tool run forever. When running the following command manually:
pdf2swf -Q 10 test.pdf
the process aborts after 10 seconds because of the -Q 10 flag, but when running the same command from PHP the script runs forever. I've tried shell_exec(), exec() and passthru(), and all of them ignored the -Q flag.
Has anyone encountered anything like this with the pdf2swf tool, or with any other command run through PHP?
EDIT
When I run it manually with
php -r "exec('pdf2swf -Q 10 test.pdf');"
it aborts after 10 seconds, but when running as a daemon, again, it won't abort.
I tried the same command in my test project and it worked fine with no issues. Please find my code below:
$out = exec("pdf2swf font_example.pdf -o varun.swf", $rest);
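If the -Q flag keeps being ignored in the daemon environment, one workaround is to enforce the timeout on the PHP side instead. A sketch using proc_open() and proc_terminate(); the 15-second deadline is an arbitrary assumption:
<?php
// Run pdf2swf with its output discarded, and kill it ourselves if it
// exceeds a deadline, independently of whether -Q takes effect.
$spec = array(
    1 => array('file', '/dev/null', 'w'),  // stdout
    2 => array('file', '/dev/null', 'w'),  // stderr
);
$proc  = proc_open('pdf2swf -Q 10 test.pdf', $spec, $pipes);
$start = time();
while (true) {
    $status = proc_get_status($proc);
    if (!$status['running']) {
        break;                     // finished on its own
    }
    if (time() - $start > 15) {
        proc_terminate($proc, 9);  // SIGKILL after 15 seconds
        break;
    }
    usleep(200000);                // poll every 0.2 s
}
proc_close($proc);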

PHP exec in background using & is not working

I am using this code on Ubuntu 13.04:
$cmd = "sleep 20 &> /dev/null &";
exec($cmd, $output);
It actually sits there for 20 seconds and waits. Usually using & to send a process to the background works fine, but on this machine PHP just won't do it.
What could be causing this?
Try
<?php
$cmd  = '/bin/sleep';
$args = array('20');
$pid = pcntl_fork();
if ($pid == 0) {
    posix_setsid();
    pcntl_exec($cmd, $args, $_ENV);
    // child becomes the standalone detached process
}
echo "DONE\n";
I tested it and it works.
Here you first fork the PHP process and then execute your task in the child.
Or, if the pcntl module is not available, use:
<?php
$cmd = "sleep 20 &> /dev/null &";
exec('/bin/bash -c "' . addslashes($cmd) . '"');
The REASON the original doesn't work: PHP's exec() hands the command string to /bin/sh -c, and on Ubuntu /bin/sh is dash, which does not understand bash's &> redirection shorthand. The output therefore never gets redirected, the backgrounded sleep keeps PHP's output pipe open, and exec() blocks until that pipe closes, i.e. until sleep finishes.
So you either need to fork your process (as described above), use POSIX-compatible redirection, or run the subprocess under bash explicitly.
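Following that reasoning, plain POSIX redirection returns immediately without needing bash at all; a quick sketch of the same sleep example:
<?php
// dash understands this form: both output handles are redirected,
// the job is backgrounded, and exec() comes back right away.
exec('sleep 20 > /dev/null 2>&1 &');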
My workaround to do this on Ubuntu 13.04 with Apache2 and any version of PHP: libssh2-php. I just ran nohup $cmd & inside a local SSH session opened from PHP and it ran just fine in the background. Of course this requires putting certain security measures in place, such as enabling SSH access for the webserver user (so it has exec-like permissions) and then only allowing localhost to log in to that account.

php script to kill shell 'sleep' process

I need to write some scripts for some automation work.
I put a PHP file on a local Apache server,
test.php
<?php
system("bash inform.sh");
?>
the content of inform.sh is:
#!/bin/bash
proc_id=`ps -ef|grep "sleep"|grep -v "grep"|awk '{print $2}'`
kill -9 $proc_id
I run a sleep process in a shell and open the PHP page in Firefox at localhost/test.php,
but it doesn't kill the sleep process.
If I run the PHP script directly from the shell, it works.
What's wrong with this and how do I deal with it? Thanks.
Apache runs PHP as its own user, and a process can only send signals to processes owned by that same user, so the kill issued from the web page fails against a sleep started under your own account. I started the sleep process as the apache user instead, and then it worked:
sudo -u apache sleep 2000
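To confirm the ownership theory on your own setup, you can check which user the web server's PHP actually runs as; a small sketch using the posix extension:
<?php
// kill(2) only succeeds against processes owned by the same (effective)
// user, so this shows whose processes the web PHP is allowed to signal.
$user = posix_getpwuid(posix_geteuid());
echo $user['name'], "\n"; // e.g. "apache" or "www-data"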

How to check if a php script is still running

I have a PHP script that listens on a queue. Theoretically, it's never supposed to die. Is there something to check whether it's still running? Something like Ruby's God (http://god.rubyforge.org/) for PHP?
God is language-agnostic, but it would be nice to have a solution that works on Windows as well.
I had the same issue of wanting to check whether a script is running, so I came up with this and run it as a cron job. It grabs the running processes as an array and cycles through each line, checking for the file name. It seems to work fine. Replace #user# with your script's user.
exec("ps -U #user# -u #user# u", $output, $result);
foreach ($output as $line) if (strpos($line, "test.php")) echo "found";
In linux run ps as follows:
ps -C php -f
You could then do in a php script:
$output = shell_exec('ps -C php -f');
if (strpos($output, "php my_script.php") === false) {
    shell_exec('php my_script.php > /dev/null 2>&1 &');
}
The above code lists all running php processes in full, then checks whether "my_script.php" is among them; if not, it starts the process in the background without waiting for it to terminate.
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php 2>&1 | mail -s "Daemon stopped" you@example.org
Edit:
Technically, this invokes the mailer right away, but the command only completes when the PHP script ends. Doing this captures the output of the PHP script and includes it in the mail body, which can be useful for debugging what caused the script to halt.
Simple bash script
#!/bin/bash
while true; do
    if ! pidof -x script.php > /dev/null; then
        php script.php &
    fi
    sleep 5  # avoid a busy loop between checks
done
Not for windows, but...
I've got a couple of long-running PHP scripts, each with a shell script wrapping it. You can optionally return a value from the PHP script that the shell script checks in order to exit, restart immediately, or sleep for a few seconds and then restart.
Here's a simple one that just keeps running the PHP script till it's manually stopped.
#!/bin/bash
clear
date
php -f cli-SCRIPT.php
echo "wait a little while ..."; sleep 10
exec $0
The "exec $0" restarts the script, without creating a sub-process that will have to unravel later (and take up resources in the meantime). This bash script wraps a mail-sender, so it's not a problem if it exits and pauses for a moment.
Here is what I did to combat a similar issue. It helps if anyone else has a parameterized PHP script that cron executes frequently but where only one execution should run at any time. Add this to the top of your PHP script, or create a common method:
$runningScripts = shell_exec('ps -ef | grep '.strtolower($parameter).' | grep '.dirname(__FILE__).' | grep '.basename(__FILE__).' | grep -v grep | wc -l');
if ($runningScripts > 1) {
    die();
}
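A race-free variant of the same single-instance idea, sketched with PHP's flock(); the lock-file path is a placeholder:
<?php
// Only one process can hold the exclusive lock; a second instance
// started by cron fails the non-blocking flock() and exits at once.
$lock = fopen('/tmp/my_script.lock', 'c'); // placeholder path
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    die("Already running\n");
}
// ... do the actual work here ...
flock($lock, LOCK_UN);
fclose($lock);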
You can write in your crontab something like this:
0 3 * * * /usr/bin/php -f /home/test/test.php my_special_cron
Your test.php file should look like this:
<?php
php_sapi_name() == 'cli' || exit;
if (isset($argv[1])) {
    substr_count(shell_exec('ps -ax'), $argv[1]) < 3 || exit;
}
// your code here
That way you will have only one active instance of the cron job with my_special_cron as the process key, and you can add more jobs within the same php file:
test.php system_send_emails sendEmails
test.php system_create_orders orderExport
Inspired by Justin Levene's answer and improved on it, since ps -C doesn't work on Mac, which I needed in my case. You can use this in a PHP script (maybe just before you need the daemon alive); tested on both Mac OS X 10.11.4 and Ubuntu 14.04:
$daemonPath = "FULL_PATH_TO_DAEMON";
$runningPhpProcessesOfDaemon = (int) shell_exec("ps aux | grep -c '[p]hp ".$daemonPath."'");
if ($runningPhpProcessesOfDaemon === 0) {
    shell_exec('php ' . $daemonPath . ' > /dev/null 2>&1 &');
}
Small but useful detail: why grep -c '[p]hp ...' instead of grep -c 'php ...'?
Because the grep command's own process also contains 'php ...' on its command line, a plain grep -c 'php ...' would count itself as a match. Wrapping the first letter in brackets makes grep's command line different from the pattern it searches for, so it no longer matches itself.
One possible solution is to have it listen on a port using the socket functions. You can check that the socket is still listening with a simple script; even a monitoring service like Pingdom could monitor its status. If the script dies, the socket is no longer listening.
Plenty of solutions. Good luck.
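A minimal sketch of that check, assuming the daemon listens on localhost port 8000 (both host and port are placeholders):
<?php
// If the connect fails, the daemon is presumed dead and whatever runs
// this check can alert or restart it.
$fp = @fsockopen('127.0.0.1', 8000, $errno, $errstr, 2);
if ($fp === false) {
    echo "Daemon not listening: $errstr ($errno)\n";
} else {
    fclose($fp); // still alive
}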
If you have your hands on the script, you can just have it write a timestamp to the database every so often, and then let a cron job check whether that value is up to date.
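Sketched below with a plain file instead of a DB column; the mechanism is the same, and the path and 300-second threshold are assumptions:
<?php
// Worker side: touch the heartbeat on every iteration of the queue loop.
file_put_contents('/tmp/worker.heartbeat', time());

// Cron side: flag the worker as dead if the heartbeat has gone stale.
$last = (int) @file_get_contents('/tmp/worker.heartbeat');
if (time() - $last > 300) {
    echo "Worker heartbeat is stale; restart it\n";
}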
troelskn wrote:
Just append a second command after the script. When/if it stops, the second command is invoked. Eg.:
php daemon.php | mail -s "Daemon stopped" you@example.org
This will call mail each time a line is printed in daemon.php (which should be never, but still.)
Instead, use the double-ampersand operator to separate the commands, i.e.
php daemon.php && mail -s "Daemon stopped" you@example.org
If you're having trouble checking for the PHP script directly, you can make a trivial wrapper and check for that. I'm not sufficiently familiar with Windows scripting to show how it's done there, but in Bash it'd look like:
wrapper_for_test_php.sh
#!/bin/bash
php test.php
Then you'd just check for the wrapper as you'd check for any other bash script: pidof -x wrapper_for_test_php.sh
I used cmder on Windows, and based on this script I came up with the following, which I later managed to deploy on Linux.
#!/bin/bash
clear
date
while true
do
php -f processEmails.php
echo "wait a little while for 5 secobds...";
sleep 5
done
