PHP ssh2_shell execute command and get response

I am connecting to an ssh server and using ssh2_shell to create an interactive bash shell.
The problem I have is how to send a command to the shell and get the response for the command accurately.
This is what I am currently doing:
$stream = ssh2_connect('example.com', 22);   // host elided in the original
ssh2_auth_password($stream, 'user', 'pass'); // authentication elided in the original
$shell = ssh2_shell($stream, 'bash');
fwrite($shell, "whoami\n");
fwrite(STDOUT, fread($shell, 2048) . PHP_EOL);
The problem with the above code is that what gets written to STDOUT (for command-line viewing) is the welcome banner, something like "Last login to server ...". I have to wait a second or two and then read from the shell again to get the output of the command I sent.
Now I can solve this by doing sleep(1) after any command I send, but this is very hacky: if the command takes longer than a second I will not get the results, and if it takes less than a second I waste time.
I tried stream_set_blocking($shell, true) before writing the command and calling fread, but the problem persists unless I sleep after the command.
Basically: how can I use ssh2_shell, send a command, and get the response for that command regardless of how long it takes, without timing out the script by reading from a blocked stream after the shell has already returned its contents?

What you describe is the intended behavior of ssh2_shell: it is interactive, which means you would need to work with timeouts.
If you want a command or script executed on the remote host and just want its output, you need non-interactive execution: ssh2_exec.
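A minimal sketch of the non-interactive approach (host and credentials are placeholders, not from the question):
<?php
$conn = ssh2_connect('example.com', 22);      // placeholder host
ssh2_auth_password($conn, 'user', 'pass');    // or ssh2_auth_pubkey_file()
$stream = ssh2_exec($conn, 'whoami');         // one command, one stream
stream_set_blocking($stream, true);           // block until the command finishes
echo stream_get_contents($stream) . PHP_EOL;  // the command's stdout
fclose($stream);
Because the stream is blocking, stream_get_contents() returns only when the remote command has finished, however long that takes, and you never see the login banner.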

Related

PHP runs Perl script from command line successfully but I get no output displayed

I have a problem displaying the results of a Perl script that I am calling from my PHP webpage. The Perl script constantly monitors a socket and will display the output of this when run from the command line and also saves the output to a file. I know the Perl script is being called and running successfully as the text file is being updated but I do not get the output on the webpage as I was hoping for.
I have tried using system(), exec(), and passthru(), and they all allow the Perl script to run, but still with no output on the webpage, so I am obviously missing something. Am I using the correct functions? Is there a parameter that I need to add to one of the above to push the output back to the webpage that calls the Perl script?
One example of what I have tried from the PHP manual pages:
<?php
exec('perl sql.pl', $output, $retval);
echo "Returned with status $retval and output:\n";
print_r($output);
?>
Edited to include output example as text instead of image as requested.
# perl sql.pl
Connecting to the PBX 192.168.99.200 on port 1752
04/07 10:04:50 4788 4788 3256739 T912 200 2004788 A2003827 A
I'm no PHP expert, but I guess that exec waits for the external program to finish executing before populating the $output and $retval variables and returning.
You say that your sql.pl program "constantly monitors a socket". That sounds like it doesn't actually exit until the user closes it (perhaps with a Ctrl-C or a Ctrl-Z). So, presumably, your PHP code sits there waiting for your Perl program to exit - but it never does.
So I think there are a few approaches I'd investigate.
Does sql.pl have a command-line option that tells it to run once and then quit?
Does PHP have a way to send a Ctrl-C or Ctrl-Z to sql.pl a second or so after you've started it?
Does PHP have a way to deal with external programs that never end? Can you open a pipe to the external process and read output from it a line at a time? (See the sketch below.)
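For that last option, popen() gives you a pipe you can read line by line while the Perl program keeps running. A minimal sketch, assuming the same perl sql.pl command line (the flush() and the line limit are illustrative assumptions):
<?php
$pipe = popen('perl sql.pl', 'r');           // read-only pipe to the Perl process
$lines = 0;
while (($line = fgets($pipe)) !== false) {   // blocks until the next line arrives
    echo htmlspecialchars($line) . "<br>\n";
    flush();                                 // push each line to the browser as it comes
    if (++$lines >= 100) break;              // bail out, since sql.pl never exits on its own
}
pclose($pipe);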

Run script on other Server

I have 2 websites, hosted on 2 different servers. They are somewhat interlinked: sometimes I do something on Website-1 and then run a script on Website-2. For example, I edit something on Website-1 and then want to run a script on Website-2 to update things accordingly on its server.
Till now I have been using the following code on Website-1.
$file = file_get_contents('Website-2/update.php');
But the problem with this is that my Website-1 script stops running and waits for the file to return some data, and I don't want to do anything with that data; I just want to run the script.
Is there a way I can do this better, or tell PHP to move on to the next line of code?
If you want to call the second site without making your user wait for a response, I would recommend using a message queue:
The Site 1 request puts a message on the queue.
A cron job checks the queue and runs the update on Site 2 when a message exists.
Common queue apps to look at:
https://aws.amazon.com/sqs/?nc2=h_m1
https://beanstalkd.github.io/
https://www.iron.io/mq
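The shape of the pattern, with a toy file-based queue standing in for a real broker (paths, file format, and the cron script are all illustrative assumptions):
<?php
// Site 1: enqueue an update request instead of calling Site 2 synchronously
file_put_contents('/var/queue/updates.log',
    json_encode(['task' => 'update', 'at' => time()]) . "\n",
    FILE_APPEND | LOCK_EX);

<?php
// Site 2, run from cron every minute: drain the queue and do the work
$file = '/var/queue/updates.log';
if (is_file($file) && ($jobs = file($file, FILE_IGNORE_NEW_LINES))) {
    unlink($file);                            // claim the batch (toy locking only)
    foreach ($jobs as $job) {
        $task = json_decode($job, true);
        // ... run the actual update here ...
    }
}
A real broker such as SQS or beanstalkd adds the durability, retries, and locking that this sketch glosses over.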
What you're trying to achieve is called a web hook, and it should be implemented with proper authentication, so that not just anybody can execute your scripts at any time and overload your server.
On server 2 you need to execute your script asynchronously via workers, threads, message queues or similar.
You can also issue the request from server 1 asynchronously. There are many ways to achieve this, including async curl requests in PHP; this write-up covers several: https://segment.com/blog/how-to-make-async-requests-in-php/
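One of the simplest fire-and-forget variants: open a socket, write the HTTP request, and close without reading the response. A sketch with a placeholder host and path:
<?php
// fire-and-forget: send the request, then move on without waiting for a reply
$fp = fsockopen('website-2.example', 80, $errno, $errstr, 2); // 2-second connect timeout
if ($fp) {
    fwrite($fp, "GET /update.php HTTP/1.1\r\n"
              . "Host: website-2.example\r\n"
              . "Connection: Close\r\n\r\n");
    fclose($fp);    // don't read the response
}
The receiving script should still finish quickly or hand off to a background worker, since some servers abort a script when the client disconnects.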
Call your remote server as normal. But in the PHP script you normally call, take all the functionality and put it in a third script. Then, from the old script, call the new one with (on Linux):
exec('php -f "{path to new script}.php" ' . $args . ' > /dev/null &');
The & at the end makes this a background (non-blocking) call. Because you call it from the remote server, you don't have to change anything on the calling server. php -f runs a PHP file. The > /dev/null sends the output from that file to the garbage.
On Windows you can use COM and WScript.Shell to do the same thing:
$WshShell = new \COM('WScript.Shell');
$oExec = $WshShell->Run('cmd /C php {path to new script}.php', 0, false);
You may want to use escapeshellarg on the filename and any arguments supplied.
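For instance ($script and $arg here are hypothetical stand-ins for the real path and argument):
exec('php -f ' . escapeshellarg($script) . ' ' . escapeshellarg($arg) . ' > /dev/null &');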
So it will look like this
Server1 calls Server2
Script that was called (on Server2) runs exec and kicks off a background job (Server2) then exits
Server1 continues as normal
Server2 continues the background process
So using your example instead of calling:
file_get_contents('Website-2/update.php');
You will call
file_get_contents('Website-2/update_kickstart.php');
In update_kickstart.php put this code
<?php
exec('php -f "{path}update.php" > /dev/null &');
Which will run update.php as a separate background (non-blocking) call. Because it's non-blocking, update_kickstart.php will finish and return to server1, which can go about its business, while update.php runs on server2 independently.
Simple...
One last note: file_get_contents is a poor choice here. I would use SSH, probably via phpseclib 2.0, to connect to server2 and run the exec command directly with a user that has access only to that file (chroot it or something similar). As it stands, anyone can call that file and run it. Behind an SSH login it's protected, and chrooted, that "special" user can only run that one file.
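A minimal sketch of that approach with phpseclib 2.0 (host, credentials, and path are placeholders; Composer's autoloader is assumed):
<?php
require 'vendor/autoload.php';
use phpseclib\Net\SSH2;

$ssh = new SSH2('server2.example');               // placeholder host
if (!$ssh->login('limited-user', 'secret')) {     // locked-down account
    exit('SSH login failed');
}
// kick off the update in the background and return immediately
$ssh->exec('php -f /home/limited-user/update.php > /dev/null &');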

PHP wait for exec to finish and output the result

I have a PHP file that takes a couple of parameters from the URL and then runs an exec command, whose results I want to wait for and display. This exec command takes about 20-30 seconds to finish, and the page never completes: it just gets an nginx 502 Bad Gateway error (it times out). Instead of extending nginx's timeout, since it's bad practice to have a connection hang for that long, how can I run the exec in the background and then have its result returned on the page after it completes?
Or is there a better way to accomplish this without using php?
Have PHP trigger an exec of a script that is forked so it runs in the background (the command ends with &). The page should then return some JS that periodically polls the server via AJAX requests to check the original script's status. The script should write its STDOUT and STDERR to unique file(s) so that the polling endpoint can check the status.
Edit: If you need to know the script's exit code wrap it in another script:
#!/bin/bash
uniqId=$1
yourscript $uniqId &
myPid=$!
echo $myPid > $uniqId'.pid'
wait $myPid
echo $? > $uniqId'.returned'
Call it like ./wrapper.sh someIdUniqueToClient &. Untested, but you get the gist.
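On the PHP side, the kick-off and the polling endpoint might look like this (the .log and .returned file names follow the wrapper above; everything else is an assumption):
<?php
// start.php: launch the wrapper in the background and hand the id to the client
$id = uniqid('job', true);
exec('./wrapper.sh ' . escapeshellarg($id) . ' > ' . escapeshellarg("$id.log") . ' 2>&1 &');
echo json_encode(['id' => $id]);

<?php
// status.php: polled via AJAX until the job writes its exit code
$id = basename($_GET['id']);                  // crude sanitisation of the client-supplied id
if (is_file("$id.returned")) {
    echo json_encode([
        'done'   => true,
        'exit'   => (int) trim(file_get_contents("$id.returned")),
        'output' => file_get_contents("$id.log"),
    ]);
} else {
    echo json_encode(['done' => false]);
}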

curl not executing properly when invoked by php

So I want to execute a bash command from PHP on my web server. I can do this using shell_exec. However, one of the commands I want to execute is curl. I use it to send a .wav file to another server and record its response. But when invoked from PHP, curl doesn't work.
I reduced the error to the following small example. I have a script named php_script.php which contains:
<?php
$ver = shell_exec("curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver");
echo $ver;
The curious thing is that when I run this php script from command line using php php_script.php, the result I get is
Status: 500 Internal Server Error
Content-type: text/html
However, if I run curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver directly, I get the response I was expecting:
verdict = authentic
(Edit:) I should probably mention that if I put some bash code inside the shell_exec argument which does not contain curl, the bash command executes fine. For example, changing the line to $ver = shell_exec("echo hello > world"); puts the word "hello" into the file "world" (provided it exists and is writable). (End edit.)
Something is blocking the execution of curl when it is invoked from PHP. I thought this might be PHP running in safe mode, but I found no indication of this in php.ini. (Is there a way to test this to make 100% sure?) What's blocking curl and, more importantly, how can I bypass or disable this block?
(And yes, I realize PHP has a curl library. However, I prefer to use commands I can run from the command line as well, for debugging purposes.)
cheers,
Alan
The likely reason is privileges: when you run the command directly, you run it as your own shell user (possibly root) and the command executes; when you run it through PHP, it runs as the web-server user, which by default does not have the same privileges or environment.
You can change the settings for shell_exec through CPanel or the Apache config file, but granting shell_exec access to the web user is not recommended, as it helps attackers compromise the server, so proper care should be taken.
It would be more appropriate to use the curl library provided in PHP.
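A sketch of the same upload with PHP's curl extension (the field name file and the URL are taken from the question; error handling is minimal):
<?php
$ch = curl_init('http://otherserver');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, [
    'file' => new CURLFile('uploads/2013-7-24-17-31-43-29097-flash.wav'),
]);
$ver = curl_exec($ch);
if ($ver === false) {
    $ver = 'curl error: ' . curl_error($ch);
}
curl_close($ch);
echo $ver;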

Initiating background process for running php file as background from another php page

I want to initiate one PHP page as a background process from another PHP page.
Use popen():
$command = 'php somefile.php';
pclose(popen($command,'r'));
This launches somefile.php as a background process.
This is a technique I used to get around restrictions applied by my web host (who limited cron jobs to 15 minutes of execution time, so my backup scripts would always time out).
exec( 'php somefile.php > /dev/null &' );
The breakdown of this line is:
exec() - PHP reference Runs the specified command, as if from the Linux Command Line.
php somefile.php: Invokes PHP to open, and run, somefile.php. This is the same behaviour as what would happen if that file was accessed through a web browser.
| ("pipe") - Sends the output of the proceeding command to a specified target. In this instance, it would "pipe" the content which would normally be read by the web browser accessing the file.
/dev/null - A blackhole. No, not kidding. It is a place where you send output if you just want it to disappear.
& - Appending this character to the end of a Linux command means "Do not wait - Send this to the background and continue."
So, in summary, the provided code will execute a PHP script, return no output, and not wait for it to finish before continuing onto the next line.
(And, as always, if any of these assumptions on my part are in error, I would love to be corrected by more knowledgeable members of the community.)
You have to make sure that the background process is not terminated when the processing of the page finishes. If you are on a Linux system, you could try to use the nohup command:
$command = 'nohup php somefile.php';
pclose(popen($command,'r'));
If it still gets terminated, you could try the "daemon" command.
