So I have 5 cloud machines running, and my 1st one is set up as an Apache server. My goal is that when users click the submit button on the webpage, it runs parallel-ssh on my 1st machine and launches a script on the other cloud machines. I have the webpage running and the script running, and this is my attempt to launch parallel-ssh from index.php.
"master.txt" is a file on the 1st cloud machine that holds the info about the other cloud machines, StrictHostKeyChecking is used to skip the host-key checks, and the perl command is what will be launched on all the cloud machines. I know this question is fairly confusing, but I'm new to both PHP and Perl and I really need an answer for this project. Sorry it's one big command; I had to break it into lines because it wouldn't display right on here.
Maybe you'd have an easier time if you connected to each server through libssh2 or phpseclib and ran commands on each of the machines like that?
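For example, here is a minimal sketch using phpseclib 3 (installed with composer require phpseclib/phpseclib); the host list, username, and key path are placeholders, and it runs the hosts one at a time rather than in parallel, which is usually fine for five machines:
<?php
// Minimal sketch using phpseclib 3. Hosts, username, and key path are
// placeholders -- adapt them to whatever you keep in master.txt.
require 'vendor/autoload.php';

use phpseclib3\Net\SSH2;
use phpseclib3\Crypt\PublicKeyLoader;

$hosts = ['10.0.0.2', '10.0.0.3', '10.0.0.4', '10.0.0.5'];
$key   = PublicKeyLoader::load(file_get_contents('/var/www/.ssh/id_rsa'));

foreach ($hosts as $host) {
    $ssh = new SSH2($host);
    if (!$ssh->login('divx', $key)) {
        echo "Login failed for $host\n";
        continue;
    }
    // Same perl invocation you would hand to parallel-ssh; build $args from
    // your form inputs with escapeshellarg(), as in the answer below.
    $args = escapeshellarg($_POST['input'] ?? '');
    echo $ssh->exec('perl /mnt/nas-storage/EncoderSetup/commTools/con.pl ' . $args);
}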
This is a big command.
Might I suggest this instead, so it can be better understood?
In the following code, I use escapeshellarg a lot to make sure all the shell arguments are properly escaped and are not open to injection. How much this matters depends on whether your variables are trusted, but it shouldn't hurt unless an argument variable is actually meant to carry multiple arguments or something else uncommon.
<?php
$result = shell_exec(
'parallel-ssh -h master.txt -O StrictHostKeyChecking=no ' . // pssh with the host list; the ssh option goes after -O
'-t 5 ' . // 5 second timeout on each host
'-l divx ' . // User
'-i ' . // Inline mode, used for troubleshooting; take this out once it works.
'-P ' . // Print the output. This will only return it so it is stored in $result
escapeshellarg(
'perl /mnt/nas-storage/EncoderSetup/commTools/con.pl ' . // Executes a perl file
escapeshellarg($input) . ' ' . // $input arg to perl command
escapeshellarg($output) . ' ' . // $output arg to perl command
escapeshellarg($intraperiod) . ' ' . // $intraperiod arg to perl command
escapeshellarg($res) . ' ' . // $res arg to perl command
escapeshellarg($qp) . ' ' . // $qp arg to perl command
escapeshellarg($framerate) . ' ' . // $framerate arg to perl command
escapeshellarg($startframe) . ' ' . // $startframe arg to perl command
escapeshellarg($numgop) . ' ' . // $numgop arg to perl command
escapeshellarg($enc) . ' ' . // $enc arg to perl command
escapeshellarg($cfg) . ' ' . // $cfg arg to perl command
escapeshellarg($sao) . ' ' . // $sao arg to perl command
escapeshellarg($wafrosync) . ' ' . // $wafrosync arg to perl command
escapeshellarg($amp) . ' ' . // $amp arg to perl command
escapeshellarg($tmvp) . ' ' . // $tmvp arg to perl command
escapeshellarg($transkp) . ' ' . // $transkp arg to perl command
escapeshellarg($fasttranskp) . ' ' . // $fasttranskp arg to perl command
escapeshellarg($goploc) // $goploc arg to perl command
)
);
print $result;
This should work for you but there are some things to consider.
First, execute it and print out the $result to see what the actual output is. If you get something like
[FAILURE] server.hostname Exited with error code 255
Then it is possible that pssh is asking for a password for each host. I noticed that you are using the -A option, which asks for a password. You can't do that with shell_exec in PHP, because the script will hang and wait forever for a password. Instead, you need to set up SSH keys so that your first cloud server can ssh into the other cloud servers without a password. Setting up SSH public-key authentication is actually pretty easy, though less so if you've never done it before, and there are plenty of posts on how to set it up. The procedure is basically:
Generate a public and private key (No passphrase).
Type in this command at your first cloud server: ssh-keygen
Don't enter a passphrase when it asks you
Copy the id_rsa.pub file to the ~/.ssh/authorized_keys file on each of the secondary cloud servers
Make sure the .ssh folder has 700 permissions on each of the cloud servers
Make sure the .ssh/authorized_keys file has 600 permissions on each of the cloud servers.
If all went as planned, you should be able to execute commands on each of the cloud servers from your main cloud server securely and without a password. Now, you can just run your command and it should work.... or at least give you output as to why it didn't so you can continue to troubleshoot.
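A quick way to confirm from PHP that the keys actually work before going back to parallel-ssh is a one-line test; this is just a sketch, and the user, host, and options are placeholders:
<?php
// Sanity check, run from the web server, to confirm key-based SSH works for
// the user PHP runs as. "divx" and the IP are placeholders; BatchMode=yes
// makes ssh fail immediately instead of hanging on a password prompt.
$check = shell_exec('ssh -o BatchMode=yes -o StrictHostKeyChecking=no divx@10.0.0.2 echo ok 2>&1');
echo trim($check) === 'ok' ? "Key-based auth works\n" : "Key-based auth failed: $check\n";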
Another concern is the user that shell_exec is run as. If you are running a web server on your main cloud server, then you will have to make sure that the current user (usually apache) has the id_rsa file in the .ssh folder wherever your apache home directory is (usually /var/www/). So you would put the id_rsa file in the /var/www/.ssh/ folder and make sure it is owned by apache. Also, make sure it is chmod 600 to protect it.
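If you're not sure which user and home directory that actually is on your setup, a small diagnostic sketch run from the same page will tell you:
<?php
// Diagnostic sketch: show which user PHP runs as and where its home directory
// is, so you know which .ssh folder needs the id_rsa key.
$user = trim(shell_exec('whoami'));
$home = trim(shell_exec('echo $HOME'));
echo "Running as: $user\n";
echo "Home directory: $home\n";
echo "Private key present: " . (file_exists($home . '/.ssh/id_rsa') ? 'yes' : 'no') . "\n";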
There are other security concerns like protecting your id_rsa file. Don't run any untrusted scripts on your server or use any virtual hosts with users that upload their own files for their own websites. The security concern comes into play because any script that is run as apache can easily access, and compromise your id_rsa file... yikes. Anyone who has access to this file will easily gain access to each of your cloud servers... so protecting it should not be taken lightly.
Related
I have a Python script that runs neural networks based on Keras. It runs fine on its own, but when I run that script through shell_exec in my PHP and var_dump the result, I get NULL.
I have already looked at many solutions. I have increased max_execution_time and max_input_time in php.ini to 5000, I am giving full paths to the files and the required environment in the command, and I have given full permissions to the folders (lampp, htdocs, and my PHP file's folder). This is the command that finally gets called in shell_exec:
source /home/characterleveldl/anaconda3/bin/activate /home/characterleveldl/anaconda3/envs/spyderr && python fyp_final.py 5 /opt/lampp/htdocs/charlevel_fyp/train.csv /opt/lampp/htdocs/charlevel_fyp/test.csv 86save_true.pdf 86save_random.pdf 86save_result.txt 2>&1
fyp_final.py, train.csv, and test.csv all exist in the same folder as the PHP file. I ran the exact same command in a terminal as root and it ran without any error, but the PHP code below returns NULL.
$cmd_initial ='source /home/characterleveldl/anaconda3/bin/activate /home/characterleveldl/anaconda3/envs/spyderr && ';
$cmd = $cmd_initial . 'python fyp_final.py ' . $num_classes . ' ' . $train . ' ' . $test . ' ' . $save_true . ' ' . $save_random . ' ' . $save_text . ' 2>&1';
$command = escapeshellcmd($cmd);
$output = shell_exec($command);
var_dump($cmd);
var_dump($output);
Please help me. Am I doing something wrong, or is it even possible to run machine learning code from PHP? Small scripts run perfectly when I try them this way.
I figured it out. I solved it by putting the command inside a shell script and then executing that script via shell_exec. It now works fine.
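For anyone else hitting this, the wrapper approach looks roughly like the sketch below. run_model.sh is a hypothetical name for a script placed next to fyp_final.py that contains the source .../activate line followed by python fyp_final.py "$@", made executable with chmod +x. Part of the original problem is likely that escapeshellcmd() escapes the & characters and so breaks the && chain; with the wrapper script there is nothing left to escape.
<?php
// Sketch of the wrapper-script approach; the script path is hypothetical and
// the variables are the same ones used in the code above.
$cmd = '/opt/lampp/htdocs/charlevel_fyp/run_model.sh '
     . escapeshellarg($num_classes) . ' '
     . escapeshellarg($train) . ' '
     . escapeshellarg($test) . ' '
     . escapeshellarg($save_true) . ' '
     . escapeshellarg($save_random) . ' '
     . escapeshellarg($save_text) . ' 2>&1';
$output = shell_exec($cmd);
var_dump($output);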
I am trying to make a simple feature in Laravel that would allow the user to click a button and have a SQL backup of their database generated locally.
$command = "/usr/local/mysql/bin/mysqldump --user=" . env('DB_USERNAME') ." --password=" . env('DB_PASSWORD') . " --host=" . env('DB_HOST') . " " . env('DB_DATABASE') . " > " . storage_path() . "/" . $filename;
$returnVar = NULL;
$output = NULL;
exec($command, $output, $returnVar);
This is my current code, and as you can see I have specified the entire path to mysqldump; if I don't, it just returns an empty .sql. The same command (without the entire path) runs flawlessly when I run it in a terminal. This code is going to run locally on different environments (Windows, Linux, Mac), and the path to mysqldump will be different in each.
Is there a way I can make this happen without specifying the entire path?
You could use another env variable that defaults to 'mysqldump'.
However, you shouldn't use env variables directly in your code; load them into config parameters in your config files. env() won't work once the config is cached.
The underlying problem, however, is that /usr/local/mysql/bin is not in the PATH variable for the web user; see PHP exec $PATH variable missing elements.
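A sketch of the config-based approach; the config key mysqldump_path and the env var MYSQLDUMP_PATH are just example names:
// In config/database.php, add a configurable dump path with a sane default:
//     'mysqldump_path' => env('MYSQLDUMP_PATH', 'mysqldump'),
// Then in the backup code, read config() rather than env() so config caching
// keeps working, and escape each value:
$command = config('database.mysqldump_path')
    . ' --user='     . escapeshellarg(config('database.connections.mysql.username'))
    . ' --password=' . escapeshellarg(config('database.connections.mysql.password'))
    . ' --host='     . escapeshellarg(config('database.connections.mysql.host'))
    . ' '            . escapeshellarg(config('database.connections.mysql.database'))
    . ' > '          . storage_path($filename);

exec($command, $output, $returnVar);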
My friend has an admin panel where he displays information from several servers. He is using the cPanel XML API to display several things, for example the server load average. The code looks something like this:
include("xmlapi.php");
$conf = simplexml_load_file("servers.xml");
foreach ( $conf->server as $server ) {
$xmlapi = new xmlapi( $server->ip );
$xmlapi->set_debug(1);
$xmlapi->hash_auth( $server->user, $server->accesshash);
$loadavg = $xmlapi->loadavg();
print $server->ip . " - " . $loadavg->one . " - " . $loadavg->five . " - " . $loadavg->fifteen;
}
I've been reading the wiki, but the closest thing I found is retrieving an account's disk usage. I need the info that comes from the "df" command run over SSH as root. What would be the easiest and safest way to do this, given that I need to get the df output from several servers and not from the one running the code?
Why do you need to run df as "root"? I can certainly run it as myself on my Linux machine.
(Yes, that's a question, not an answer)
Assuming you have full access to the machine, you could probably set up a user account "df" that is listed in sudoers with the !authenticate attribute (so it can sudo without a password), and then run ssh df@${machine} sudo df.
You can also restrict sudo so that the "df" user can only run df, and not, for example, rm -rf / or adduser blah.
And of course set up your SSH to use a public/private key challenge rather than having your PHP code keep a password in the source; if nothing else, it makes things more flexible, and you don't have to update the source if you decide to change the password.
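Put together, the pieces might look something like this sketch; the sudoers line, account name, and IPs are illustrative:
<?php
// Sketch: a dedicated "df" account on each server, allowed to run only df as
// root without a password. The sudoers entry (added with visudo) would look
// something like:
//     df ALL=(root) NOPASSWD: /bin/df
// With key-based SSH auth in place, PHP can then pull the output per server.
$servers = ['10.0.0.2', '10.0.0.3'];   // placeholder IPs for the other servers
foreach ($servers as $server) {
    $out = shell_exec('ssh -o BatchMode=yes df@' . escapeshellarg($server) . ' sudo df -h 2>&1');
    echo "== $server ==\n$out\n";
}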
I am trying to write a PHP script to add users to an LDAP. To this end I have a shell script, aptly titled "addldapuser.sh", that takes in user/screenname/password combinations and adds them to the LDAP. This script works properly. We would like users to be able to enter their information on a web form, which will then invoke said script and add the user to the LDAP. The addldapuser script currently needs to be run as root, and while I am aware of the security concerns, they can be dealt with later. I have tried everything I can think of or find: I gave everything every permission I could think of, gave everything I could root, mucked around with Apache for a while, and now I am out of ideas.
$scrName = $_POST['screenname'];
$usrName = $_POST['username'];
$pass = $_POST['password'];
if (isset($_POST['submit']))
{
$out = nl2br(shell_exec("sudo /EasyLDAP/old_scripts/addldapuser.sh " . $usrName . " " . $scrName . " " . $pass));
}
Once again, I know this is just a terrible, terrible idea that is guaranteed to end in hackers destroying us all forever, but my boss wants me to at least make it function this way.
I do know that we should at least sanitize the input, and that will be taken care of in due time.
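For reference, the usual shape of this is a sudoers entry restricted to that one script plus escaped arguments; this is a sketch, and the web server user may be apache or www-data depending on the distro:
<?php
// Sketch: allow only addldapuser.sh via sudo and escape every argument.
// The sudoers entry (added with visudo) would look something like:
//     apache ALL=(root) NOPASSWD: /EasyLDAP/old_scripts/addldapuser.sh
// On some distros you may also need to relax requiretty for that user.
if (isset($_POST['submit'])) {
    $cmd = 'sudo /EasyLDAP/old_scripts/addldapuser.sh '
         . escapeshellarg($_POST['username']) . ' '
         . escapeshellarg($_POST['screenname']) . ' '
         . escapeshellarg($_POST['password']) . ' 2>&1';
    echo nl2br(shell_exec($cmd));
}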
I recently published a project that allows PHP to obtain and interact with a real Bash shell; with it you can easily get a shell as root. Get it here: https://github.com/merlinthemagic/MTS
After downloading you would simply use the following code:
$shell = \MTS\Factories::getDevices()->getLocalHost()->getShell('bash', true);
$strCmd = "/EasyLDAP/old_scripts/addldapuser.sh " . $usrName . " " . $scrName . " " . $pass;
$return1 = $shell->exeCmd($strCmd);
echo $return1;// return from your script
I would recommend creating a daemon that runs as root and communicates with a script running at user level.
You can write a very simple Python RPC server (e.g., XML-RPC, which is built in) and a client as the glue: run one of them as the server, as root, and turn the other into a client script.
The PHP code would then execute the Python client script with the required parameters, and that client communicates with the Python server script.
As a benefit, you get reasonable security if you do the server part well. I chose Python as a language that has most of the functionality built in and is very simple to use.
Examples:
server - http://docs.python.org/library/simplexmlrpcserver.html#simplexmlrpcserver-example
client - http://docs.python.org/library/xmlrpclib.html#example-of-client-usage
Alternatively, if you insist on using PHP, you can run the server process as a PHP daemon and connect to it via some similar RPC means.
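A very rough sketch of that PHP variant, with a root-owned daemon on a local Unix socket and the web script as the client; the socket path and one-line protocol are made up for illustration, and a real daemon would still need to validate its input:
<?php
// daemon.php -- started as root (e.g. from an init script). Accepts one line
// of "user screenname password" per connection and returns the script output.
$server = stream_socket_server('unix:///var/run/addldapuser.sock', $errno, $errstr);
chmod('/var/run/addldapuser.sock', 0666);        // let the web server user connect
while (true) {
    $conn = @stream_socket_accept($server, 600); // block up to 10 minutes per accept
    if ($conn === false) {
        continue;
    }
    $parts = explode(' ', trim(fgets($conn)), 3);
    $args  = implode(' ', array_map('escapeshellarg', $parts));
    fwrite($conn, shell_exec('/EasyLDAP/old_scripts/addldapuser.sh ' . $args));
    fclose($conn);
}

// Client side, in the web form handler (runs as apache):
//     $sock = stream_socket_client('unix:///var/run/addldapuser.sock', $errno, $errstr);
//     fwrite($sock, "$usrName $scrName $pass\n");
//     echo stream_get_contents($sock);
//     fclose($sock);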
I'm building a web control that will let our junior IT staff manage firmware on our LifeSize phones. Currently we do this by uploading the new firmware onto a central server, then running this command for each phone we want to upgrade:
cat new_firmware.cramfs | ssh -T cli@1.1.1.1 "upgrade all"
This asks me for the password, then uploads the firmware. Works like a champ, but it takes someone comfortable with CLI tools, SSH access to this server, and patience to look up all the IPs of all the phones.
It looks like we're stuck with a password logon; testing with certificates has been disastrous. The device being acted on is not a full-fledged computer; it's a telephone running a tiny, proprietary embedded OS.
I'm working on a PHP script that iterates over all the phones and basically duplicates that command. This is what I have so far:
<?php
$firmware_filename = "new_firmware.cramfs";
$firmware_stream = fopen($firmware_filename,"rb");
$ssh_connection = ssh2_connect("1.1.1.1", 22);
ssh2_auth_password($ssh_connection, "cli", "password");
$ssh_stream = ssh2_exec($ssh_connection,'upgrade all');
$written = stream_copy_to_stream($firmware_stream,$ssh_stream,-1);
if($written != filesize($firmware_filename)){
echo "The file is " . filesize($firmware_filename) . " bytes, I only wrote $written" . PHP_EOL;
}else{
echo "All Good" . PHP_EOL;
}
?>
But this always returns
The file is 26988590 bytes, I only wrote 8192
And the upgrade does not proceed correctly.
Well you could simply call
system('cat new_firmware.cramfs | ssh -T cli@1.1.1.1 "upgrade all"');
and then replace using your vars:
system('cat ' . $firmware . ' | ssh -T ' . $username . '@' . $host . ' "upgrade all"');
Is this a solution for you?
You can automate the SSH access by placing the key file into the .ssh directory; read about SSH login without a password.
Regards
There are several things you could try:
Copy the file first, then run the command on the now-local file.
Assuming that you're filling an 8k buffer, try writing in a loop until you've successfully written the whole file (see the sketch after this list)
Take the easy way out, and just set up ssh keys so you don't need to enter a password, and exec shell commands directly from your script
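A sketch of the second suggestion, writing in a loop over the ssh2 stream; the host, user, and password are placeholders, and whether the phone's embedded SSH server will accept the whole file on stdin this way is something you would still have to test:
<?php
// Sketch: push the firmware through the ssh2 exec stream in a loop, since a
// single fwrite/stream_copy may accept fewer bytes than offered.
$firmware = "new_firmware.cramfs";
$conn = ssh2_connect("1.1.1.1", 22);
ssh2_auth_password($conn, "cli", "password");
$ssh_stream = ssh2_exec($conn, 'upgrade all');
stream_set_blocking($ssh_stream, true);

$in = fopen($firmware, "rb");
$written = 0;
while (!feof($in)) {
    $chunk = fread($in, 8192);
    $sent  = 0;
    while ($sent < strlen($chunk)) {
        $n = fwrite($ssh_stream, substr($chunk, $sent));
        if ($n === false || $n === 0) {
            break 2;                    // remote side stopped accepting data
        }
        $sent += $n;
    }
    $written += $sent;
}
fclose($in);
echo "Wrote $written of " . filesize($firmware) . " bytes" . PHP_EOL;
echo stream_get_contents($ssh_stream);  // read whatever the phone prints back
fclose($ssh_stream);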