I've wracked my brains on this one for days and I've gotten nowhere.
I have a Jenkins continuous integration server that is building to a separate nginx webserver, both on AWS.
The problem
My rsync command, when run by the build server user through a PHP passthru(), is failing silently.
To move files between servers after a build, I use a PHP script that runs rsync over SSH to update the webserver. The snippet is below.
PHP:
$rsync = "rsync -avzlO --no-g -e 'ssh -i ".$identity." -o UserKnownHostsFile=/dev/null \
-o StrictHostKeyChecking=no' ".$workspace." root#".$host.":/var/www/public_html/";
echo($rsync);
echo(passthru($rsync));
Executing this command from the command line, either with a php name-of-file.php call or by simply running the echoed command, performs the rsync successfully. However, when the command is called as a post-build step on the Jenkins server, it fails silently.
To note: The user that I have access to is the ec2-user user. The user that Jenkins runs under is the jenkins user. I have confirmed this by running an echo(exec('whoami')); immediately before the rsync.
What I've tried
It feels like a permissions or ownership error. I have messed with them until my eyes are about ready to bleed.
Currently, they stand as follows:
-rw------- 1 jenkins ec2-user 1679 Feb 12 01:37 jenkins.pem
-rwxrwxr-x 1 jenkins ec2-user 4097 Feb 12 01:51 push_to_cluster_l4.php
I have tried creating a new ssh key pair, and verifying that it allows passwordless login to the webserver.
I have tried changing passthru() to exec() and system().
I've added logging of the passthru() call through an echo, by piping to a file, and by assigning the output to a variable and printing it, all to no avail; it fails silently each time.
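For anyone debugging the same thing, here is a sketch of how the call can be made to report its exit code and stderr instead of failing silently ($rsync is the command string built above):
exec($rsync . ' 2>&1', $outputLines, $exitCode);  // 2>&1 folds stderr into the captured output
echo "rsync exited with code " . $exitCode . "\n";
echo implode("\n", $outputLines) . "\n";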
I know that what I'm trying to achieve is possible, because not only does it work when performed through the shell, but the same code powers deployment to our other clusters. There is just some hangup somewhere preventing the deployment from happening on the new webserver.
Thanks for taking a look.
The issue was an incorrectly configured SSH key on the client side. The key existed, but it was not in the .ssh directory and was not owned by the user Jenkins was using to connect for the rsync. Once I fixed that, I was able to connect without issue.
The takeaway? rsync run this way fails silently when the SSH key file does not allow passwordless login to the destination. Hopefully my time and frustration helps some poor soul in the future.
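If it saves someone a step, here is a quick sanity check of the identity file from PHP before calling rsync (a sketch; it assumes the POSIX extension is available and that $identity holds the key path, as in the snippet above):
$owner = posix_getpwuid(fileowner($identity));              // user that owns the key file
$mode  = substr(sprintf('%o', fileperms($identity)), -4);   // e.g. 0600
echo "key owner: " . $owner['name'] . ", mode: " . $mode . "\n";
if (!is_readable($identity)) {
    echo "key is not readable by " . trim(shell_exec('whoami')) . "\n";
}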
Related
I have set up a GitHub webhook to talk to my webserver API (the server is Apache 2). I securely check for the GitHub secret using the HMAC signature of the payload, as specified on their help page.
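The check looks roughly like this (a sketch; GitHub sends an HMAC-SHA256 of the raw payload in the X-Hub-Signature-256 header, and $secret is the webhook secret configured on their side):
$payload   = file_get_contents('php://input');                 // raw request body
$signature = isset($_SERVER['HTTP_X_HUB_SIGNATURE_256']) ? $_SERVER['HTTP_X_HUB_SIGNATURE_256'] : '';
$expected  = 'sha256=' . hash_hmac('sha256', $payload, $secret);
if (!hash_equals($expected, $signature)) {
    http_response_code(403);                                   // not GitHub, refuse to deploy
    exit;
}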
When a push to master is done on the repo of the web application, a script (deploy.sh) is triggered via PHP's exec(). If I trigger this script manually, as root, everything is perfect. But of course, the user that triggers the script under normal circumstances is www-data.
My question is: what is the best practice for www-data to do a git pull of the new repo? I have mostly ruled out doing exec with sudo, but maybe that is the way. One of the many problems I'm facing with making www-data trigger a git pull is that the ~/.ssh/id_rsa file is only set up for root (it is put in place when building the server image in Docker). It's a read-only SSH key.
This is a legacy application, so what really worries me is that through some PHP exploit someone could trigger the exec without being GitHub, and from there escalate to get read access to the repo, or something worse.
The question is really: what is the best practice for updating a web application using a webhook?
The solution was allowing www-data to sudo only the deploy command:
echo 'www-data ALL=(ALL) NOPASSWD: /var/my-cool-scripts/deploy.sh' | sudo EDITOR='tee -a' visudo
In PHP:
exec('sudo -n /var/my-cool-scripts/deploy.sh')
PS: I actually used this neat trick to know the execution was okay:
$did_the_script_run_okay = exec('sudo -n /var/my-cool-scripts/deploy.sh') == "okay" || false;
The last line of deploy.sh:
echo "okay"
exec() returns the last line output by the command, so I check that to ensure complete execution.
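A slightly more defensive version of the same check, using the exit code as well as the last line (a sketch):
$lastLine = exec('sudo -n /var/my-cool-scripts/deploy.sh', $output, $returnCode);
$did_the_script_run_okay = ($returnCode === 0 && $lastLine === "okay");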
I am trying to rsync a file from a local to a remote server.
When I do this on the console, it works:
rsync -avzhe ssh /var/www/folder1/file5 root@192.168.56.74:/var/www/folder2
but when I do this in my PHP script and run it, it doesn't work:
$rsyncCommand = "rsync -avzhe ssh /var/www/folder1/file5 root@192.168.56.74:/var/www/folder2";
shell_exec($rsyncCommand);
There is no error shown, so I can't really tell what the error is. Is there something wrong with my PHP script?
First, you need to check whether you need to be root (or a sudo user) to run rsync.
If so, the exec() call will only work when the script is run by that same user on the PHP CLI (not in the browser as the Apache user), i.e. the user you are logged into the shell as when you run rsync.
If it takes root or some other user with elevated sudo permissions, then this rsync command may not be available to the apache/www-data user that the script runs as when it is served from the browser.
So try creating a normal user and logging in as it, then try rsync. If that succeeds, it may be interesting to see what the other problems can be; but if you get access/permission denied, then obviously you cannot run this script, at least not from the browser.
Besides this, one more thing: the permission problem may not be directly related to the rsync command itself but to the target folder, /etc/test/ in this example, which is owned by the root user in a normal scenario.
For more details you can check this Stack Overflow link.
I am using OpenCV to initialize the camera on my Arch Linux server. It gets initialized and works well when I actually do it from the command line on the server itself.
I want to initialize it using PHP. I tried doing it with shell_exec() from PHP.
My PHP file looks like:
<?php
$output=shell_exec('LD_LIBRARY_PATH=/usr/local/lib ./a.out 0 2>&1 1>/dev/null');
echo $output;
?>
It gives this output:
ERROR: capture is NULL
I am running this from a web browser on my Windows client machine, and OpenCV and the related files are on the server, which runs Arch Linux.
I want to start the camera and capture images when I run this PHP file from the Windows web browser, but when it executes it throws the error mentioned above.
While this may work when you are SSHed into your server, the webserver user is most likely different from the user you log in as. Common user IDs/groups that webservers run as on Linux machines are http, www-data, nobody, and others.
From this point you have two options.
You can make sure the script you are trying to run from PHP (and all of its children, if any) can be run by the webserver user.
You can modify your /etc/sudoers file to give the webserver user permission to elevate privileges for that script only. (NOTE: this potentially opens up security holes, so be careful.)
To find out what user your webserver runs as, execute this: ps aux
Take a look at the output: the first column lists the user that each process is running as. Here's an excerpt for my webserver (nginx) on one of my boxes:
www-data 26852 0.0 0.0 29768 3840 ? S Jun04 0:50 nginx: worker process
You can see that nginx runs as the user www-data here. You can also pipe the command through grep to help you find the process more quickly. grep will only show you the lines that match what you send to it. Here's an example: ps aux | grep nginx
OK, now that you know what user the webserver is running as, let's try giving that user access to the script. Let's say your script is foo and is located in /usr/local/bin. You would run the following command:
chown www-data /usr/local/bin/foo
After changing ownership of the file, rerun your command from your PHP page and see if it works.
For completeness I also said you could give your webserver user sudo privileges to that file. To do that you would need to append the following line to the bottom of your /etc/sudoers file:
www-data ALL= NOPASSWD: /usr/local/bin/foo
Then your shell_exec command could switch to this:
shell_exec('sudo /usr/local/bin/foo');
Please keep in mind that doing this would allow your webserver user to run the script as root, which is incredibly dangerous in a variety of situations. However, the camera may require elevated permissions to work. I'm not sure what the permission requirements are on the camera setup you are trying to invoke.
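One quick way to rule camera permissions in or out is to check the device node from the same PHP context (a sketch; I'm assuming the camera shows up as /dev/video0, adjust for your setup):
$device = '/dev/video0';                                    // assumed device node
echo "running as: " . trim(shell_exec('whoami')) . "\n";
if (!file_exists($device)) {
    echo $device . " does not exist\n";
} elseif (!is_readable($device)) {
    echo $device . " is not readable by this user\n";       // a likely cause of "capture is NULL"
} else {
    echo $device . " looks readable\n";
}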
Good luck. Hope this helps!
I'm trying to create a PHP web interface for a staging-to-production publishing script. The web interface is secure (intranet, passworded, etc.), so I am happy for Apache to act as the root user to perform the rsync. There is no password for the root user; a key file is used for SSH access.
I have tried sudo-ing the rsync command in the shell script...
sudo rsync --verbose --recursive --times --perms --links --delete /tmp/dir1/* /tmp/dir2/
And allowing apache to run rsync by adding the following to the sudoers file...
apache ALL=(ALL) NOPASSWD:/usr/bin/rsync
I am using PHP shell_exec to invoke the script...
$result = shell_exec('bash /tmp/syncscript.sh 2>&1');
I get the following error...
sorry, you must have a tty to run sudo
How can I set things up so I can run the rsync command as though I were the root user?
Thanks!
You might want to try what I wrote in this post if you want to use sudo from PHP. It solved all my problems, and I have a similar setup to yours: I have an internal dev server but I want it to do things that only root can...
How to use exec() in php in combination with ssh 'hostname' command?
Good luck
In the end I went for a non-sudo approach to this problem. I used phpseclib to get a connection to the box as a user that could do what I needed (not apache), and then made sure that the dirs/files being targeted in the rsync operation were accessible to this user.
Seems simple now I look back on it.
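For anyone wanting to go the same route, the phpseclib side is roughly this (a sketch against phpseclib 3's API; the host, user, key path and command are placeholders):
require 'vendor/autoload.php';                              // phpseclib installed via Composer

use phpseclib3\Net\SSH2;
use phpseclib3\Crypt\PublicKeyLoader;

$ssh = new SSH2('staging.example.com');                     // placeholder host
$key = PublicKeyLoader::load(file_get_contents('/home/deploy/.ssh/id_rsa'));
if (!$ssh->login('deploy', $key)) {                         // a user that owns the target dirs, not apache
    exit('SSH login failed');
}
echo $ssh->exec('rsync -av /tmp/dir1/ /tmp/dir2/');         // run the sync as that user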
I can run an svn command from the command line but not from within a PHP script. Significantly, I can run the PHP script on my Mac and it returns the expected data just fine but when I upload it to my Linux server it won't work (from within PHP... I can run the svn command from the terminal). I'm pretty sure this is a user or permission issue of some sort.
I can run (from command line):
svn log http://whatever.com/svn/foo
but none of the following work (run separately... not all together like this):
exec('svn log http://whatever.com/svn/foo');
exec('svn log http://whatever.com/svn/foo',$out);
exec('/usr/bin/svn log http://whatever.com/svn/foo');
However this works:
exec('ls');
I assume the problem is that when I run from the command line I am running as root whereas when I run from PHP I am running as the apache user (www-data)? Perhaps? Any suggestions on how to be able to run exec('svn log http://whatever.com/svn/foo');?
Changing permissions to 777 (just trying to get it working!) does not help.
Here are a couple of threads that I think might help:
Thread 1 (read the whole thread, as there is more there):
$cmd = '/usr/bin/svn list --config-dir /some/place file:///var/subversion/devfoundry/ 2>&1';
exec($cmd, $output);
$output = implode("\n", $output) . "\n";
echo $output;
Thread 2:
The Subversion error "svn: Can't recode string" can be caused by the locale being wrong. Try
<?php
putenv('LANG=en_US.UTF-8');
?>
(or whatever your preferred locale is)
before you call shell_exec()
Thread 3: PHP Interactive Shell
Maybe you can use an SVN client for PHP. Here is a good one:
http://code.google.com/p/phpsvnclient/
When you run Subversion from the command line, you are running it as yourself. That is, you are the user logged in and running the command.
If you are running PHP from a webpage, it runs as the user who is running the Apache httpd daemon (which could be "apache", "www", "runwww", etc., depending upon the platform). The user running the PHP script may not have read/write permissions to the Subversion repository.
You have two ways of solving this:
Provide your program with user credentials via the --username and --password command line parameters (a sketch of this from PHP follows below).
Set up the user running httpd with Subversion credentials. Once that is done, it'll never have to be done again. This way, your PHP code doesn't contain login credentials.
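A sketch of the first option from PHP (the username and password here are placeholders; --non-interactive keeps svn from trying to prompt when there is no terminal):
$cmd = '/usr/bin/svn log --username builduser --password secret --non-interactive http://whatever.com/svn/foo 2>&1';
exec($cmd, $out, $code);                                    // $code is non-zero if svn failed
echo implode("\n", $out) . "\n";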