I'm trying to execute a command (rsync) with PHP to copy folders and files from a remote server to a local folder.
This is the code I wrote in PHP. The command WORKS over SSH (local terminal and remote with putty.exe), correctly copying the folders and files.
But it doesn't work in PHP. What can I do? Do you know a better (more secure/optimal) way to do this?
exec("echo superuserpassword | sudo -S sshpass -p 'sshremoteserverpassword' rsync -rvogp --chmod=ugo=rwX --chown=ftpuser:ftpuser -e ssh remoteserveruser#remoteserver.com:/path/files/folder /opt/lampp/htdocs/dowloadedfiles/", $output, $exit_code);
EDIT:
I read this guide to create a link between my server and my local machine.
Now I can log in over ssh to my remote machine without a password.
I changed my command:
rsync -crahvP --chmod=ugo=rwX --chown=ftpuser:ftpuser remote.com:/path/to/remote/files /path/to/local/files/
This command also works in the terminal, but when I run it with PHP's exec() it fails again, this time with a different error: 127.
As MarcoS suggested in his answer, I checked the error_log.
The messages are these:
ssh: relocation error: ssh: symbol EVP_des_cbc, version OPENSSL_1.0.0 not defined in file libcrypto.so.1.0.0 with link time reference
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: remote command not found (code 127) at io.c(226) [Receiver=3.1.1]
Well, after a lot of trial and error, I ended up cutting the problem off at the root:
I read this guide (like the last one, but better explained), moved the PHP file that executes the rsync command onto the remote server (where the files are located), and ran the rsync.php file there; it worked perfectly.
To execute on the machine that holds the files (the files to copy and rsync.php):
1.- ssh-keygen generates the keys
ssh-keygen
Enter an empty passphrase and repeat the empty passphrase again.
2.- ssh-copy-id copies the public key to the remote host
ssh-copy-id -i ~/.ssh/id_rsa.pub remoteserveraddressip(xxx.xxx.xxx.xxx)
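Before touching PHP, you can verify that the passwordless login actually works (using the same placeholders as the rsync command below); if it prints ok without prompting for a password, the key setup is done:
ssh remoteuser@remoteserveraddress 'echo ok'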
The rsync.php file:
exec("rsync -crahvP /path/in/local/files/foldertocopy remoteuser#remoteserveraddress:/path/in/remote/destinationfolder/", $output, $exit_code);
After all of that, browse to the rsync.php file and everything should work. At least it worked for me...
I suppose you are experiencing identity problems... :-)
On the CLI, you are running the command as the logged-in user.
In PHP, you are running the command as the user your web server runs as (for example, Apache often runs as www-data, or as the apache user...).
One possible solution I see (if the above is the real cause of the problem) is to add your user to the web-server group...
I'd also suggest you check the web-server error logs, to be sure about the real cause of the problem... :-)
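To confirm which user and group PHP is actually running as, a quick sketch like this helps:
// run from a PHP page served by the web server
echo exec('whoami');   // e.g. www-data or apache
echo ' : ';
echo exec('id -gn');   // the primary group of that user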
Related
I have a C program I wrote called convert3to5, originally written for a CentOS / Fedora 32-bit system in early 2010. I am moving it to a new CentOS 6.x 64-bit host.
From a CentOS Putty console I can run the convert3to5 command just fine; here is a sample of it running from my console:
[root@cloud convert3to5]# ls
CircleStar convert3to5 Convert3To5.txt test.tif
[root@cloud convert3to5]# ./convert3to5 /var/www/webadmin/data/www/mydomain.com/uploads/SV-DIS160217B.tif
TIFFReadDirectory: Warning, /var/www/webadmin/data/www/mydomain.com/uploads/SV-DIS160217B.tif: wrong data type 7 for "RichTIFFIPTC"; tag ignored. Image has an undefined fillorder - using default: MSB2LSB
The above is a normal completion of convert3to5, and I get an SV-DIS160217B.bmp that is placed in /var/www/webadmin/data/www/mydomain.com/uploads/. So running it from the console works fine.
Question: I am attempting to run the exact same command from PHP using exec($command, $output, $return) as follows:
chdir($sv_path.$c3to5_path); //change our working directory to "/convert3to5" directory
$command = "./convert3to5 $targetFile 2>&1";
$result = exec($command, $output, $return);
// the output of the above command is a .bmp file; it will be placed in the same path as the input .tif file
I get the following $result:
ERROR: Unable to convert
/var/www/webadmin/data/www/mydomain.com/uploads/SV-DIS160217B.tif to 5
color BMP file: Open file Error: Tiff_3_to_BMP_5_.lut!
My convert3to5 does need to open Tiff_3_to_BMP_5_.lut
Why does it find Tiff_3_to_BMP_5_.lut when I run convert3to5 from a console prompt but not from PHP exec(...)? In both cases pwd shows that I am in:
[root@cloud convert3to5]# pwd
/var/www/webadmin/data/www/mydomain.com/myView/convert3to5
I have also verified pwd is correct from my PHP script after the
chdir($sv_path.$c3to5_path);
Tiff_3_to_BMP_5_.lut is in CircleStar directory - the path to CircleStar is /var/www/webadmin/data/www/mydomain.com/myView/convert3to5/CircleStar
Summary: ./convert3to5 works from the console, while PHP exec('./convert3to5 ...') does not appear to work.
Can anyone suggest the difference and how to fix and/or debug?
Thanks
You're running the console command from the convert3to5 directory, and I suspect your old C program uses a relative path to the .lut file, possibly relative to the .tif?
What if, in the console example, you did:
cd ../..
./path/to/convert3to5/convert3to5 /var/www/webadmin/data/www/mydomain.com/uploads/SV-DIS160217B.tif
It might be related to $targetFile. Print it and see whether it's the full path.
Finally, run
/full/path/to/convert3to5 fullTargetPath
If that works, then as a workaround you can just do exec("/full/path/to/convert3to5 $fullTargetPath", ...) (double quotes so $fullTargetPath is interpolated) and it should behave like the console.
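Something along these lines, where $fullTargetPath is assumed to hold the absolute path to the .tif (a sketch, not the original code):
$command = '/full/path/to/convert3to5 ' . escapeshellarg($fullTargetPath) . ' 2>&1';
$result  = exec($command, $output, $return);
echo "exit code: $return\n" . implode("\n", $output);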
Per my above comment to wonton:
From the console I was running as root (so fully privileged). I suppose my PHP script runs as the "apache" user on the server?
Here, I believe, was the problem: I looked at the permissions on the CircleStar directory, where the Tiff_3_to_BMP_5_.lut file lives. CircleStar had rw-r--r-- (0644). When running as root from the console, this still allowed my convert3to5 program to find and open the Tiff_3_to_BMP_5_.lut file just fine, but not via PHP exec(...). Once I changed the permissions on CircleStar to rwxr-xr-x (0755), PHP exec(...) ran fine!
So ultimately it was a permission issue.
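In other words, something like this (using the CircleStar path from above) is all it takes; the execute bit on a directory is what lets a non-owner such as the apache user enter it and open files inside:
chmod 755 /var/www/webadmin/data/www/mydomain.com/myView/convert3to5/CircleStar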
I'm trying to create a temporary tunnel via php so I can query a remote database.
The following code works through php-cli and as a shell command, but it doesn't seem to do anything when I run it through Apache:
$connect = "ssh -i remotekey -f -L 3315:localhost:3306 user@<remote IP> sleep 20 >> /tmp/logfile";
$out = shell_exec($connect);
A few notes:
remotekey is owned by wwwrun (the Apache user under openSUSE); its permissions are 600
The logfile in /tmp gets created (and is blank)
safe_mode is off
Using PHP 5.3.17
After opening the site, I check the running processes for the background ssh and get nothing.
If I run it through php-cli, I see the tunnel running.
This has been driving me crazy. Any help would be greatly appreciated.
UPDATE
The issue was that the command was silently failing as the apache user because the remote server was not in that user's known_hosts file.
Running the command with:
-o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no
circumvented this and the tunnel now works.
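For reference, the adjusted command ended up looking roughly like this (a sketch; the key, port, and host are the same placeholders as above):
$connect = "ssh -i remotekey -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -f -L 3315:localhost:3306 user@<remote IP> sleep 20 >> /tmp/logfile";
$out = shell_exec($connect);
Note that disabling strict host key checking trades away host verification; adding the host to the apache user's known_hosts would be the stricter alternative.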
Thanks to the helpful folks who pointed me in the right direction in the comments.
I have a C program that makes a system call (CentOS 6.0) to encrypt a file; my code is:
#include <stdlib.h>

int main() {
    /* symmetric encryption: produces file.txt.gpg alongside file.txt */
    system("gpg -c --batch --passphrase mypass file.txt");
    return 0;
}
The executable object is called encrypt_file.
When I run ./encrypt_file directly from the CLI it runs perfectly and I obtain my file.txt.gpg, but when I try to execute it via the browser I get no response.
Code in PHP:
shell_exec("./encrypt_file");
The reason I chose to write a C program is that I need the passphrase to be in the code but not visible; once I delete the .c file that contains the passphrase, all I have left is the executable and no visible passphrase.
I already changed ownership to the apache user by issuing the following:
chown apache.apache /var/www/html/
And added the following line to /etc/sudoers:
apache ALL=(ALL) NOPASSWD:ALL
NOTE: The only command I have issues with is gpg. I can make a system call with any other command I need; I can even run Python scripts and other C programs that don't contain anything related to gpg.
I hope for a fast reply! I need to use this encrypt_file a lot!
Checking the error_log in /var/log/httpd/error_log I saw this line:
gpg: Fatal: can't create directory `/var/www/.gnupg': Permission denied
Then I found a solution at this site -> http://gnupg.10057.n7.nabble.com/Exi...pt-td7342.html
I added the --homedir option to the gpg command, using the path I found in Apache's error_log, and it works perfectly!
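For reference, the call ends up looking something like this; the --homedir value is an assumption here (any directory the apache user can actually write to, for example one under the web root chowned earlier):
gpg -c --batch --passphrase mypass --homedir /var/www/html/.gnupg file.txt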
Thanks to all!
I have a Samba share from Windows network mounted to a directory on my Linux based Webserver. I have mounted the directory as follows:
mount -t cifs -o username=admin,password='passsword',domain=mydomain.local,file_mode=0644,dir_mode=0777,uid=client_user,gid=client_user '//192.168.0.x/d$' /home/client_user/mnt
The mount works and I can browse through the files and directories in the OS. However, I wish to be able to access this through a PHP script run from the browser, but any file operations on the share result in a permission denied error. I have experimented a little and replaced the uid and gid parameter values with apache, but still no luck.
Any suggestions are much appreciated
Edit
In further tests I have created a file with the following code:
if (is_readable('/path/to/mnt')) {
    echo 'Readable';
} else {
    echo 'Not';
}
Running this from the command line on the server results in Readable being printed. I have run this as root and as a regular user on the server, but it will not work from the browser.
So after some trial and error I worked out that SELinux was not permitting httpd access to the folders.
Running this command allows httpd to access cifs:
setsebool -P httpd_use_cifs on
However, further investigation revealed that I could set the httpd context on just the mounted folder. So I unmounted the drive and amended my mount command to include:
context="system_u:object_r:httpd_sys_rw_content_t:s0",
The full command:
mount -t cifs -o context="system_u:object_r:httpd_sys_rw_content_t:s0",username=admin,password='passsword',domain=mydomain.local,file_mode=0644,dir_mode=0777,uid=client_user,gid=client_user '//192.168.0.x/d$' /home/client_user/mnt
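After remounting, the context can be checked directly on the mount point; it should report httpd_sys_rw_content_t:
ls -Zd /home/client_user/mnt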
I have been going nuts with this...
I have gnupg installed on my CentOS server, and I am trying to encrypt uploaded files (uploaded via a PHP page). On the server via the command line it works perfectly, but via the PHP script it fails with this error:
gpg: /path-to-my-file/my-file: encryption failed: file open error
The user apache (which I think is the one running the exec command) has read/write access in the directory of the file.
The file is uploaded fine (I can see it afterwards, as I removed the deletion of the unencrypted file from my code) and can be deleted correctly via the PHP site.
The command I run is the following:
/path-to-gpg/gpg --homedir=/path-to-my-home-gnupg/.gnupg -e -r therecipient@email the-unencrypted-file
Any idea how I could tackle this?
thanks
A few things to check:
Run system("ls " . escapeshellarg($file)) and check the result — is it file not found? Permission denied? That will help you debug.
Run system("whoami") to make sure PHP is running as who you think it is.
Run echo "<pre>ls " . escapeshellarg($file) . "</pre>" then copy+paste the command and run it from the shell to make sure that the path to the file is what you expected it to be.
Also, I believe CentOS runs SELinux by default… If you've got it installed, check the logs (in /var/log/) to see if SELinux is preventing Apache from executing GPG.
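Pulling those checks together, a minimal debugging sketch might look like this ($file is a hypothetical placeholder for the uploaded file's path):
// who is PHP running as?
echo exec('whoami'), "\n";

// can that user see the file at all?
system('ls -l ' . escapeshellarg($file));

// run gpg itself, capturing output and exit code (2>&1 pulls in stderr)
exec('/path-to-gpg/gpg --homedir=/path-to-my-home-gnupg/.gnupg -e -r therecipient@email ' . escapeshellarg($file) . ' 2>&1', $output, $exitCode);
echo implode("\n", $output), "\nexit code: $exitCode\n";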
Try running the command with the actual apache user's privileges, in verbose mode:
su apache -s /bin/bash -c '/path-to-gpg/gpg -vv ...'
(The whole gpg command is quoted so su passes it through intact; -s /bin/bash is needed if the apache account's login shell is set to nologin.)