I have set up two Ubuntu machines, 192.168.1.104 and 192.168.1.105. I installed SSH on both machines, generated a key pair with ssh-keygen on the 104 machine, and added the key to both IP addresses.
I want to copy a file from 192.168.1.104 to 192.168.1.105 through PHP.
I tried this command:
scp /home/tejas/hadoop/conf/core-site.xml tejas@192.168.1.105:/home/tejas/hadoop/conf/core-site.xml
Run from a shell script, the file gets copied perfectly, but when I run the same command through a PHP script:
<?php
$output = shell_exec('scp /home/tejas/hadoop/conf/core-site.xml tejas@192.168.1.105:/home/tejas/hadoop/conf/core-site.xml');
?>
It does not show any error, but the file does not get copied. I also tried the same with exec(), and tried rsync instead of scp:
rsync -avzh /home/tejas/hadoop/conf/mapred-site.xml tejas@192.168.1.105:/home/tejas/hadoop/conf/mapred-site.xml
Still no luck.
Both commands work perfectly from a shell script but not through PHP.
I checked that PHP is not in safe mode and that shell_exec() and exec() are not disabled in php.ini.
exec() and shell_exec() are executed as the user running the PHP script (usually www-data on Ubuntu, but it could be apache or something else). It's likely that this user does not have permission on the files/folders involved. One solution is to create a new group, add the web-server user (www-data) to it, and then set the proper ownership/permissions on the files/folders to copy from and to.
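A minimal debugging sketch along those lines; the file paths come from the question, the www-data guess assumes a default Ubuntu/Apache setup, and the 2>&1 is there because shell_exec() only returns stdout:

<?php
// Debugging sketch only: show which user actually runs the command, and
// capture stderr so scp's own error message is not silently lost.
echo shell_exec('whoami');   // typically www-data under Apache on Ubuntu
$output = shell_exec('scp /home/tejas/hadoop/conf/core-site.xml '
    . 'tejas@192.168.1.105:/home/tejas/hadoop/conf/core-site.xml 2>&1');
var_dump($output);           // NULL means the command produced no output at all
?>

If the captured output shows a host-key prompt or "Permission denied (publickey)", remember that scp reads the ~/.ssh of the invoking user (here the web-server user), not tejas's, so that user needs its own key and known_hosts entry as well.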
I'm trying to run an SMS-sending Python script from PHP.
SendSMS.py:
#!/usr/bin/env python
import gammu
sm = gammu.StateMachine()
sm.ReadConfig()
sm.Init()
message = {
    'SMSC': {'Location': 1},
    'Text': 'blah blah bllah',
    'Number': 'xxxxxxxxx',
}
sm.SendSMS(message)
When I run it from the terminal with sudo it doesn't work, but it works fine without sudo.
The error:
gammu.ERR_DEVICENOTEXIST: {'Text': u"Error opening device, it doesn't
exist.", 'Code': 4, 'Where': 'Init'}
I want to run the SMS script from a PHP script using shell_exec(). The problems are:
I can't run the SMS script with sudo.
I can't run it through PHP without sudo.
Please tell me how to fix this.
Device: Raspberry Pi 3
OS: Raspbian
Most likely it doesn't find the configuration file; by default it is looked for in the user's home directory, which is different when the script is executed through sudo.
You can specify the path to the configuration file on the command line, but it is better not to execute gammu as root and instead to configure the device so that it is accessible by a regular user.
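As a rough sketch of that advice (the /home/pi paths, the python location, and the use of HOME are assumptions, not a verified fix):

<?php
// Sketch: force HOME so gammu's ReadConfig() finds the .gammurc belonging to
// the "pi" user rather than looking in the web-server user's home directory.
$output = shell_exec('HOME=/home/pi /usr/bin/python /home/pi/SendSMS.py 2>&1');
echo $output;
?>

The web-server user still needs read access to that config and to the modem device itself (on Raspbian that usually means membership in the dialout group), which is what "configure the device to be accessible by user" above amounts to.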
I'm trying to make a hook on Bitbucket that executes a PHP file, and this file executes the pull command:
shell_exec('/usr/local/cpanel/3rdparty/bin/git pull');
The pull command works fine in the SSH console, but PHP returns this error:
Permission denied (publickey). fatal: Could not read from remote
repository.
Please make sure you have the correct access rights and the repository
exists.
Running the binary with --version shows the path to git is right, and whoami returns the same user in both cases, so I don't know if it is a permission issue.
What could be going wrong?
Edit: An additional issue: the alias I added for git doesn't work in PHP, only the full path as above. Via the terminal it works just fine. Maybe it's the same reason why the key doesn't work in PHP.
Edit 2: $PATH is different in the two environments.
When you run this command within a PHP script you are not running the command as yourself:
shell_exec('/usr/local/cpanel/3rdparty/bin/git pull');
The reason it works from the terminal console is that you run the command as yourself there. On a web server, you are not the user running the command. Remember: when you run PHP on a web server, it runs as an Apache module, meaning the web-server user (which could be www-data, root, or even apache on some systems) is running the PHP script, which then runs the shell_exec command.
So it would never work as you have it set up. Perhaps you can kludge something together that would allow a key pair to be used by the web server for these purposes, but that seems like a security risk waiting to happen.
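If you do go the key-pair route despite that risk, one hedged sketch is a dedicated, read-only deploy key that only the web-server user can read, passed to git through the GIT_SSH_COMMAND environment variable; the key path and repository directory below are assumptions:

<?php
// Sketch only: a deploy key used just for pulls. Requires a git version that
// honours GIT_SSH_COMMAND, and paths adjusted to your own server.
$cmd = 'cd /home/site/public_html && '
     . 'GIT_SSH_COMMAND="ssh -i /home/site/.ssh/deploy_key -o IdentitiesOnly=yes" '
     . '/usr/local/cpanel/3rdparty/bin/git pull 2>&1';
echo shell_exec($cmd);
?>

Registering it as a read-only access key on the Bitbucket repository, rather than reusing a personal key, limits the damage if the web server is ever compromised.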
I am trying to build a simple web service in PHP (I use Laravel) to restart and shut down my Raspberry Pi.
In my PHP service I call something like this:
exec("putty -ssh pi#192.168.0.12 -pw myPassword -m D:\workspace\CPS\public\ssh\restart.txt");
In my restart.txt I have the simple command sudo shutdown -r now,
but when I call my web service from the Advanced Rest Client Application (a Chrome add-on), the request keeps processing and never finishes. From the command line this works properly.
I thought the problem might be with running PuTTY directly from PHP, so I made some changes and created a .bat file with the following content:
echo off
putty -ssh pi@%1 -pw %2 -m %3
I start this batch script like this:
exec("D:\workspace\CPS\public\ssh\restart.bat 192.168.0.12 myPassword D:\workspace\CPS\public\ssh\restart.txt");
...but the result is the same. I changed my web service to use the GET method and called it in the browser like a normal page, but nothing changed.
I don't get any error/exception in the log files or in the console. I also checked the system log but found nothing.
Am I doing something wrong or missing something? Is this possible, or is there a better way to achieve it?
Currently I run it on Windows 7 with XAMPP. I will make a second version for Linux in the future and decide in the code which command to run depending on the current environment.
Update
I forgot to write that all required users have permission to execute the files in my (...)/ssh directory.
Try to use:
echo system("D:\workspace\CPS\public\ssh\restart.bat 192.168.0.12 myPassword D:\workspace\CPS\public\ssh\restart.txt");
to get the output. Maybe you will find an error there.
You also need to check the safe_mode options in your php.ini file (like disable_functions and safe_mode_exec_dir).
And try to use the FULL path to your putty.exe file; the web server works in a different directory than your bat file.
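Putting those suggestions together, a hedged sketch; the PuTTY install path is an assumption, and exec() is used so the exit code is captured as well:

<?php
// Sketch: full path to putty.exe plus captured output and exit code.
// "C:\Program Files\PuTTY" is an assumed install location.
$cmd = '"C:\\Program Files\\PuTTY\\putty.exe" -ssh pi@192.168.0.12 -pw myPassword '
     . '-m "D:\\workspace\\CPS\\public\\ssh\\restart.txt" 2>&1';
exec($cmd, $output, $exitCode);
echo "exit code: $exitCode\n" . implode("\n", $output);
?>

If the request still never returns, the process is probably waiting on something interactive (an unconfirmed host key, for example), which is worth ruling out first from a console session run as the web-server user.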
I am using OpenCV to initialize the camera on my Arch Linux server. It gets initialized and works well when I actually do it from the command line on the server itself.
I want to initialize it using PHP, so I tried doing it with shell_exec() from PHP.
My PHP file looks like:
<?php
$output = shell_exec('LD_LIBRARY_PATH=/usr/local/lib ./a.out 0 2>&1 1>/dev/null');
echo $output;
?>
It gives this output:
ERROR: capture is NULL
I am accessing this from a web browser on my Windows machine as the client, while OpenCV and the related files are on the server, my Arch Linux box.
I want to start the camera and capture images when I run this PHP file from the Windows web browser, but when it executes it throws the error above.
While this may work when you are SSHed into your server, the web-server user is most likely different from the user you log in as. Common users/groups that web servers run as on Linux machines are http, www-data, nobody, and others.
From this point you have two options.
You can make sure the script you are trying to run from PHP (and all of its children, if any) can be run by the web-server user.
You can modify your /etc/sudoers file, which lets you give the web-server user permission to elevate privileges for that script only. (NOTE: this potentially opens up security holes, so be careful.)
To find out what user your web server runs as, execute this: ps aux
Take a look at the output; the first column lists the user each process is running as. Here's an excerpt for my web server (nginx) on one of my boxes:
www-data 26852 0.0 0.0 29768 3840 ? S Jun04 0:50 nginx: worker process
You can see that nginx runs as the user www-data here. You can also pipe the command through grep to find the process more quickly, since grep only shows the lines that match what you send to it. Here's an example: ps aux | grep nginx
OK, now that you know which user the web server is running as, let's try giving that user access to the script. Let's say your script is foo and is located in /usr/local/bin. You would run the following command:
chown www-data /usr/local/bin/foo
After changing ownership of the file, try rerunning your command from your PHP page and see if it works.
For completeness, I also said you could give your web-server user sudo privileges for that file. To do that, you would append the following line to the bottom of your /etc/sudoers file:
www-data ALL= NOPASSWD: /usr/local/bin/foo
Then your shell_exec command could switch to this:
shell_exec('sudo /usr/local/bin/foo');
Please keep in mind that doing this allows your web-server user to run the script as root, which is incredibly dangerous in a variety of situations. However, the camera may require elevated permissions to work; I'm not sure what the permission requirements are for the camera setup you are trying to invoke.
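Before touching sudoers, a quick, hedged check of whether permissions are actually the problem; /dev/video0 is an assumed device name for a typical V4L camera:

<?php
// Debugging sketch: show who PHP runs as and whether the camera node is readable.
echo shell_exec('whoami'), "\n";
echo shell_exec('ls -l /dev/video0 2>&1');
// If the device belongs to the "video" group, adding the web-server user to that
// group may avoid the need for sudo entirely.
?>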
Good luck. Hope this helps!
Here's my goal:
I have a Windows XP PC with all the source code in it and a development database.
Let's call it "pc.dev.XP".
I have a destination computer that runs Linux.
Let's call it "pc.demo.Linux".
Here's what I've done on "pc.dev.XP" (just so you get the context):
installed all the Cygwin stuff
created a valid RSA key and put it on the destination backup computer so that ssh doesn't ask for a password
rsync works pretty well this way
If I try to do this on "pc.dev.XP" via the command line:
cd \cygwin\bin
ssh Fred@pc.demo.Linux "cd /var/www && ls -al"
this works perfectly without asking for a password
Now here's what I want to do on the "pc.dev.XP":
launch a PHP script that extracts the dev database into a SQL file
zip this file
transfer it via FTP to "pc.demo.Linux"
log in to "pc.demo.Linux" and execute unzip, then mysql -e "source <the unzipped file>"
If I run this manually on "pc.dev.XP":
putty -load "myconf" -l Fred -pw XXX -m script.file.that.unzip.and.integrates.sql
this works perfectly.
Same for:
cd \cygwin\bin
ssh Fred@dest "cd /var/www && ls -al"
If I try to exec() those scripts in PHP (WAMP installed on "pc.dev.XP"), they hang. I'm pretty sure this is because the user is SYSTEM and not Fred, and putty or ssh asks for a password, but maybe I'm wrong.
Anyway, I'm looking for a way to automate the four tasks I've described, and I'm stuck because exec() hangs. There's no problem with the safe_mode or safe_mode_exec_dir directives; they're disabled on the development machine, so exec() works fine if I try something basic like exec("dir").
Any idea what I could do / check / correct ?
I'm not sure if this is what you need, but I typically use a construct like this to sync databases across machines:
php extractFromDb.php | ssh user@remote.com "mysql remoteDatabaseName"
This executes the PHP script locally and pipes the SQL commands it prints out through SSH straight into the remote mysql process, which executes them against the remote database.
If you need compression, you can either use SSH's -C switch, or integrate the use of your compression program of choice like this:
php extractFromDb.php | gzip -9 | ssh user@remote.com "gunzip | mysql remoteDatabaseName"
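If you do want to trigger the same pipeline from the PHP page itself rather than from a console, a hedged wrapper could look like this; the script path is a placeholder, and the web-server user still needs a working SSH key for user@remote.com:

<?php
// Sketch only: run the extract-and-pipe construct from within PHP.
$cmd = 'php /path/to/extractFromDb.php | gzip -9 | '
     . 'ssh user@remote.com "gunzip | mysql remoteDatabaseName" 2>&1';
echo shell_exec($cmd);
?>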
Do you want to do this from PHP running under Apache, as in you go to http://myWebserver.com/crazyScript.php and all this happens? Or do you just want to write your scripts in PHP and invoke them via the command line?
If you want the first option, try running your Apache/IIS under a different user that has the credentials to perform all those tasks.
"if I run on the development PC manually this works perfectly.".
Why not do it like that? When you run that script, I assume you're connecting to the local SSH server on the dev machine. When you do this, you are using Fred's credentials, so everything works. When you run the PHP script, you are right that it is probably running as SYSTEM.
Try either changing the user that Apache runs as, or use PHP to connect to the local SSH server and thereby use alternate credentials.
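A hedged sketch of that second option, using the PECL ssh2 extension so the connection is made with Fred's credentials regardless of which user PHP runs as; the host, password, and remote command are placeholders based on the question:

<?php
// Sketch only: requires the ssh2 PECL extension to be installed and enabled.
$connection = ssh2_connect('pc.demo.Linux', 22);
ssh2_auth_password($connection, 'Fred', 'XXX');
$stream = ssh2_exec($connection, 'cd /var/www && unzip -o dump.zip && mysql -e "source dump.sql"');
stream_set_blocking($stream, true);
echo stream_get_contents($stream);
?>

Key-based authentication via ssh2_auth_pubkey_file() would avoid hard-coding the password in the script.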
Here's what I did:
A batch file that:
Calls a PHP file via "php.exe my_extract_then_compress_then_ftp.php"
Calls rsync to synchronize the source folder
Calls putty -l user -pw password -m file_with_ssh_commands_to_execute
It works like a charm.