I have a LAMP setup with a PHP script that copies an image from a remote server to the local server. When I run the script via the browser, it creates the directory but does not copy the images into it. As it runs, the script echoes the command to the browser, for example:
scp -i /home/jim/.ssh/id_rsa root@192.168.0.120:/images/55036249
/056-0039.* "/home/jim/images/_Retrieve/CARL EVANSON/2010-03-20
55036249-056-0039.jpg"
When I run the above command from a shell, it works: the image is copied.
This is the script:
$foo = sprintf("mkdir \"/home/jim/images/_Retrieve/%s\"", $name);
system($foo);
$foo = sprintf("scp -i /home/jim/.ssh/id_rsa root@192.168.0.120:/images/%s/%03d-%04d.* \"/home/jim/images/_Retrieve/%s/%s %s-%03d-%04d.jpg\"",
    $jid, $card, $image, $name, $date, $jid, $card, $image);
system($foo); // the scp command is run the same way
The script appears to build the correct command, because that same command works when I paste it into a shell. Run from the browser, it makes the folder but does not copy the images, and httpd/error_log shows "Host key verification failed."
Why doesn't it work when I run it via the browser?
Thanks in advance.
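For what it's worth, "Host key verification failed" usually means the user Apache runs as has never accepted 192.168.0.120's host key: known_hosts is per-user, and the shell test ran as jim. A hedged sketch of seeding it, where the apache user name and its home directory are assumptions that vary by distro:

```shell
# Assumption: Apache runs as user 'apache'; 'echo ~apache' shows its home.
# known_hosts is consulted per-user, so jim's accepted key doesn't help.
WEB_HOME=$(eval echo "~apache")
sudo -u apache mkdir -p "$WEB_HOME/.ssh"
ssh-keyscan 192.168.0.120 | sudo -u apache tee -a "$WEB_HOME/.ssh/known_hosts"
```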
Related
I'm trying to generate Android APK files from a shell script, which I want to execute from PHP. When I run the shell script in a terminal, it works perfectly. When I run it through PHP, the script doesn't execute all of its commands: the ls command works, but the other commands don't. I'm using a XAMPP server in a Linux environment.
My shell script
cd /home/user/AndroidStudioProjects/msvep4247-inghamautogroup-pulse-and/
./gradlew assembleDebug
cp -fr app/build/outputs/apk/app-debug.apk /opt/lampp/htdocs/sample/apk
ls
Shell script ls output
app autolead_data_format.pdf build build.gradle cheek gradle
gradle.properties gradlew gradlew.bat lib local.properties
msvep4247-inghamautogroup-pulse-and.iml settings.gradle
My PHP script
<?php
echo shell_exec('ls');
echo shell_exec('./generateApk.sh');
?>
PHP script ls output
generateApk.sh generate.php APK
Note: ls outputs file names in the folder
I set all the file permissions for the shell script on the XAMPP server. Can anyone tell me where I'm going wrong? Awaiting responses...
Just use the full path to the script/executable, because the environment is different when running from PHP.
The PATH environment variable seen by PHP code executed in the web server is more limited than the one in the shell you're working in. But you can change environment variables from PHP, and the commands you start from it will see those changes.
<?php
// set content type so the output is more readable in the browser
header('Content-Type: text/plain');
// set $PATH to some limited value
putenv('PATH=/bin:/sbin');
// verify, note that we have to use full path to 'env'
print(shell_exec("/usr/bin/env|grep '^PATH='"));
// this command won't run (assuming its full path is /usr/bin/id)
print(shell_exec("id"));
// add more directories to $PATH
putenv('PATH=/bin:/sbin:/usr/bin:/usr/sbin');
// verify again, we can use env without specifying the path this time
print(shell_exec("env|grep '^PATH='"));
// this command will run now
print(shell_exec("id"));
So write putenv('PATH=<your_shell_PATH_contents>'); at the top of your PHP script. Using the full path to the shell script alone won't help if the script itself uses relative paths to the binaries it starts.
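The effect is easy to reproduce outside PHP; a small sketch (plain /bin/sh, assuming only that /bin/ls exists) showing that a bare command name fails under a stripped PATH while its absolute path still works:

```shell
# With a deliberately broken PATH, 'ls' cannot be resolved (exit 127):
restricted=$(PATH=/nonexistent /bin/sh -c 'ls 2>/dev/null; echo "rc=$?"')
echo "$restricted"
# The absolute path works regardless of PATH:
absolute=$(PATH=/nonexistent /bin/sh -c '/bin/ls /bin/sh >/dev/null; echo "rc=$?"')
echo "$absolute"
```

This is the same reason `./gradlew` and `cp` can fail under PHP while `ls` happens to work: whatever PATH does resolve still runs, everything else silently returns 127.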
I have set up two Ubuntu machines, 192.168.1.104 and 192.168.1.105, installed ssh on both, generated a key pair with ssh-keygen on the 104 machine, and added the key to both machines.
I want to copy a file from 192.168.1.104 to 192.168.1.105 through PHP.
I tried this command: scp /home/tejas/hadoop/conf/core-site.xml tejas@192.168.1.105:/home/tejas/hadoop/conf/core-site.xml
From a shell script the file gets copied perfectly, but when I run the same command through a PHP script:
<?php
$output = shell_exec('scp /home/tejas/hadoop/conf/core-site.xml tejas@192.168.1.105:/home/tejas/hadoop/conf/core-site.xml');
?>
it does not show any error, but the file does not get copied. I also tried exec(), and rsync instead of scp (rsync -avzh /home/tejas/hadoop/conf/mapred-site.xml tejas@192.168.1.105:/home/tejas/hadoop/conf/mapred-site.xml), still with no luck.
Both commands work perfectly from a shell script but not through PHP.
I checked that PHP is not in safe mode and that shell_exec() and exec() are not disabled in php.ini.
exec() and shell_exec() are executed by the user running the PHP script (usually www-data on Ubuntu, but it could be apache or something else). It's likely that this user does not have permission on the files/folders. One solution is to create a new group, add the web user (www-data) to it, and then set the proper ownership/permissions on the files/folders to copy and be copied to.
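Whatever the cause turns out to be, it is easier to confirm once the error text is visible: shell_exec() returns only stdout, while scp and rsync report failures on stderr, which is why the PHP side looks silent. Appending 2>&1 to the command you pass to shell_exec() merges the streams; a minimal sketch of the effect:

```shell
# A command that is quiet on stdout but complains on stderr:
silent=$(/bin/sh -c 'echo "permission denied" >&2' 2>/dev/null)
merged=$(/bin/sh -c 'echo "permission denied" >&2' 2>&1)
echo "silent=[$silent]"    # empty -- this is all shell_exec() would return
echo "merged=[$merged]"    # with 2>&1 the error text becomes visible
```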
I'm trying to run AppleScript in PHP. The following is my code:
<?php
echo "Hello World!";
shell_exec('osascript -e \'tell app "Finder" to make new Finder window\'');
shell_exec('osascript -e \'display dialog "hello" \'');
shell_exec('osascript ~/openExcel.scpt');
echo "DONE";
?>
When I run each command individually in a terminal, I get the expected result. However, when I open this .php file as a web page on the server, I only get "Hello World!DONE". The middle part (the three shell_exec calls) produces no result at all. I don't understand why.
As far as I know, you can't execute AppleScript from a web server for security reasons. The problem is that Apple events can only be sent through the window server by the user currently logged in at the console (or by root). By default Apache runs as _www, and the _www user can't run osascript on the command line.
The workaround is to run the shell command as the console user or as root.
Create an AppleScript application, save it with the .app extension, then link to it from the PHP page. I use this on some localhost projects and it works.
On my home network, I would like to run a python script on server2 remotely from server1. Both servers are on the home network. I need to upload a file to server2 and run the python script. The python script takes several hours to complete and so I would like to run it inside a screen on server2. I'm trying to implement this using php and some bash scripting.
My PHP script on server1 runs a bash script on the same server. The bash script uses ssh -t user@server2 screen 'sudo python pyth_script.py' to attempt to run the Python script on server2. Please note that I am using the -t option. The bash script also has an scp command to copy a file from server1 to server2. I have set up keys so that ssh commands from server1 to server2 don't require a password.
When I run the bash script from the command line, it functions perfectly. The screen on server2 is activated and the python program runs inside it. I have run the bash script as the normal user, as root and as www-data (the php script is run through apache and is user www-data). Under any of the users, the bash script works as expected when run from the command line.
When I run the bash script via the PHP script (click on an HTML form that fires off the PHP script), the scp command works correctly and the file is transferred; however, the Python script does not run. The "ssh ... screen ..." line returns "Must be connected to a terminal.", even though I'm using the -t option and, as I mentioned, the bash script runs as expected from the command line.
Any ideas what I'm doing wrong?
What is the best way to run a python script remotely using a web interface?
Thanks,
rb3
My guess: ssh doesn't allocate a tty if it, itself, isn't being run in a tty, which exec() probably isn't doing.
man ssh says "Multiple -t options force tty allocation, even if ssh has no local tty", but it's not clear to me what it means by "no local tty".
idk... I'm kinda thinking maybe you ought to use phpseclib, a pure-PHP SSH2 implementation. e.g.:
<?php
include('Net/SSH2.php');
$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
exit('Login Failed');
}
echo $ssh->exec("screen 'sudo python pyth_script.py'");
?>
And if you need PTY / TTY check this out:
http://phpseclib.sourceforge.net/ssh/pty.html
That said, keep in mind I'm not a screen expert. Maybe you want to be using nohup and &? Maybe you should tag your question with screen too, idk.
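If screen keeps fighting the missing tty, the nohup route looks roughly like this. The remote form (first comment) reuses the question's script name with an assumed user/host, and the runnable part substitutes a short local command just to show the mechanics:

```shell
# Remote form (assumption -- adapt user/host/paths to your setup):
#   ssh user@server2 'nohup sudo python pyth_script.py > /tmp/job.log 2>&1 &'
# Local demonstration: the job runs detached from any tty, output in a log.
log=$(mktemp)
nohup /bin/sh -c 'echo started; sleep 1; echo finished' > "$log" 2>&1 &
wait $!      # only so the demo can read the log; a real job would be left running
content=$(cat "$log")
echo "$content"
rm -f "$log"
```

Unlike screen, nohup needs no terminal at all, which sidesteps the "Must be connected to a terminal." error entirely.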
Here's my goal :
I have a Windows XP PC with all the source code in it and a development database.
Let's call it "pc.dev.XP".
I have a destination computer that runs Linux.
Let's call it "pc.demo.Linux".
Here's what I've done on "pc.dev.XP" (just so you get the context):
- installed all the Cygwin stuff
- created a valid RSA key and put it on the destination (backup) computer so that ssh doesn't ask for a password
- rsync works pretty well this way
If I try to do this on "pc.dev.XP" from a command line:
cd \cygwin\bin
ssh Fred@pc.demo.Linux "cd /var/www && ls -al"
this works perfectly without asking a password
Now here's what I want to do on "pc.dev.XP":
1. launch a PHP script that extracts the dev database into a SQL file
2. zip this file
3. transfer it via FTP to "pc.demo.Linux"
4. log in to "pc.demo.Linux" and execute unzip, then mysql -e "source <unzipped file>"
If I run manually on "pc.dev.XP":
putty -load "myconf" -l Fred -pw XXX -m script.file.that.unzip.and.integrates.sql
this works perfectly.
Same for :
cd \cygwin\bin
ssh Fred@dest "cd /var/www && ls -al"
If I try to exec() those scripts from PHP (WAMP installed on "pc.dev.XP"), they hang. I'm pretty sure this is because the user is SYSTEM and not Fred, so putty or ssh asks for a password, but maybe I'm wrong.
Anyway, I'm looking for a way to automate the four tasks described above, and I'm stuck because exec() hangs. There's no problem with the safe_mode or safe_mode_exec_dir directives; they're disabled on the development machine, and exec() works fine for basic things like exec("dir").
Any idea what I could do / check / correct ?
I'm not sure if this is what you need, but I typically use a construct like this to sync databases across machines:
php extractFromDb.php | ssh user#remote.com "mysql remoteDatabaseName"
This executes the PHP script locally and pipes the SQL statements it prints out through SSH straight into the remote mysql process, which executes them against the remote database.
If you need compression, you can either use SSH's -C switch, or integrate the use of your compression program of choice like this:
php extractFromDb.php | gzip -9 | ssh user#remote.com "gunzip | mysql remoteDatabaseName"
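The plumbing can be tried locally by swapping the PHP extractor and the remote mysql for stand-ins (printf and cat here); the compressed round trip is exactly what the real pipeline does over SSH:

```shell
# Stand-in for: php extractFromDb.php | gzip -9 | ssh ... "gunzip | mysql db"
sql='CREATE TABLE t (id INT);'
out=$(printf '%s\n' "$sql" | gzip -9 | gunzip | cat)
echo "$out"
```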
Do you want to do this from PHP running under Apache, as in: you go to http://myWebserver.com/crazyScript.php and all of this happens? Or do you just want to write your scripts in PHP and invoke them from the command line?
If you want the first, try running your Apache/IIS under a different user that has the credentials to perform all those tasks.
"If I run on the development PC manually, this works perfectly."
Why not do it like that? When you run that script manually, I assume you're connecting to the local SSH server on the dev machine using Fred's credentials, so everything works. When you run the PHP script, you are right that it is probably running as SYSTEM.
Try either changing the user that Apache runs as, or use PHP to connect to the local SSH server, thereby using alternate credentials.
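A related hedged tip: the hang is consistent with ssh or putty prompting for a password that nothing under SYSTEM can answer. Passing -o BatchMode=yes to ssh turns any prompt into an immediate failure, and closing stdin makes prompts hit EOF instead of blocking; a sketch of the EOF effect:

```shell
# Real use (assumption -- adapt to your hosts):
#   ssh -o BatchMode=yes Fred@pc.demo.Linux "cd /var/www && ls -al"
# A prompting command blocks on stdin; with stdin closed it fails fast:
out=$(/bin/sh -c 'read -r line; echo "read rc=$?"' < /dev/null)
echo "$out"
```

Either way, exec() then returns an error code you can inspect instead of hanging forever.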
Here's what I did: a batch file that
- calls a PHP file via "php.exe my_extract_then_compress_then_ftp.php"
- calls rsync to synchronize the source folder
- calls putty -l user -pw password -m file_with_ssh_commands_to_execute
It works like a charm.