On my home network, I would like to run a python script on server2 remotely from server1. Both servers are on the home network. I need to upload a file to server2 and run the python script there. The python script takes several hours to complete, so I would like to run it inside a screen session on server2. I'm trying to implement this using php and some bash scripting.
My php script on server1 runs a bash script on the same server. The bash script uses [ssh -t user@server screen 'sudo python pyth_script.py'] to attempt to run the python script on server2. Please note that I am using the -t option. The bash script also has an scp command to copy a file from server1 to server2. I have set up keys so that ssh commands from server1 to server2 work without requiring a password.
When I run the bash script from the command line, it functions perfectly: the screen session on server2 is created and the python program runs inside it. I have run the bash script as the normal user, as root, and as www-data (the php script is run through apache, whose user is www-data). Under each of these users, the bash script works as expected when run from the command line.
When I run the bash script via the php script (by clicking an html form button that fires off the php script), the scp command works correctly and the file is transferred, but the python script does not run. The line containing "ssh ... screen ..." returns "Must be connected to a terminal.", even though I'm using the -t option and, as I mentioned, the bash script runs as expected from the command line.
Any ideas what I'm doing wrong?
What is the best way to run a python script remotely using a web interface?
Thanks,
rb3
My guess: ssh won't allocate a tty if it isn't itself running in one, and PHP's exec() almost certainly doesn't provide a tty.
man ssh says "Multiple -t options force tty allocation, even if ssh has no local tty" but it's not clear to me what it means by "no local tty".
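If the missing tty is indeed the problem, two common workarounds (my sketch, not tested against your setup): pass -t twice so ssh forces allocation even without a local tty, or start screen detached with -dm so it doesn't need a terminal at all:
# force tty allocation even when ssh itself has no local tty
ssh -tt user@server "screen sudo python pyth_script.py"
# or: start a detached screen session, which needs no terminal at all
ssh user@server "screen -dm sudo python pyth_script.py"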
idk... I'm kinda thinking maybe you ought to use phpseclib, a pure PHP SSH2 implementation. e.g.
<?php
include('Net/SSH2.php');
$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
exit('Login Failed');
}
echo $ssh->exec("screen 'sudo python pyth_script.py'");
?>
And if you need PTY / TTY check this out:
http://phpseclib.sourceforge.net/ssh/pty.html
That said, keep in mind I'm not a screen expert. Maybe you want to be using nohup and &? Maybe you should tag your question screen too idk.
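For instance, a rough sketch of the nohup route (the log path is my placeholder):
# keep the script alive after ssh disconnects; log to a file instead of a tty
ssh user@server "nohup sudo python pyth_script.py > /tmp/pyth_script.log 2>&1 &"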
I'm controlling my home automation with a Raspberry Pi and webserver. I use this line to turn on my lights (sending an RC signal to wireless power socket).
exec("sudo ./../../../home/pi/wiringPi/examples/lights/action 63 A on");
This works when the script is run by a cron job, but when I try to execute this command manually (using a php form and buttons), it does not work. I tried adding the $output and $return arguments (exec($cmd, $output, $return)) and checking $return, and that confirms the command is not executing. However, something like exec("whoami"); runs fine.
What is it about my command that makes it work only in cron jobs? I had this working once, don't know what happened. Manually sending the command via ssh in the terminal is working normally.
Fix the permissions on this command and run it without sudo (the default php user is www-data on Ubuntu, wwwrun on SUSE, ...).
Also check the path; I recommend absolute paths.
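For example, a minimal sketch, assuming the binary resolves to /home/pi/wiringPi/examples/lights/action and www-data has execute permission on it:
<?php
// absolute path, no sudo; capture output and exit code for debugging
exec("/home/pi/wiringPi/examples/lights/action 63 A on 2>&1", $output, $return);
if ($return !== 0) {
    echo "command failed (" . $return . "): " . implode("\n", $output);
}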
I have a LAMP server with a php script that copies an image from a remote server to the local one. When I run the script via a browser, it creates the directory but does not copy the images into the folder. When the script runs, it echoes the command to the browser. For example:
scp -i /home/jim/.ssh/id_rsa root@192.168.0.120:/images/55036249/056-0039.* "/home/jim/images/_Retrieve/CARL EVANSON/2010-03-20 55036249-056-0039.jpg"
When I run the above from a shell, it works, it copies the image.
This is the script:
$foo = sprintf("mkdir \"/home/jim/images/_Retrieve/%s\"",$name);
system($foo);
$foo = sprintf("scp -i /home/jim/.ssh/id_rsa root#192.168.0.120:/images
/%s/%03d-%04d.* \"/home/jim/images/_Retrieve/%s/%s
%s-%03d-%04d.jpg\"",$jid,$card,$image,$name,$date,$jid,$card,$image);
The script appears to be outputting the correct command, because the same command works when I run it in a shell. When I run the script in a browser, it makes the folder but does not copy the images, and the httpd error_log says "Host key verification failed."
Why doesn't it work when I run it via the browser?
Thanks in advance.
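"Host key verification failed" usually means the user apache runs as has never accepted the remote host's key. A minimal sketch of one fix, reusing the key from the question (the known_hosts path is an assumption):
# one-time: record the remote host's key somewhere the web user can read it
ssh-keyscan 192.168.0.120 >> /home/jim/.ssh/known_hosts
# then point scp at that file explicitly
scp -i /home/jim/.ssh/id_rsa -o UserKnownHostsFile=/home/jim/.ssh/known_hosts root@192.168.0.120:/images/... "/home/jim/images/_Retrieve/..."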
I have a php webpage located on Webserver1, which is from Host1.
I also have a bash script located in Gameserver1 which is from Host2.
Is there any way to send a command from Webserver1 to Gameserver1 to execute the bash file? The webpage and file are on different VPSs. Both are running Debian 7.
The script is literally one line: it executes a java command inside a screen session, so the server can be started if a player notices it's down. The command is already public, so it doesn't need to be hidden securely.
There are two ways I can think of. Either create a bash script on Webserver1 that connects through ssh and executes the bash script you need on Gameserver1, then run it from php with the exec() command (see the sketch below).
Or create a php file on Gameserver1 that uses exec() to run the bash script locally, and call it with file_get_contents() from Webserver1; that is not very secure, since anyone who finds the URL can run your script.
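A minimal sketch of the first approach, assuming key-based ssh is already set up from Webserver1 to Gameserver1 (the user, host, and script path are placeholders):
<?php
// run the one-line start script on Gameserver1 over ssh
exec('ssh gameuser@gameserver1.example.com "bash /home/gameuser/start.sh"', $output, $return);
echo ($return === 0) ? "start command sent" : "ssh failed";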
I would like to run a cronjob for a routine task every X hours. The cronjob executes a shell script, which in turn uses a WGET command to download files from a remote server. However, before running this shell script, I want the cronjob to execute a php script that checks whether an update is available (there's no point wasting bandwidth downloading the same file over and over again) and, if it is, passes the update URL on to the shell script with its WGET command.
The cronjobs are set from the hosts Admin Panel. There is no other way around it. Being a shared hosting service, I am not allowed access to other functions on PHP which might do the task for me either.
Is this possible? I am Linux illiterate. I have installed a few RPMs on Fedora, but that's about it. Please bear with me. Thanks!
Just pass --timestamping to your wget command.
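For example (the URL is a placeholder):
# only download if the remote copy is newer than the local one
wget --timestamping http://example.com/remote/file.zip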
Alternatively, if you are more familiar with PHP's ways, you can check this question for a usable method.
Use a curl HEAD request to get the file's headers and parse out the Last-Modified: header.
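A minimal sketch of that approach (the URL is a placeholder):
<?php
// HEAD request: fetch headers only, no body
$ch = curl_init('http://example.com/remote/file.zip');
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);
$headers = curl_exec($ch);
curl_close($ch);
// parse out the Last-Modified header and compare it to a stored timestamp
if (preg_match('/^Last-Modified:\s*(.+)$/mi', $headers, $m)) {
    $remoteTime = strtotime(trim($m[1]));
}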
To use a php script as a regular command-line executable, use this as a starting point:
#!/usr/bin/env php
<?php
echo "Hello World\n";
Save the file without the .php and tuck it somewhere that your server won't serve it.
Next, set the executable bit so that you can execute the script like a regular program
(u+x in the following command means grant the [u]ser e[x]ecute privileges for helloworld, and chmod is the command that unix variants use to set file permissions)
Omit the $ in the following sequence, as it represents the command prompt
$ chmod u+x helloworld
Now you can execute your command-line script by calling it at the bash prompt:
$ ls
helloworld
$ ./helloworld
Hello World
$
From here you can get the full path of the executable script:
$ readlink -f helloworld
/home/SPI/helloworld
And now you can install the cronjob using the path to your executable script.
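For example, a crontab entry (the every-6-hours schedule is just an assumption for your "every X hours"):
0 */6 * * * /home/SPI/helloworld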
Here's my goal:
I have a Windows XP PC with all the source code in it and a development database.
Let's call it "pc.dev.XP".
I have a destination computer that runs Linux.
Let's call it "pc.demo.Linux".
Here's what I've done on "pc.dev.XP" (just so you get the context):
installed all the cygwin stuff
created a valid rsa key and put it on the dest backup computer so that ssh doesn't ask for a password
rsync works pretty well this way
If I try to do this on "pc.dev.XP" via a command line:
cd \cygwin\bin
ssh Fred@pc.demo.Linux "cd /var/www && ls -al"
this works perfectly, without asking for a password.
Now here's what I want to do on "pc.dev.XP":
launch a php script that extracts the dev database into a sql file
zip this file
transfer it via ftp to "pc.demo.Linux"
log in to "pc.demo.Linux" and execute: unzip, then mysql -e "source <unzipped file>"
If I run this manually on "pc.dev.XP":
putty -load "myconf" -l Fred -pw XXX -m script.file.that.unzip.and.integrates.sql
this works perfectly.
Same for:
cd \cygwin\bin
ssh Fred@dest "cd /var/www && ls -al"
If I try to exec() those scripts from php (wamp is installed on "pc.dev.XP"), they hang. I'm pretty sure this is because the user is "SYSTEM" and not "Fred", and putty or ssh asks for a password, but maybe I'm wrong.
Anyway, I'm looking for a way to automate the 4 tasks I've described, and I'm stuck because exec() hangs. There's no problem with the safe_mode or safe_mode_exec_dir directives; they're disabled on the development machine, so exec() works fine for basic things like exec("dir").
Any idea what I could do / check / correct?
I'm not sure if this is what you need, but I typically use a construct like this to sync databases across machines:
php extractFromDb.php | ssh user@remote.com "mysql remoteDatabaseName"
This executes the PHP script locally and pipes the SQL commands it prints out through SSH, straight into the remote mysql process, which executes them against the remote database.
If you need compression, you can either use SSH's -C switch, or integrate the use of your compression program of choice like this:
php extractFromDb.php | gzip -9 | ssh user@remote.com "gunzip | mysql remoteDatabaseName"
Do you want to do this from PHP running under apache, as in: you go to http://myWebserver.com/crazyScript.php and all of this happens? Or do you just want to write your scripts in PHP and invoke them via the command line?
If you want the first option, try running your apache/IIS under a different user that has the credentials to perform all those tasks.
"if I run on the development PC manually this works perfectly.".
Why not do it like that? When you run that script manually, you're connecting to the SSH server with Fred's credentials, so everything works. When you run the PHP script, you are right that it is probably running as SYSTEM.
Try either changing the user that apache runs as, or use php to make the ssh connection itself, thereby using alternate credentials (a sketch follows).
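For instance, a rough sketch using phpseclib (the pure-PHP SSH2 library mentioned above) so the remote step runs as Fred instead of SYSTEM; the unzip/mysql line is a placeholder for whatever script.file.that.unzip.and.integrates.sql contains:
<?php
include('Net/SSH2.php');
$ssh = new Net_SSH2('pc.demo.Linux');
if (!$ssh->login('Fred', 'XXX')) {
    exit('Login Failed');
}
// placeholder for the unzip + "mysql -e source" commands from the question
echo $ssh->exec('unzip -o /tmp/dump.zip -d /tmp && mysql -e "source /tmp/dump.sql"');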
Here's what I did :
a batch file that :
Calls a php file via "php.exe my_extract_then_compress_then_ftp.php"
Calls rsync to synchronize the source folder
Calls putty -l user -pw password -m file_with_ssh_commands_to_execute
It works like a charm.
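For reference, a rough sketch of such a batch file (the rsync source/target paths are placeholders; the other file names come from the question):
REM 1. dump, zip, and ftp the database
php.exe my_extract_then_compress_then_ftp.php
REM 2. sync the source folder (paths are placeholders)
\cygwin\bin\rsync.exe -av /cygdrive/c/src/ Fred@pc.demo.Linux:/var/www/
REM 3. run the unzip + mysql commands remotely
putty -load "myconf" -l Fred -pw XXX -m script.file.that.unzip.and.integrates.sql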