Running Shell Script from PHP Page

What I'd like to do:
When mydomain.com/index.php is viewed, run a script that logs in to another machine (different from the one serving index.php) on the server's local network and runs a Python script.
What I have so far:
In the index.php I call
shell_exec('/var/www/dir/save.sh '.$ARG1.' '.$ARG2);
The script save.sh:
#!/usr/bin/expect -f
set cmd "python /path/to/python_script.py"
set file [lindex $argv 0]
set content [lindex $argv 1]
spawn ssh usr@local_ip "$cmd $file $content"
expect "assword:"
send "PWD\r"
interact
I ran sudo chmod www-data+x save.sh to allow the PHP page to run the script. When I run sudo -u www-data ./save.sh arg1 arg2 in a terminal, everything works as expected and I can see the output of python_script.py (which I print for testing).
However, when calling index.php and echoing the output of the shell_exec call, I see everything up to user@ip's password: but not the output of python_script.py, and I can confirm that the script was not invoked on the other machine.
Any idea what I'm missing?
I'm glad for any help! :)
UPDATE
Thanks for the quick responses. Sorry for not making myself clear enough.
First of all: This is not used for a public/commercial/elaborate website but rather for private fun and mydomain.com is locked by auth_basic.
My setup: Server1 is hosting mydomain.com. Server2 is running some stuff for private entertainment, e.g. smart home software and Telegram bots.
What I want to do: When calling mydomain.com/index.php, I want to save some parameters (actually just True or False) to Server2, to use them in some of the scripts running there. Server1 and Server2 are on the same local network, but only Server1 is reachable from the outside, hence the idea to log in via SSH and run a Python script.
I appreciate any advice on how to do this in a better way or how to make my intended solution work. :)
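One aside on the shell_exec call in index.php above: $ARG1 and $ARG2 reach the shell unescaped. A minimal sketch of the same call with escaping, purely as an illustration (names as in the question):
<?php
// Sketch only: escape each request-supplied argument before it reaches the
// shell, so spaces or metacharacters cannot break (or inject into) the command.
$cmd = '/var/www/dir/save.sh ' . escapeshellarg($ARG1) . ' ' . escapeshellarg($ARG2);
echo shell_exec($cmd);
?>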
UPDATE 2:
Thanks symcbean for pointing out errors in the script. I have to admit that I copy-pasted it from other Stack Overflow threads.
Thanks also to ADyson for pointing out better ways to handle the issue. I will definitely keep them in mind for bigger projects.
In the meantime I was able to get the script running. It is still very basic, does not track the execution of the Python script, and does not handle errors. Again, I want to point out that I use this for private purposes only, and mydomain.com/index.php will only be up for a limited time (it will be an advent calendar and will be taken offline after Xmas). But just for completeness' sake, here is what I came up with:
#!/usr/bin/expect -f
set cmd "python /path/to/python_script.py"
set file [lindex $argv 0]
set content [lindex $argv 1]
spawn ssh usr@local_ip
expect "assword:"
send "PWD\r"
expect "*\$ "
send "$cmd $file $content\r"
expect "*\$*"
send "exit\r"
interact

Leaving aside the peculiar choices before you got to that point, what exactly were you expecting to happen when the expect script gets to
interact
?
There is nothing in your shell script to start the Python script, nor to track its execution, nor to exit the shell. Meanwhile, your PHP code is blocked until the local script exits.
While there are lots of ways of fixing this, I am hesitant to suggest any without a better understanding of why you chose such an elaborate mechanism to handle the invocation.
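For completeness, here is a minimal sketch of the non-expect route, assuming a public/private key pair has been set up for the web user (host, user, and paths are the question's placeholders):
<?php
// Sketch only: with key-based auth configured for www-data, the remote
// command can be run directly; no password prompt, no expect script.
// BatchMode makes ssh fail fast instead of hanging on a prompt.
$remote = 'python /path/to/python_script.py '
        . escapeshellarg($file) . ' ' . escapeshellarg($content);
echo shell_exec('ssh -o BatchMode=yes usr@local_ip '
        . escapeshellarg($remote) . ' 2>&1');
?>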

Related

why is my "at job" not executing my php script when created through a php webpage?

$output = shell_exec('echo "php '.$realFile.'" | at '.$targTime.' '.$targDate.' 2>&1');
print $output;
Can someone please help me figure out why the above line isn't doing what it's supposed to? The idea is for it to create an at job that will execute a PHP script. If I switch to the user apache (which will ideally control the at function when the PHP file is complete), I can run
echo "php $realFile.php" | at 00:00 05/30/17
and it'll do EXACTLY what I want. The problem is that the above snippet from my PHP file will not create the at job correctly. When I run at -c job# on both of them, the job made from my file is about a third the length, missing the user info and everything. It basically starts at PATH= and goes down; it doesn't include HOSTNAME=, SHELL=, SSH_CLIENT=, SSH_TTY=, USER=. I assume it needs most of this info to run correctly. The end output (below) is always the same, though; it just doesn't have any of the top part for some reason. Let me know if you need more info; I didn't want to paste all of my code here as it contains job-specific information.
${SHELL:-/bin/sh} << 'marcinDELIMITER0e4bb3e8'
php "$realFile".php
marcinDELIMITER0e4bb3e8
It doesn't seem to be a permission issue, because I can su to apache and run the exact command needed. The folder the files are located in is also owned by apache. I've also resorted to giving each file I try to run 777 or 755 permissions through chmod, so I don't think that's the issue.
I figured out a couple of ways around it a while back. The way I'm using right now is an ssh2 connection to my own server as root, creating the job that way. There's no security compromise, since you have to enter the password manually each time, but it's a really bad workaround. The main issue is that apache doesn't have the correct permissions to do everything needed for the at job, so someone figuring that out would be awesome. Another option I found on a random webpage would be to use sudo through the PHP script, which is basically the same minus having to reconnect to your own server. Any other options would be appreciated.
Reading the manual and logs would be a good place to start. In particular:
The value of the SHELL environment variable at the time of at invocation will determine which shell is used to execute the at job commands. If SHELL is unset when at is invoked, the user’s login shell will be used; otherwise, if SHELL is set when at is invoked, it must contain the path of a shell interpreter executable that will be used to run the commands at the specified time.
Other things to check are that the user is included in at.allow, SELinux is disabled, and the webserver is not running in a chroot.
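Following that quote, one possible sketch is to hand at an explicit SHELL at invocation time (untested; variable names as in the question):
<?php
// Sketch only: give at a known shell so the job does not depend on the
// web user's stripped-down environment.
$job    = 'php ' . escapeshellarg($realFile);
$output = shell_exec('echo ' . escapeshellarg($job)
        . ' | SHELL=/bin/bash at ' . $targTime . ' ' . $targDate . ' 2>&1');
print $output;
?>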

Webserver to trigger ssh shell script

I have googled this a lot but none of the results I have found worked for me. So far, I have only tried to do this with php, but cgi, javascript or whatever works is fine with me, as long as it gets the job done.
I would like to access a certain URL on my Debian webserver. Once opened in the browser, this file shall execute the following shell commands. No buttons or links; if possible, I'd like to just open the URL and have the script start.
ssh user@192.168.189.12 <<'ENDSSH'
osascript ~/Desktop/Scripts/script.scpt
ENDSSH
When running this as a regular .sh file it works fine. I have set up SSH keys so that no password is prompted when connecting from A to B. What can I do to trigger this from, for example, the browser on my smartphone?
I am not trying to connect directly from any device to the Mac containing script.scpt. It is essential that the Debian server triggers it and that it is executed by the webserver.
I just started learning about terminal commands, scripts and so on, so I have very basic knowledge of the subject. Please be patient with me.
Thanks in advance for your help :)
For simplicity, I prefer to create a bash script. Let's call it
/var/NONwebroot/sshcoolstuff.sh
#!/bin/bash
ssh user@192.168.189.12 <<'ENDSSH'
osascript ~/Desktop/Scripts/script.scpt
ENDSSH
Make sure it is executable, then call it from this PHP script:
<?php
exec('/var/NONwebroot/sshcoolstuff.sh');
?>
Now I'd recommend putting some protection on that PHP script. Either limit who has access to it by IP address, or a password, or both.
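As a rough illustration of the IP-based protection suggested above (the allowed address is a placeholder):
<?php
// Sketch only: a crude IP allow-list in front of the trigger script.
$allowed = array('203.0.113.7'); // placeholder documentation address
if (!in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
    http_response_code(403);
    exit('Forbidden');
}
exec('/var/NONwebroot/sshcoolstuff.sh');
?>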
Here is a test bash script for you:
#!/bin/sh
cat > test << EOF
Hello World!
This is my test text file.
You
can also
have
a whole lot
more text and
lines
EOF

php freezes when executing an external sh script

I'll try to explain my problem in a time line history:
I've been running several external scripts from PHP and returning the exit code to the client with an AJAX call. A single call should start or stop a service on that machine. That works fine on this development machine:
OS : raspbian Os
Webserver : NginX 1.2.1
Php : 5.4.3.6
However, when I moved the code to a larger, much more powerful machine, everything seemed to work fine but for one thing:
A single call causes php-fpm to freeze and never come back. On closer examination I found that the call created a zombie process I cannot terminate (even with sudo).
OS : Ubuntu
Webserver : NginX 1.6.2
Php : 5.5.9
The only solution seemed to be to stop the php-fpm process and restart it. Then everything works fine again, until I call that script again.
Calling php line
exec("sudo ".$script, $output, $return_var);
(All variables are plain strings with no special characters.)
Start script
#!/bin/sh
service radicale start 2>&1
The service, by the way, did start, but every time the webserver froze and I had to restart PHP manually, which is not acceptable (even for a web server). And it happens only for that single script, only for that service (radicale), and only with that one command (start).
Searching Google brought me to reports of a conflict between the PHP functions exec() and session_start().
Links:
https://bugs.php.net/bug.php?id=44942
https://bugs.php.net/bug.php?id=44994
Their conclusion was that the bug could be worked around with a construct like this:
...
session_write_close();
exec("sudo ".$script, $output, $return_var);
session_start();
...
But in my opinion that is not debugging but a helpless workaround, because you lose the ability to let the user know that his action fully succeeded, and instead let him believe an error has occurred. Much more confusing is the fact that it runs fine on the Raspberry Pi A, but not on a 64-bit machine with a much larger CPU and 8 GB of RAM.
So is there a real solution anywhere, or is this workaround the only way to solve the problem? I've read an article about PHP having some problems with exec()/shell_exec() and recognizing the return value. How can that get lost? Anyone have a guess?
Thanks for reading my long, awful English; I'm not a native speaker and wasn't the most attentive student in my lessons.
It is likely that the new machine simply is not set up the way the Raspberry Pi was.
You need to do a few things in your shell before this will work on your larger machine:
1). Allow php to use sudo.
sudo usermod -G sudo -a your-php-user
Note that to get the username for your-php-user, you can just run a script that says:
<?php echo get_current_user(); ?>
or alternatively:
<?php echo exec('whoami'); ?>
2). Allow that user to use sudo without a password
sudo visudo - this command will open /etc/sudoers with a failsafe to keep you from botching anything.
Add this line to the very end:
your-php-user ALL=(ALL) NOPASSWD: /path/to/your/script,/path/to/other/script
You can put as many scripts there, separated by commas, as you need.
Now, your script should work just fine.
AGAIN, please note that you need to change your-php-user to whatever your php user is.
Hope this helps!
This is not a real solution, but it's a better solution than none.
Calling a bash script with
<?php
...
exec("sudo ".$script, $output, $return_var);
...
?>
ends, in this special case only, in a zombie thread. As php-fpm waits for a result, it holds the line, never giving up or timing out while the thread lives on. So every other request to the PHP server stays in the queue and will never be processed. That may be okay for some long-running jobs, but my request was done in a few milliseconds.
I did not find the cause of this. As far as I could tell from debugging, it wasn't the fault of the triggered Radicale process, which returned a clean 0 every time. It seemed that the PHP process just couldn't get a return line from it, so it waits and waits.
With no time left, I changed the malfunctioning script from
#!/bin/sh
service radicale start 2>&1
to
#!/bin/sh
service radicale start > /dev/null 2>&1 &
... so sending every returned line to nirvana and detaching all subprocesses. For now the server no longer hangs and works as desired. But the feeling that this may be a major bug in PHP stays in the back of my head, with the hope that someday someone may defeat it.
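The same discard-and-detach idea can also be applied on the PHP side rather than inside the script; a minimal sketch, assuming $output and $return_var are not needed:
<?php
// Sketch only: redirect all output and background the command so exec()
// returns immediately instead of waiting on an open pipe.
exec("sudo " . $script . " > /dev/null 2>&1 &");
?>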

php exec(svn commit) hangs

I know there's been similar questions but they don't solve my problem...
After checking out the folders from the repo (which works fine), a method is called from jQuery to execute the following in PHP:
exec ('svn cleanup '.$checkout_dir);
session_write_close(); //Some suggestion that was supposed to help but doesn't
exec ('svn commit -m "SAVE DITAMAP" '.$file);
These would output the following:
svn cleanup USER_WORKSPACE/0A8288
svn commit -m "SAVE DITAMAP" USER_WORKSPACE/0A8288/map.ditamap
1) The first line (exec('svn cleanup')...) executes fine.
2) As soon as I call svn commit, my server hangs and everything goes to hell.
The apache error logs show this error:
[notice] Child 3424: Waiting 240 more seconds for 4 worker threads to finish.
I'm not using the php_svn module because I couldn't get it to compile on windows.
Does anyone know what is going on here? I can execute the exact same command from the terminal window and it executes just fine.
Since I cannot find any documentation on a jQuery exec(), I assume this is calling PHP. I copied this from the documentation page:
When calling exec() from within an apache PHP script, make sure to take care of stdout, stderr and stdin (as in the example below). If you forget this and your shell command produces output, the sh and apache daemons may never return (they will normally time out after a few minutes). From the calling web page the script may seem to not return any data.
If you want to start a PHP process that continues to run independently from apache (with a different parent pid), use nohup. Example:
exec('nohup php process.php > process.out 2> process.err < /dev/null &');
hope it helps
Okay, I've found the problem.
It actually didn't have anything to do with exec running in the background, especially because a one-file commit doesn't take a lot of time.
The problem was that the commit was expecting a --username and --password that didn't show up, and just caused apache to hang.
To solve this, I changed the svnserve.conf in the folder where I installed svn to allow unauthenticated users write access.
I don't think you'd normally want to do this, but my site already authenticates the user name and pass upon logging in.
Alternatively you could pass the credentials directly on the command line.
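A rough sketch of that alternative (credentials are placeholders, and storing a password in a script carries its own risks):
<?php
// Sketch only: supply credentials and forbid prompts so svn cannot block
// waiting for input under apache.
exec('svn commit --non-interactive --username myuser --password mypass'
   . ' -m "SAVE DITAMAP" ' . escapeshellarg($file), $output, $return_var);
?>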

PHP from commandline starts gui programs but apache doesn't

First, I read some threads by people with similar problems, but none of the answers went beyond export DISPLAY=:0.0 and xauth cookies. So here is my problem, and thanks in advance for your time!
I have developed a little library which renders shelves using OpenGL and GLSL.
Over the last few days I wrapped it in a PHP extension, which was surprisingly easy, and it works now.
But the problem is that it works only when I execute the PHP script using the extension from the command line:
$ php r100.php (I successfully run this as the http user). The script is in apache's webroot, and if I request it from the browser I get ** CRITICAL **: Unable to open display in apache's error_log.
So, to make things easier to test and to be sure that the problem is not in the library/extension, at the moment I just want to start xmms with the following PHP script.
<?php
echo shell_exec("xmms");
?>
It, too, works only from the shell.
I've played with the apache configuration so much now that I really don't know what to try next.
I tried xhost + && export DISPLAY=:0.0
In httpd.conf I have these:
SetEnv DISPLAY :0.0
SetEnv XAUTHORITY /home/OpenGL/.Xauthority
So my problem seems to be this:
How can I make apache execute the PHP script with all the privileges the http user has, including the environment?
Additional information:
The http user is in the video and users groups and has a login shell (bash).
I can log in as http and execute scripts with no problem, and I can run GUI programs, which show up on display 0.
It seems that apache does not provide the appropriate environment for the script.
I read about some differences between CLI and CGI, but I can't run xmms with php-cgi either...
Any ideas for additional configuration?
Regards
Sounds a bit hazardous, but basically you can even add export DISPLAY=:0.0 to the apache start-up script (in Linux, /etc/init.d/httpd or /etc/init.d/apache, depending on the distro).
And xhost + needs to be run by the account connected to the local X server as a user, though I'm left wondering how this will work, as the PHP script should only live while the apache HTTP request is ongoing.
Edit:
Is this some kind of application launcher? You can spawn it with exec("nohup /usr/bin/php script.php &"); now apache should be released and PHP should continue working in the background.
In your console, allow everyone to use the X server:
xhost +
In your PHP script, set the DISPLAY variable while executing the commands:
DISPLAY=:0 glxgears 2>&1
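Putting the two pieces together in PHP, as a rough sketch (display number and program as in the answers above):
<?php
// Sketch only: set DISPLAY for the child process and detach it so the
// request does not block while the GUI program runs.
exec('DISPLAY=:0 xmms > /dev/null 2>&1 &');
?>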
