I am having an issue with running my PHP application from the command line.
I recently created a new Google Cloud Engine CentOS instance to host my PHP application.
This application has been running fine on a different RHEL box.
The application is kicked off from a PHP script using a command similar to...
$command = 'bash -c "exec nohup setsid runPHPScript > /dev/null 2>&1 &"';
exec($command, $output, $returnVar);
runPHPScript is a Linux shell script that essentially runs the actual PHP command...
php myScript.php
myScript.php then goes off and connects to various webservices etc...
When I try to run this on my new instance (bearing in mind this all worked fine on my RHEL box) I get the following SOAP error...
(faultcode: HTTP, faultstring: Could not connect to host)
The SOAP setup/connection to the endPoint WSDL is actually successful but as soon as I try to send the request I get the error above.
I've been debugging this and reading up a bit and can confirm the following...
HTTP & HTTPS are both enabled on the GCE instance.
I have verified that both PHP & Apache are using the same php.ini (more on this below)
I have checked both configurations using phpinfo() and php -i and can see the various SSL entries in the data
The strange part is that if I open myScript.php in a browser (it's a LAMP stack) it connects fine to the web service and I can see the valid response. This led me to think the problem was different php.ini files being used.
Also, at the command prompt, if I just run myScript.php directly it also works fine...
php 'myScript.php'
returns a valid response too.
So the problem only seems to occur when I try to kick off the application via the bash -c wrapper above.
Anyone got any ideas?
Robert
Cleaning up.
As per Raidenace's suggestion, just running exec() directly worked fine too:
"Just curious... why can't you just do an exec("php myScript.php"); in your code directly?"
Related
I have a Windows Server 2016 VPS with Plesk and PHP 7.1x.
I am trying to execute a simple AutoHotKey script from PHP using the following command:
<?php shell_exec('start /B "C:\Program Files\AutoHotkey\AutoHotkey.exe" C:\inetpub\vhosts\mydomain.com\App_Data\myahkscript.ahk'); ?>
This is the only line on the page. I have tried different ahk scripts, the current one simply creates a MsgBox.
When I execute my PHP page, in VPS Task Manager I see three processes created under the expected user: cmd.exe, conhost.exe and php-cgi.exe. However, the PHP page just sits waiting and nothing actually happens on the server.
I have also tried the same line with shell_exec replaced by exec; this seems to make no difference. I have also tried both commands without start /B; in that case the PHP page completes, but no new processes are started.
I cannot find any errors in any logs: Mod_Security, Plesk Firewall, IIS.
Any ideas?
EDIT:
I tried my command from the VPS command prompt and was immediately slapped in the face by the obvious issue of the space in 'Program Files'. I quoted the string as shown above and the command works. This eliminated the hang when running from PHP; however, the command still does nothing when executed from the web page.
EDIT:
Based on suggestions from the referenced post 'debugging exec()':
var_dump: string(0) ""
$output: Array()
$return_val: 1
One point was that I would probably not be able to invoke GUI applications. That puts a damper on the idea.
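For reference, one cmd quirk worth ruling out: start treats the first quoted argument as the window title, so a sketch of the same call with an explicit empty title and the output captured (same paths as above) would be:
<?php
// Sketch only: an empty "" title is passed before the quoted executable path,
// since start takes the first quoted string as the window title; 2>&1 captures errors.
exec('start /B "" "C:\Program Files\AutoHotkey\AutoHotkey.exe" "C:\inetpub\vhosts\mydomain.com\App_Data\myahkscript.ahk" 2>&1', $output, $return_val);
var_dump($output, $return_val);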
I am unable to execute a source command on Linux using PHP. All other commands work except this one. I need to execute the following command.
source /root/Envs/ate/bin/activate
This activates the ate (Automatic Test Equipment) environment. Once I activate it, I need to run a Python script, as that script accesses a remote server.
I am able to run this manually, but I am building a tool that will do it automatically.
<?php
exec("source /root/Envs/ate/bin/activate", $output, $return);
echo "Command returned $return, and output:\n";
echo exec("python box_upgrade-pradeepa.py");
?>
The above command returns 1, which means there is an error, but I am not sure how to run the source command. The Python script will run only if the source command succeeds. (The python call itself is correct; when I swapped in hello.py it ran fine.)
Could you please help? I have been stuck on this for a week.
Thanks a lot.
I found the error. Since I am doing this through PHP (for a web tool), the user is apache, and the apache user is unable to access the script in the root folder. After moving it to another directory, I am able to run the script fine.
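For anyone hitting the same thing: since source is a bash builtin and each exec() call gets its own shell, a sketch that keeps the activation and the Python call in one bash invocation (same paths as in the question) looks like this:
<?php
// Sketch: run the activation and the script in a single bash -c so the
// environment set up by `source` is still in effect for the Python call.
$cmd = 'bash -c "source /root/Envs/ate/bin/activate && python box_upgrade-pradeepa.py" 2>&1';
exec($cmd, $output, $return);
echo "Command returned $return, and output:\n";
echo implode("\n", $output), "\n";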
Thanks all..
I have been trying, unsuccessfully so far, to write a PHP script that will run when a page is opened and launch Metasploit!
I've tried shell_exec and exec and all the other alternatives, and although I can get them to do simple things (i.e. ls, cd, etc.), if I try msfconsole nothing happens!
I have also tried a different script that launches Firefox, and again nothing happens!
Now I know that PHP runs on the server and I'm not expecting to see a console or Firefox opening on the client's machine! Instead, to check whether it works, I am trying to echo out the output of the shell_exec. But anyway, since I'm hosting the files on my own machine (i.e. this is the server and a VM is the client), if it could actually launch Firefox I should be able to see the app opening here, the same way as when I run it from the command line!
What am I missing?
Is there any other way to do this? (i.e. launch Metasploit every time a user opens up my page)
NOTE: I've tried specifying the full path to msfconsole, but that didn't work either!
Here's what I have so far:
$output = shell_exec('/opt/local/libexec/metasploit3/msfconsole;show');
echo "<pre>$output</pre>";
The ";show" bit was used in order to actually make it run something and print some stuff but didnt make any difference!
When you run a GUI application from the command prompt in an X Window System session, it will use the default display. When you run it from PHP embedded in the Apache web server, the program may not know where to display the GUI application.
There are two things needed to make this work:
The program that executes the GUI application must have permission to use the display.
You need to tell the program which display to use.
I used the following in my PHP script:
<?php
// Set DISPLAY so the spawned process knows which X display to draw on.
$cmd = 'export DISPLAY=:0; gedit';
shell_exec($cmd);
?>
and ran the script from the terminal using php -f test.php.
I got gedit up and running.
You can test the same script under Apache too.
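If you prefer to set the environment from PHP itself rather than in the shell string, a sketch of the same idea (display :0 is still assumed) is:
<?php
// Sketch: putenv() sets DISPLAY for child processes spawned by shell_exec().
putenv('DISPLAY=:0');
shell_exec('gedit');
?>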
Just make sure the Apache user has privileges to access the display server.
Update: I just added the following to /etc/apache2/apache2.conf (I am using Ubuntu):
User poomalai
Group poomalai
and restarted the web server
sudo service apache2 restart
Now I accessed localhost/test.php
and presto! I got gedit. :)
Hope this helps
Here's my goal:
I have a Windows XP PC with all the source code in it and a development database.
Let's call it "pc.dev.XP".
I have a destination computer that runs Linux.
Let's call it "pc.demo.Linux".
Here's what I've done on "pc.dev.XP" (just so you get the context):
installed all cygwin stuff
created a valid RSA key and put it on the destination/backup computer so that ssh doesn't ask for a password
rsync works pretty well this way
If I try to do this on "pc.dev.XP" via a command line:
cd \cygwin\bin
ssh Fred@pc.demo.Linux "cd /var/www && ls -al"
this works perfectly without asking for a password
Now here's what I want to do on "pc.dev.XP":
launch a PHP script that extracts the dev database into an SQL file
zip this file
transfer it via FTP to "pc.demo.Linux"
log in to "pc.demo.Linux" and execute unzip, then mysql -e "source <unzipped file>"
If I run this manually on "pc.dev.XP":
putty -load "myconf" -l Fred -pw XXX -m script.file.that.unzip.and.integrates.sql
this works perfectly.
Same for :
cd \cygwin\bin
ssh Fred@dest "cd /var/www && ls -al"
If I try to exec() those scripts in PHP (WAMP is installed on "pc.dev.XP"), they hang. I'm pretty sure this is because the user is "SYSTEM" and not "Fred", and putty or ssh asks for a password, but maybe I'm wrong.
Anyway, I'm looking for a way to automate the four tasks described above, and I'm stuck because exec() hangs. There's no problem with the safe_mode or safe_mode_exec_dir directives; they're disabled on the development machine, so exec() works fine if I try basic stuff like exec("dir").
Any idea what I could do / check / correct ?
I'm not sure if this is what you need, but I typically use a construct like this to sync databases across machines:
php extractFromDb.php | ssh user@remote.com "mysql remoteDatabaseName"
This executes the PHP script locally and pipes the SQL statements it prints out through SSH straight into the remote mysql process, which executes them against the remote database.
If you need compression, you can either use SSH's -C switch, or integrate the use of your compression program of choice like this:
php extractFromDb.php | gzip -9 | ssh user@remote.com "gunzip | mysql remoteDatabaseName"
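Invoked from PHP, a sketch of the same pipeline (same placeholder user/host as above, and assuming the ssh and gzip binaries are on the PATH of the user PHP runs as) would be:
<?php
// Sketch: run the pipeline via exec() and surface any ssh/mysql errors.
exec('php extractFromDb.php | gzip -9 | ssh user@remote.com "gunzip | mysql remoteDatabaseName" 2>&1', $output, $ret);
echo implode("\n", $output), "\n";
echo "exit code: $ret\n";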
Do you want to do this from PHP running under Apache, as in: you go to http://myWebserver.com/crazyScript.php and all of this happens? Or do you just want to write your scripts in PHP and invoke them via the command line?
If you want the first option, try running your Apache/IIS service as a different user that has the credentials to perform all those tasks.
"if I run on the development PC manually this works perfectly.".
Why not do it like that? When you run that script, I assume you're connecting to the local SSH server on the dev machine. When you do this, you are using Fred's credentials, so everything works. When you run the PHP script, you are right that it is probably running as SYSTEM.
Try either changing the user that Apache runs as, or use PHP to connect to the local SSH daemon with alternate credentials.
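A sketch of the second option, assuming the PECL ssh2 extension is available (the host, user and password are the placeholders from the question):
<?php
// Hedged sketch: connect with explicit credentials via the PECL ssh2 extension
// instead of relying on keys owned by the SYSTEM account.
$conn = ssh2_connect('pc.demo.Linux', 22);
ssh2_auth_password($conn, 'Fred', 'XXX');
$stream = ssh2_exec($conn, 'cd /var/www && ls -al');
stream_set_blocking($stream, true);
echo stream_get_contents($stream);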
Here's what I did:
a batch file that:
Calls a PHP file via "php.exe my_extract_then_compress_then_ftp.php"
Calls rsync to synchronize the source folder
Calls putty -l user -pw password -m file_with_ssh_commands_to_execute
It works like a charm.
I have a Ruby script that's being used to do some API calls/screen scraping, but our main app is in PHP. Our PHP app is using shell_exec() to call the Ruby script.
The Ruby script works great when called from the command line, but it randomly exits early when called via PHP's shell_exec.
Here's an example of the Ruby script:
#!/usr/bin/env ruby
require 'rubygems'
require 'mysql'
require 'net/http'
require 'open-uri'
require 'uri'
require 'cgi'
require 'fileutils'
# Bunch of code here ... works fine
somePath = 'http://foo.com/bar.php'
# Seems to always exit when I do a Net::HTTP or open-uri call
post = Net::HTTP.post_form(URI.parse(somePath),{'id'=>ID,'q'=>'some query'})
data = post.body
# OR
data = open(somePath).read
# More code here ...
So, all I can deduce so far is that it always exits when I try to grab/read an external URL via net/http or open-uri calls. The pages I'm grabbing accept both POST and GET requests, but it seems to exit either way.
I'm outputting the results with PHP after the shell_exec call, but there are no error messages or exit notices. I do have messages being output by my Ruby script with "puts ..." here and there. Could that be a problem? (I'm thinking 'no', because it doesn't exit at the earlier puts messages.)
Again, it works fine when called from the shell. It's almost as if the shell_exec call isn't waiting for the net/http call to finish.
Any ideas?
I'm not sure about this, but given your explanation, which sounds plausible, have you looked at proc_open?
http://us3.php.net/proc_open
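A minimal sketch of that approach (the Ruby script name here is just a placeholder): capture stdout and stderr separately so an early exit at least leaves a trace:
<?php
// Sketch: run the Ruby script via proc_open and collect stdout, stderr and the
// exit code. 'myscript.rb' is a placeholder for the real script.
$descriptors = [
    0 => ['pipe', 'r'],  // stdin
    1 => ['pipe', 'w'],  // stdout
    2 => ['pipe', 'w'],  // stderr
];
$proc = proc_open('ruby myscript.rb', $descriptors, $pipes);
if (is_resource($proc)) {
    fclose($pipes[0]);
    $stdout = stream_get_contents($pipes[1]);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exit = proc_close($proc);
    echo "exit=$exit\nstdout:\n$stdout\nstderr:\n$stderr";
}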
Ruby's open-uri requires tempfile, so I'm guessing there's a file ownership conflict between you running your Ruby script and the web server running it. Can the web server create a temp file using tempfile?
Just an FYI: I never really uncovered why this was happening. The best I could deduce was that some type of permission issue was preventing Ruby's open-uri calls from working properly.
I opted to queue these jobs in a DB table and run my Ruby script periodically via cron. Everything seems to work fine when the Ruby script runs with root/sudo permissions.
Run in a Linux terminal:
sudo -H -u <user> bash -c '<your code>', where <user> is the Apache user.
To find the Apache user you can run echo shell_exec("whoami"); inside your code and open it in the browser. whoami works on Linux and Windows, but if you're on Windows the Apache default user is your own user. You can test it anyway in case it's different, but I can't say how to run the code on Windows as if Apache were running it.
After that you should have a clue about what's happening. In most cases the problem is that Apache's root folder is different from the operating system's: when you run a command with an absolute path, the OS considers / while Apache considers /var/www/html on Linux, /opt/lampp/htdocs on XAMPP (Linux) and C:/xampp/htdocs on XAMPP (Windows). You get the idea, I think.
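A quick sketch that checks both things at once from the browser (who PHP is running as, and which directory it is working from):
<?php
// Quick diagnostic: the user PHP runs as, and its current working directory.
echo shell_exec('whoami');
echo "\n", getcwd();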