So I want to execute a bash command from PHP on my web server. I can do this using shell_exec. However, one of the commands I want to execute is curl. I use it to send a .wav file to another server and record its response. But when invoked from PHP, curl doesn't work.
I reduced the error to the following small example. I have a script named php_script.php which contains:
<?php
$ver=shell_exec("curl -F file=#uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver");
echo $ver
The curious thing is that when I run this PHP script from the command line using php php_script.php, the result I get is
Status: 500 Internal Server Error
Content-type: text/html
However, if I run curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver directly, I get the response I was expecting:
verdict = authentic
(Edit:) I should probably mention that if I put some bash code inside the shell_exec argument which does not contain curl, the bash command executes fine. For example, changing the line to $ver = shell_exec("echo hello > world"); puts the word "hello" into the file "world" (provided it exists and is writable). (End edit.)
Something is blocking the execution of curl when it is invoked from PHP. I thought this might be PHP's running in safe mode, but I found no indication of this in php.ini. (Is there a way to test this to make 100% sure?) What's blocking curl and, more importantly, how can I bypass or disable this block?
(And yes, I realize PHP has a curl library. However, I prefer to use commands I can run from the command line as well, for debugging purposes.)
cheers,
Alan
The reason is administrative privileges: when you run the command directly you are running it as root (or at least as your own user), so the command gets executed. But when you run the command through PHP, it runs as the web server's user, and by default that user does not have the privileges to run such commands.
You would have to change this through cPanel or the Apache config file. However, giving the web server user shell_exec access is not recommended, as it helps attackers compromise the server, so proper care should be taken.
It would be more appropriate to use the curl library provided in PHP.
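For reference, a minimal sketch of the same upload using PHP's curl extension (the URL and file path are taken from the question; CURLFile requires PHP 5.5+):
<?php
// Minimal sketch: perform the -F file=@... upload with PHP's curl
// extension instead of shelling out.
$ch = curl_init('http://otherserver');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response as a string
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'file' => new CURLFile('uploads/2013-7-24-17-31-43-29097-flash.wav'),
));
$ver = curl_exec($ch);
if ($ver === false) {
    echo 'curl error: ' . curl_error($ch);
}
curl_close($ch);
echo $ver;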
Related
I am trying to run a LaTeX parser from a DokuWiki website. The LaTeX parser relies on the mimetex CGI module, and it is executed from a PHP script using the following command:
shell_exec('/var/www/cgi-bin/mimetex.cgi -f /var/www/html/test1.tex -e /var/www/html/test1.gif');
However, the command is not properly executed when running the PHP script. I ran a test command such as:
shell_exec('cp test1.tex test2.tex');
and was able to check that the file test2.tex was indeed created. I also ran the command:
/var/www/cgi-bin/mimetex.cgi -f /var/www/html/test1.tex -e /var/www/html/test1.gif
from a terminal, and it was executed without error. I also made sure that my Apache server is configured to execute CGI scripts, following https://www.ionos.com/community/server-cloud-infrastructure/apache/enable-cgi-scripts-on-apache/
I am missing something here... but I don't see what...
You could try to execute the CGI script via a GET request, so that it is executed by Apache. A simple way to do this, AFAIK:
$response = http_get("http://localhost/cgi-bin/mimetex.cgi");
But you can find other ways also, like using curl.
You will need to pass the tex file as a query parameter, appending something like ?texfile=test1.tex, with the proper parameter name.
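Since http_get comes from the pecl_http extension and may not be installed, here is a hedged sketch of the same idea using core PHP's file_get_contents (the texfile parameter name is only an illustration, not mimetex's actual interface):
<?php
// Sketch: let Apache execute the CGI by requesting it over HTTP.
// Requires allow_url_fopen=On; the 'texfile' parameter is hypothetical.
$url = 'http://localhost/cgi-bin/mimetex.cgi?' . http_build_query(array(
    'texfile' => 'test1.tex',
));
$response = file_get_contents($url);
if ($response === false) {
    echo 'request to mimetex.cgi failed';
}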
One may start a local PHP server, e.g. for testing:
php -S localhost:8080
One can also execute a PHP statement, e.g.
php -r "echo 'Hello';"
We initially hoped we could combine the two to tell when the server had started, e.g. via systemd-notify or some other process-readiness protocol. However, using -r and -S together seems to ignore -r.
My question is thus, when starting a local server using php -S, is it possible to execute some code after the server is ready to receive incoming connections? This would allow us to execute something like systemd-notify --ready and enable the parent process to know when to proceed with testing.
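One workaround, assuming polling is acceptable: start php -S from a small wrapper and probe the port with fsockopen until it accepts a connection, and only then signal readiness. A minimal sketch (the port and the systemd-notify call are illustrative):
<?php
// Sketch: launch the built-in server, then poll until it accepts
// connections. Keep $proc referenced so the server stays alive.
$proc = proc_open('php -S localhost:8080', array(), $pipes);

do {
    usleep(100000); // wait 100 ms between probes
    $sock = @fsockopen('localhost', 8080, $errno, $errstr, 1);
} while ($sock === false);

fclose($sock);
shell_exec('systemd-notify --ready'); // or any other readiness signal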
Is there any reason why I cannot run compiled files via PHP's shell_exec/exec/system functions?
Example of something that does work in command line and PHP's shell_exec function:
<?php
$data = shell_exec("ls");
echo $data;
?>
Example of something that does not work in PHP's shell_exec function but will work on the command line (I can confirm that):
<?php
$data = shell_exec("./c-compiled-file argv1 argv2 argv3");
echo $data;
?>
Is there anything I can do on my server so this will work? I've looked everywhere and no solution I found fixed the problem. The compiled file is in the same directory as the PHP script as well; it just won't execute. Also, if you're asking: yes, I have tried this with SSH2 and it still will not execute.
Also PHP is not in safe mode and NO functions are disabled.
Some common glitches when executing external commands from PHP that work fine from a shell:
- The command uses relative paths but PHP is launched from an arbitrary location: use getcwd() / chdir() to get/set the working directory.
- PHP and the shell run with different user credentials. This is often the case when PHP runs through a web server.
- PHP and the shell run different commands. Many people call stuff like exec("foo $bar") and don't even check what "foo $bar" contains.
- No error checking is done. The bare minimum is to capture and print standard output, standard error, the status code and, of course, all PHP error messages, including warnings and notices. You can redirect stderr to stdout, and you can use a PHP function that captures more information, such as exec(); see the sketch after this list.
- The web server is disallowed from executing the command at the operating-system level. Look out for SELinux or similar tools.
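A minimal error-checking sketch along those lines, using the binary from the question as a placeholder command:
<?php
// Sketch: run the command with full error capture. 2>&1 folds stderr
// into stdout; $exitCode receives the status code.
$cmd = getcwd() . '/c-compiled-file argv1 argv2 argv3 2>&1';
exec($cmd, $output, $exitCode);
echo "exit code: $exitCode\n";
echo implode("\n", $output), "\n";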
Just a guess, but the binary you're trying to execute might not have the proper permissions. Prepending it with ./ on the command line forces it to execute, but PHP probably strips that for security purposes. Try this:
chmod +x c-compiled-file
You want to use system in the second case, and not shell_exec.
system executes an external program and displays the output.
shell_exec executes a command via shell and returns the complete output as a string.
and for good measure:
exec simply executes an external program.
Furthermore, you want to make sure your external program is executable and (though you have stated it, I'll restate this) has execute permissions for the user running the web server. You also want to make sure the external program can write to its own directory, /tmp, or whatever output directory you have set.
Finally, you should always use absolute paths for executing things like this in cron or PHP or whatever... so don't use ./c-compiled-file argv1 argv2 argv3, but instead use /home/username/c-compiled-file argv1 argv2 argv3 or whatever the full path is.
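Putting those checks together, a small sketch (the /home/username path is the placeholder from above):
<?php
// Sketch: verify the binary is executable for the web server user,
// then run it by absolute path and report the exit code.
$bin = '/home/username/c-compiled-file';
if (!is_executable($bin)) {
    die("$bin is not executable by this user\n");
}
system($bin . ' argv1 argv2 argv3', $exitCode); // prints the output
echo "\nexit code: $exitCode\n";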
Here's my code to create a new at job...
system('echo \'php -f /path/to/my/php/file.php\' | at 1700');
I thought this would be simple and would just work but alas, nothing happens!
When I run echo \'php -f /path/to/my/php/file.php\' | at 1700 via ssh everything works as expected.
Is this a permissions problem? I.e. PHP isn't allowed to create new at jobs?
Just FYI you have to make sure that you are allowed to execute system commands from your PHP scripts.
A lot of servers have this feature disabled.
However, if you want to turn this back on, I believe you can do so in the php.ini file; you would have to turn safe_mode off.
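One way to check, sketched below: confirm that nothing is disabled in php.ini, then capture at's own error output, which shell_exec would otherwise discard (at writes its messages to stderr; safe_mode only exists on old PHP versions):
<?php
// Sketch: inspect the relevant php.ini settings, then run the at
// command with stderr redirected so its complaints become visible.
var_dump(ini_get('safe_mode'));         // old PHP versions only
var_dump(ini_get('disable_functions'));
echo shell_exec("echo 'php -f /path/to/my/php/file.php' | at 1700 2>&1");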
When allowing user-supplied data to be passed to this function, use escapeshellarg() or escapeshellcmd() to ensure that users cannot trick the system into executing arbitrary commands.
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Note: When safe mode is enabled, you can only execute files within the safe_mode_exec_dir. For practical reasons, it is currently not allowed to have .. components in the path to the executable.
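A short sketch combining both notes above; /usr/local/bin/process and the file parameter are purely illustrative:
<?php
// Sketch: quote user-supplied data with escapeshellarg(), and redirect
// stdout/stderr so the background program doesn't make PHP hang.
$file = escapeshellarg(isset($_GET['file']) ? $_GET['file'] : 'default.txt');
shell_exec("/usr/local/bin/process $file > /tmp/process.log 2>&1 &");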
I have a Ruby script that's being used to do some API calls/screen scraping, but our main app is in PHP. Our PHP app is using shell_exec() to call the Ruby script.
The Ruby script works great when called from the command line, but it randomly exits early when called via PHP's shell_exec.
Here's an example of the Ruby script:
#!/usr/bin/env ruby
require 'rubygems'
require 'mysql'
require 'net/http'
require 'open-uri'
require 'uri'
require 'cgi'
require 'fileutils'
# Bunch of code here ... works fine
somePath = 'http://foo.com/bar.php'
# Seems to always exit when I do a Net::HTTP or open-uri call
post = Net::HTTP.post_form(URI.parse(somePath),{'id'=>ID,'q'=>'some query'})
data = post.body
# OR
data = open(somePath).read
# More code here ...
So, all I can deduce so far is that it's always exiting when I try to grab/read an external URL via net/http or open-uri calls. The pages I'm grabbing can accept POST or GET requests, but it seems to be exiting either way.
I'm outputting the results with PHP after the shell_exec call, but there are no error messages or exits. I do have messages being output by my Ruby script with "puts ...." here and there. Could that be a problem (I'm thinking 'no' because it doesn't exit with earlier puts messages)?
Again, it works fine when called from the shell. It's almost like the shell_exec call isn't waiting for the net/http call to finish.
Any ideas?
I'm not sure about this, but given your explanation, which sounds plausible, have you looked at proc_open?
http://us3.php.net/proc_open
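A sketch of how that might look for this case; the script path is an assumption:
<?php
// Sketch: run the Ruby script via proc_open so stdout, stderr and the
// exit code are all visible instead of silently lost.
$spec = array(
    1 => array('pipe', 'w'), // child stdout
    2 => array('pipe', 'w'), // child stderr
);
$proc = proc_open('/usr/bin/env ruby /path/to/script.rb', $spec, $pipes);
$stdout = stream_get_contents($pipes[1]);
$stderr = stream_get_contents($pipes[2]);
fclose($pipes[1]);
fclose($pipes[2]);
$exit = proc_close($proc);
echo "exit: $exit\nstdout:\n$stdout\nstderr:\n$stderr\n";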
Ruby's open-uri requires tempfile, so I'm guessing there's a file-ownership conflict between you running your Ruby script and the web server running it. Can the web server create a temp file using tempfile?
Just an FYI, I never really uncovered why this was happening. The best I could deduce was that some type of permission issue was preventing Ruby's open-uri commands from working properly.
I opted for queuing these jobs in a db table and running my ruby script via cron periodically. Everything seems to work fine when the ruby script runs with root/sudo perms.
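For completeness, the cron side of that might look something like this (the paths, schedule and log location are illustrative):
*/5 * * * * /usr/bin/ruby /path/to/ruby_script.rb >> /var/log/ruby_script.log 2>&1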
Run in a Linux terminal:
sudo -H -u <user> bash -c '<your code>', where <user> is Apache's user.
To find Apache's user you can put echo shell_exec("whoami"); inside your code and run it in the browser. whoami works on Linux and Windows, but if you're on Windows, the Apache default user is your own user. You can test it anyway in case it's different, but I can't tell how to run the code on Windows as if Apache were running it.
After that you should have a clue about what's happening. In most cases the problem is that Apache's root folder is different from the operating system's root folder. So when you run a command with an absolute path, the OS considers / while Apache considers /var/www/html on Linux, /opt/lampp/htdocs on XAMPP (Linux), or C:/xampp/htdocs on XAMPP (Windows). You get the idea, I think.