So far my research has shown the potential security holes that open up when trying to run a sudo'd command from within PHP.
My current problem is that I need to run a bash script with sudo on my work's web server via PHP's exec() function. We currently host a little under 200 websites. The website that will do this is restricted so it is only accessible from my office's IP address. Will that remove the security issues that come with the available solutions?
One of the ways is to add the apache user to the sudoers file. I assume this applies to the entire server, so it would still pose an issue for all the other websites.
Is there any solution that will not pose a security threat when used on a website that has access restricted to our office?
Thanks in advance.
Edit: A brief background
Here's a brief description of exactly what I'm trying to achieve. The company I work for develops websites for tourism-related businesses, amongst other things. At the moment, when creating a new website, I need to set up a hosting package, which includes: creating the directory structure for the new site, creating an Apache config file which is included into httpd.conf, adding a new FTP user, and creating a new database for the website's CMS, to name a few.
At the moment I have a bash script on the server which creates the directory structure, adds the user, creates the Apache config file and gracefully restarts Apache. That's just one part: what I'm looking to do is use this shell script from a PHP script to automate the entire website generation process in an easy-to-use way, for colleagues and for general efficiency.
You have at least 4 options:
Add the apache user to the sudoers file (and restrict it to run the one command!)
In this case, a security hole in one of your PHP apps could run the script too (for example, if it can include the calling PHP file, or even bypass the restriction to your IP by reaching the script through another URL, via mod_rewrite).
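The restriction in option 1 lives in sudoers. A minimal sketch, assuming the web server runs as www-data and the script lives at /usr/local/sbin/create_site.sh (both names invented; always edit with visudo):

```
# /etc/sudoers.d/create-site -- edit with visudo -f
# Allow the web server user to run exactly this one script as root,
# without a password, and nothing else.
www-data ALL=(root) NOPASSWD: /usr/local/sbin/create_site.sh
```

The full command path matters: a bare command name in sudoers would let the caller run any binary of that name found on the PATH.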
Flag the script with the setuid (s) bit
Dangerous, don't do it.
Run another web server that only binds to a local interface and is not accessible from outside
This is my preferred solution, since the page calling the PHP can be linked from your main web server and its security can be handled separately. You can even create a dedicated user for this server. Some simple server will do the job; there are server modules for Python and Perl, for example. It is not even necessary to enable exec() in your main PHP installation at all!
Run a daemon (using inotify, for example, to watch file events) or a cron job that reads a file or database entry and then runs the command
This may be too complex, and it has the disadvantage that the daemon cannot check which script generated the entry.
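Option 4 can be sketched as follows. Everything here is an assumption for illustration: the spool path and the create_site.sh helper are invented. The web app appends one hostname per line to a spool file; this script runs from root's crontab, validates each entry strictly, and only then calls the privileged provisioning script:

```shell
#!/bin/sh
# Hypothetical root cron job draining a spool file written by the web app.
QUEUE="${QUEUE:-/var/spool/sitequeue/pending}"

# Accept only plain lowercase hostnames; reject anything that could
# smuggle shell metacharacters into the privileged script.
valid_site() {
    case "$1" in
        ""|*[!a-z0-9.-]*) return 1 ;;
        *) return 0 ;;
    esac
}

if [ -f "$QUEUE" ]; then
    while IFS= read -r site; do
        if valid_site "$site"; then
            /usr/local/sbin/create_site.sh "$site"   # invented helper
        else
            echo "rejected: $site" >&2
        fi
    done < "$QUEUE"
    : > "$QUEUE"    # truncate the queue once processed
fi
```

As the answer notes, the daemon cannot tell which script wrote an entry, so the validation has to stand on its own.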
Related
Just as the question says... I've read a few articles; some say just don't do it, yet they fail to mention a safe way. I know it's hazardous to give a script sudo or root access, but I was thinking about running a script that has root access through root.
One post talked about a binary wrapper, but I did not fully understand it when I attempted it, and when I searched for a better explanation I didn't find anything that explained it well.
So, what would be a good, safe way? I don't even need a detailed explanation; you can just point me to a good source to start reading.
Thanks.
Specs:
Ubuntu Server 14.04
EDIT:
The commands I am talking about are mkdir and rmdir with an absolute path, creating and removing users (which is why I need root), and editing some Apache files.
They fail to provide a safe way because, IMHO, there isn't one. Or, to put it another way: are you confident that the code protecting your create-user and add-user functions is cleverer than the hacker's code that tries to gain access to your system via the back door you've built?
I can't think of a good reason for a web site to create a new system-level user. Usually web applications run using system users that are created for them by an administrator. The users inside your web site only have meaning for that web site so creating a new web site user gains that user no system privileges at all. That said, it's your call as to whether you need to do it or not.
In those cases where system operations are necessary, a common approach is to build a background process that carries out those actions independently of the web site. The web site and the background process communicate via anything that works and is secure: sockets, a shared database, a text file, TCP/IP, etc. That separation lets you control which actions can be requested and build in the necessary checks and balances. Of course it's not a small job, but you're not the first person to want to do this, so I'd look for an existing tool that supports this kind of administration.
I have not been able to find solid information on preferred (best-practice) and/or secure methods for letting PHP access config files, or other files on a Linux server that are not inside the public web directory or owned by the apache user, so I'm hoping to find some answers here.
I am a fairly competent PHP programmer but am increasingly tasked with writing web applications (most of which are not publicly accessible via the web however) that require updating, changing or adding to config files or files generated by some service or application on the server.
For instance, I need to create a web interface that will view, add or remove entries from a /etc/mail/spamassassin/white-list.cf file owned by root.
Another scenario is that I need PHP to parse MIME messages in /var/vmail that are owned by the user vmail.
These are just a couple examples, there will be other files in locations owned by other processes/users. How can I write PHP applications that securely access and manipulate these files without opening security risks?
If I were needing to implement something like this, I would probably look at using something like sudo to fine-tune permissions. I'm not a Linux CLI expert, so I'm sure there are issues that I haven't taken into account when typing this out.
I would probably determine what tasks need to be done, and would write a separate script for each task that needs to be completed. Using sudo, I'd assign the necessary level of permissions for that script only.
Obviously, as the number of tasks increase, so would the complexity and the amount of work involved. I'm not sure how this would affect you at the moment.
I am currently hosting a Drupal 6 site on a CentOS machine. The Drupal (CMS) configuration contains a few dozen third-party modules that should not be forked as a general best coding practice. However, some of these modules make use of the php exec command in order to function properly.
The site allows for admins to embed php code snippets in any page via a UI configuration, granted that they have access to the php code input format. I need to keep this input format available to admins because there are several nodes (pages) and panel panes that make use of small, harmless php code snippets, like embedding a specific form into the content region, for example.
The issue is that if someone were to compromise an admin account, then they could run arbitrary php code on the site, and thus run shell commands via php's exec, passthru, etc. Is there any way, from an operating system level, to restrict what shell commands php can pass through to the machine? Could this be done via restricting file permissions to some programs from php?
Note: I cannot use the php.ini disable_functions directive as I still need exec to function normally for many cases, where modules make use of certain shell commands, like video encoding, for example.
If your administrator accounts get hacked, you are doomed. You are trying to be less doomed, but this will not work.
Disabling exec()
exec() is only one of many functions that can make calls to the system. There are plenty more: passthru(), shell_exec(), popen(), proc_open(), the backtick operator, and so on. Disabling it will not help.
Restricting available executables
This would work only if the attacker cannot bring their own executables. But file_put_contents() can write an executable to disk, and it can then be called. This will also not help.
PHP cannot do any harm itself, can it?
Wrong. Executing stuff on the server via exec() might seem the best idea, but PHP itself is powerful enough to wreak havoc on your server and anything that is connected to it.
I think the only real solution is:
Do not allow your admin accounts to get hacked.
And if they do get hacked, be able to know it immediately. Be able to trace an attack back to an administrator. Be able to know exactly what the attacker did to your machine so that you might be able to undo it. A very important part is an audit-trail logger that saves its data on a different machine.
And you'd probably implement tighter restrictions on who can log in as an administrator in the first place. For example, there is probably no need to allow the whole world's IP range to log in if you know for sure that a certain admin always works from the IP range of his local ISP. At the very least, ring a bell and inform somebody else that a login from China is going on if this is not expected (unless you are operating in China :-) ).
And there is two-factor authentication. You could send an SMS with an additional login code to the admin's phone. Or you might outsource the login entirely by implementing Google or Facebook authentication; these players already have the infrastructure to support this.
Additionally, you get higher resistance against inside jobs. People value their personal social-network logins more highly than the login for their employer: getting someone's Facebook password costs $30 on average, but the company login is already shared for $8. Go figure...
To answer your question: in theory, if you created a user account using an extremely restricted account that PHP could run commands as, you could tune your installation to be more secure. However, the real problem is that administrator users are able to execute arbitrary commands. If that happens, you will have a significantly larger problem on your hands.
The real solution here is:
The ability to submit and run arbitrary code from within Drupal is a significant risk that you can mitigate by not doing it. I strongly recommend re-designing those "harmless" bits of code, as they will lead to a compromise; arbitrary code execution is only one kind of exploit and there are many others to worry about.
The modules that require running shell commands are also significant security vulnerabilities. In some cases, I've been able to fork/patch or replace modules executing commands with ones that don't, but in some cases (e.g. video encoding) it cannot be avoided. In that situation, I would set up a very restricted backend service that the frontend can communicate with. This separates your concerns and leaves Drupal to do what it was intended to: manage and serve content.
This isn't at the OS level, but Drupal core includes a module called PHP. You could use it as a base, create a custom module that extends its functionality, and then enable your module instead of the Drupal 6 core one. The big problem, however, comes with disabling the Drupal 6 core module and then enabling your new module. I'd test this on a dev install to make sure existing content is not deleted and that the new module correctly parses stored PHP input. It should be all right, since the module's disable hook only warns that PHP content will now show as plain text.
As for extending the core module, it's a very simple module to start from. You could hardcode a list of allowed or disallowed commands, then check the exec statement (with variables resolved) against that list and do whatever is appropriate. It's not a perfect substitute for blocking the programs themselves at the OS level, but it's better than nothing. To hardcode a list, modify the php_filter hook at the bottom of the module file and run your check before calling drupal_eval.
You could also extend the module so the list is configurable in the Drupal admin interface instead of hardcoded. Hope this idea helps.
Another approach:
Suppose we need to create a test user that may only telnet to another machine on the network. Since the user only needs to run telnet, we must restrict the other commands available in a standard bash session. Let's configure everything step by step.
1) We create user test
This will be a regular user of the system, created like any other. The only peculiarity is that we change the user's shell: the default is usually /bin/bash, and we set /bin/rbash instead. rbash is actually a copy of bash, but a "restricted bash".
shell> adduser --shell /bin/rbash test
2) We create the .bash_profile file
We must create this file in the home directory of the user we just created and to whom we want to apply the restrictions. The contents of the file are as follows:
if [ -f ~/.bashrc ]; then
    . ~/.bashrc
fi
PATH=$HOME/apps
export PATH
3) We prevent changes
Once the file is created, we make sure nobody can modify it:
shell> chattr +i /home/test/.bash_profile
4) We create the apps directory and "install" the allowed programs
Now we only have to create the apps directory and, inside it, links to the programs we want the user to be able to run. The user can run any program linked inside apps, and nothing else.
shell> mkdir /home/test/apps
shell> ln -s /usr/bin/telnet /home/test/apps/
5) We verify that it works
Now we can log in to the system and check that it behaves correctly:
shell> ssh test@remote
test@remote's password:
shell@remote> ls
-rbash: ls: command not found
shell@remote> cd
-rbash: cd: command not found
shell@remote> telnet
telnet>
Team:
Here's my approach:
I created a small script with a list of blocked commands; depending on the command, execution is controlled to avoid issues. I'm not sure whether this is within the scope of the question. The sample uses Windows commands, but the same approach works on Linux...
<?php
ini_set('display_errors', 1);
error_reporting(E_ALL);

function executeCommand($my_command) {
    // Commands that must never be executed.
    $commandExclusions = array('format', 'del');
    if (in_array(strtolower($my_command), $commandExclusions)) {
        echo "Sorry, <strong>" . $my_command . "</strong> command is not allowed";
        exit();
    } else {
        exec($my_command, $result, $errorCode);
        echo implode("\n", $result);
    }
}

echo "<h3>Is it possible to restrict what commands php can pass through exec at an OS level?</h3><br>";
echo "********************************<br>";
// test of an accepted command
echo "test of an accepted command:<br>";
executeCommand("dir");
echo "<br>********************************<br>";
echo "test of an unaccepted command:<br>";
// test of an unaccepted command
executeCommand("format");
echo "<br>********************************<br>";
?>
Output:
Is it possible to restrict what commands php can pass through exec at an OS level?
test of an accepted command:
117 Dir(s) 11,937,468,416 bytes free
test of an unaccepted command:
Sorry, format command is not allowed
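A blacklist like the one above is easy to sidestep (different casing, full paths to the binary, or chained commands). Inverting it into an allowlist is safer. A minimal sketch in shell; the allowed set (ls, date, uptime) is invented for illustration:

```shell
#!/bin/sh
# Hypothetical allowlist wrapper: name the only commands permitted and
# refuse everything else, instead of enumerating forbidden ones.
run_allowed() {
    cmd="$1"
    shift
    case "$cmd" in
        ls|date|uptime)
            "$cmd" "$@"
            ;;
        *)
            echo "refused: $cmd" >&2
            return 1
            ;;
    esac
}
```

PHP would then invoke this wrapper through exec() rather than passing the raw command string to the shell.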
I'm attempting to build an application in PHP to help me configure new websites.
New sites will always be based on a specific "codebase", containing all necessary web files.
I want my PHP script to copy those web files from one domain's webspace to another domain's webspace.
When I click a button, an empty webspace is populated with files from another domain.
Both domains are on the same Linux/Apache server.
But I'm running into permission/ownership issues when copying across domains.
As an experiment, I tried using shell and exec commands in PHP to perform actions as "root".
(I know this can open major security holes, so it's not my ideal method.)
But I still had similar permission issues and couldn't get that method to work either.
Maybe a CGI script is a better idea, but I'm not sure how to approach it.
Any advice is appreciated.
Or, if you know of a better resource for this type of information, please point me toward it.
I'm sure this sort of "website setup" application has been built before.
Thanks!
I'm also doing something like this. The only difference is that I'm not making copies of the core files: the system has one core, and only specific files are copied.
If you want to copy files, take the following into consideration:
An easy (less secure) way is to use the same user for all websites.
Otherwise (if you want to provide separate access), you must create a different owner for each website, and you must set the owner/group on the copied files (this has to be done by root).
For the new website setup:
Either the main domain runs as root, and then it can execute the creation of a new website, or, if you don't want your main domain to run as root, you can do the following:
Create a cron job (or a PHP script that runs in a loop under the CLI) that is executed by root. It checks a database record every 2 minutes, for example, and from your main domain you add a record with the setup info for the new hosted website (or just execute some script that gains root access and does it without cron).
The script that creates all this can be written in PHP; it can be done in any language you wish, as it doesn't really matter as long as it gets the correct access.
In my case I'm using the same user, since they are all my websites. The disadvantage is that the OS won't enforce restrictions; my PHP code will (I lose the advantage of user/group permissions between different websites).
Note that open_basedir can cause you some hassle; make sure you include the correct paths (or disable it).
Also, there are some minor differences between FastCGI and suPHP (I believe they won't cause you too much trouble).
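The open_basedir caveat above refers to a per-site PHP directive. A minimal sketch of such an entry (the paths are invented; on Linux multiple paths are separated by colons):

```
; php.ini or per-vhost configuration
; Confine this site's PHP file access to its own tree plus /tmp.
open_basedir = /var/www/example.com/:/tmp/
```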
I just need to know whether it is possible in PHP to create an FTP user, then create folders on the server and grant the created FTP user access to selected folders.
Thanks again!
Native PHP can not do this. The task is way out of PHP's scope.
Depending on the server OS and FTP server software used, however, PHP could call some shell scripts (or WMI / PowerShell scripts on Windows) that accomplish the task. This is not trivial to set up, though, especially not if it's to be done safely (without giving the PHP process root level privileges).
The question may be better suited on Serverfault.com.
There are a few web hosting panels written in PHP that create FTP accounts, among other things, so it's definitely possible.
The exact procedure depends completely on the FTP server you use. It may involve creating new Unix user accounts.
This is more an FTP or operating system question than a PHP question though as you need to shell out to do the configuration. As Pekka said you may have more luck asking on Serverfault if you include the details of your setup.
No, but if I'm not mistaken you could do something like this:
Create a shell script (ftp.sh), owned by root and readable/writable only by root, that creates the users, sets the permissions, etc. Note that setting the SUID bit on it is not enough: Linux ignores the setuid bit on interpreted scripts, so in practice you would need a sudoers entry (or a small compiled wrapper) to run it with root privileges.
Call the script from php
system("./ftp.sh " . escapeshellarg($newUsername) . " " . escapeshellarg($newPassword));
However I'm pretty sure there are more secure/correct ways of doing this. I can definitely see this becoming a security nightmare.
The answer is "yes" if the web process running the script is allowed to change the FTP settings (e.g. adding users and groups), either via native PHP functions or an additional shell script, and "no" if the web process has neither the access nor the privilege to make those changes.