How can I disable the dangerous eval function? Can that be done using the ini_set() function?
Also, how can I disable the following functions and ini settings? Can we disable them using ini_set()?
allow_url_fopen
allow_url_include
exec
shell_exec
system
passthru
popen
stream_select
eval is one of the most dangerous functions that attackers can use to exploit an application. There should be a mechanism to disable it without resorting to the php.ini file; it should be possible to do it programmatically.
So I am looking for answers that explain how to disable these dangerous functions without going into the php.ini file; that is, how to disable them at runtime or programmatically.
Thanks in advance....
Update
Has anyone heard about the PHP Shell Offender script? It mainly uses the eval function for its exploit; hackers are able to run their own PHP code on your site.
My point is that I don't want to disable the eval function in the php.ini file altogether. For example, I have developed my own MVC framework. Framework users can specify in the framework's config file whether eval (and the other functions) should be disabled or not, so the choice is left to them. Once they choose to disable it, I should be able to disable the eval function programmatically.
So that is the scenario. Looking for helpful answers/solutions.
Thanks Again.
I'm afraid you're pretty much stuck using php.ini to disable most of those. However, it gets worse: eval() is technically not a function, it is a language construct, so it CANNOT be disabled using disable_functions. In order to do that, you would have to install something like Suhosin and disable it from there.
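A quick, illustrative way to see the difference from within PHP (nothing here disables anything; it only reports):
<?php
// eval is a language construct, so PHP never registers it as a function.
var_dump(function_exists('eval'));  // bool(false), even though eval() can still be used
// A real function listed in disable_functions also reports false:
var_dump(function_exists('exec'));  // bool(false) only if exec is disabled, bool(true) otherwise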
A good webmaster should consider a security review to be an essential part of site setup. Do not try to completely abstract this away, people are lazy enough about security already. If you are going to use tools (like a webhost), you should take the initiative to have at least a cursory knowledge of how to manage one responsibly.
That said, there are some other things you can do to severely cripple most hack attempts, including:
-Disable base64_decode() using disable_functions. There are ways around this, but the vast majority of hack scripts are generic in nature, and this will break about 95% of them, because they require both eval() and base64_decode() in order to operate properly. This does not mean that your server cannot be hacked, but in most cases it forces the attacker into manually sniffing your server for vulnerabilities, and most hackers are playing the numbers and ain't got time for that (NOTE: some hackers do have time for that, so this is not a magic bullet by itself).
-Filter all input for other common exploit string patterns, such as <?php, which is frequently used to sneak an opening PHP tag by unnoticed. There are several such patterns. Best practice is to whitelist specific characters and reject all others on a per-input basis (a minimal sketch follows). At the very least, filter the aforementioned, null terminators, and possible SQL injection strings such as '; -- (do not assume that simply using PDO or mysqli is going to filter ALL injection attempts; there are still ways to pull this off even if you are properly using prepared statements).
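Here is a minimal sketch of per-input whitelisting; the field name and allowed character set are assumptions for illustration:
<?php
// Whitelist acceptable characters per input instead of blacklisting bad patterns.
function clean_username($raw)
{
    // Allow only letters, digits, dot, dash and underscore, 1-32 characters.
    if (!preg_match('/^[A-Za-z0-9._-]{1,32}$/', $raw)) {
        return false; // reject outright rather than trying to "repair" the value
    }
    return $raw;
}

$username = clean_username(isset($_POST['username']) ? $_POST['username'] : '');
if ($username === false) {
    header('HTTP/1.1 400 Bad Request');
    exit('Invalid input');
}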
-Any directories that serve only media should have all script access disabled, and all uploads and media should be placed only in such a directory. It is better to whitelist only the acceptable media types than to blacklist scripts, since there are any number of extensions that may execute as a script (e.g. .php, .php5, .phtml) and each may or may not be enabled on any given server environment. You can do this with a simple .htaccess placed in the media directory, similar to this:
php_flag engine off
AddHandler cgi-script .php .php3 .php4 .phtml .pl .py .jsp .asp .aspx .htm .html .shtml .sh .cgi
Options -Indexes -ExecCGI

Order Deny,Allow
Deny from all
<FilesMatch "\.(jpe?g|png|gif|bmp|tiff|swf|flv|mov|avi|mp4)$">
    Allow from all
</FilesMatch>
This part can be dynamically written by PHP, so your application can secure sensitive directories itself (see the sketch below). This mitigates a great deal of attack surface that is typically overlooked. I typically add a similar .htaccess to the uploads directory of almost every WordPress site I work on, and have often wondered why this is not done out of the box, as it blocks a great many hack attempts and does not interfere with the application in any way that I have noticed.
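For example, a minimal sketch of an application writing that protection into a hypothetical uploads directory (paths and rules are illustrative; adjust them for your environment):
<?php
// Drop a protective .htaccess into the uploads directory if it is not already there.
$uploadsDir = __DIR__ . '/uploads';

$rules = <<<'HTACCESS'
php_flag engine off
Options -Indexes -ExecCGI
Order Deny,Allow
Deny from all
<FilesMatch "\.(jpe?g|png|gif|bmp|tiff|swf|flv|mov|avi|mp4)$">
    Allow from all
</FilesMatch>
HTACCESS;

$target = $uploadsDir . '/.htaccess';
if (!file_exists($target)) {
    file_put_contents($target, $rules . "\n");
}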
Unfortunately, if you are not on an Apache server, you will need to find another solution (on IIS there is most likely an equivalent, but I am not personally aware of what it is).
-You should also configure your .htaccess (or web.config, etc.) to disable any request methods that are not needed by your specific application. If you are not doing RESTful web services, there is really no reason to allow PUT or DELETE; you should almost certainly disable TRACE as well, and you probably have no reason to leave OPTIONS or HEAD enabled either. It should also be mentioned that unrecognized request methods resolve to GET by default, which means that from the command line I can do something like:
curl -X BOOGITY -d arg=badstuff -d arg2=morebadstuff yoursite.com
In this example, BOOGITY is meaningless; however, your server will interpret it as:
curl -X GET -d arg=badstuff -d arg2=morebadstuff yoursite.com
However, your application likely will not.
In order to prevent this, you should configure your server to accept only GET as GET, and not allow it to be the default.
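If the server configuration cannot be relied on, the application can at least refuse unexpected methods itself. A minimal sketch (the allowed list is an assumption; adjust it per application):
<?php
// Reject request methods this application does not actually use.
$allowed = array('GET', 'POST');
$method  = isset($_SERVER['REQUEST_METHOD']) ? $_SERVER['REQUEST_METHOD'] : '';

if (!in_array($method, $allowed, true)) {
    header('Allow: ' . implode(', ', $allowed));
    header('HTTP/1.1 405 Method Not Allowed');
    exit;
}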
In most cases, the primary point is not to make it difficult to execute specific PHP patterns in your environment; the point is to prevent the inclusion of rogue code (local or remote) so it never becomes an issue. If you allow the installation of modules or plugins into your CMS, sloppy programmers WILL eventually create exploits. You cannot do much about that beyond enforcing fairly stringent API parameters that make it hard to do things poorly, but it can never be made impossible. Never underestimate the capacity of an offshore hack shop or a self-proclaimed "php ninja" to diligently work with your system in the most insecure, non-compliant way possible, create massive vulnerabilities, and invent any number of roundabout hacks that are actually harder to pull off than just doing it the right way.
/security rant.
To disable functions, mainly for security reasons, you can use the disable_functions directive in your php.ini configuration file.
But, as the documentation states: "This directive must be set in php.ini. For example, you cannot set this in httpd.conf."
I suppose this is too "internal" to be configurable anywhere other than in php.ini... and as it's security related, it's up to the system administrator to configure it.
Still, the best security measure is to write clean/secure code, filter all input, escape all output... and not let anyone run their own code on your server!
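While disable_functions cannot be changed with ini_set() (the directive is only read from php.ini at startup), a framework can at least detect at runtime what the administrator has disabled and adapt to it. A minimal sketch:
<?php
// Detect which functions the administrator has disabled.
$disabled = array_filter(array_map('trim', explode(',', (string) ini_get('disable_functions'))));

foreach (array('exec', 'shell_exec', 'system', 'passthru', 'popen') as $fn) {
    // function_exists() also returns false for functions listed in disable_functions.
    $blocked = in_array($fn, $disabled, true) || !function_exists($fn);
    printf("%s: %s\n", $fn, $blocked ? 'disabled' : 'available');
}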
In short: you can't do that.
But I think you don't really understand how eval and those functions are exploited. The problem occurs when programmers do not properly sanitize the ARGUMENTS that are passed to them.
The PHP Shell Offender script you mentioned is just a simple PHP script that passes arguments to those functions; the attackers already had a way of injecting/uploading the malicious script. If you are not using these functions at all, or are not passing them arguments derived from user input, attackers can't run arbitrary code on your server using eval() or its relatives.
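An illustrative contrast (the query parameter name is an assumption, and the vulnerable line is commented out on purpose):
<?php
// Vulnerable: attacker-controlled input would be evaluated as PHP code,
// e.g. ?code=system('id'); would give the attacker a shell.
// eval($_GET['code']);   // never do this

// Not exploitable via user input: the evaluated string is built entirely from trusted data.
$settings = array('debug' => true);
$config   = eval('return ' . var_export($settings, true) . ';');
var_dump($config);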
Are you going to be hosting this framework for your users? If you are allowing users to upload and run code, then you have a bigger problem on your hands.
Read about remote code execution and remote/local file inclusion here to learn more about attacks related to this: Common PHP vulnerabilities
The disable_functions directive is only available in the php.ini configuration.
Disabling functions at runtime wouldn't make much sense anyway: if the disabled-function list could be modified at runtime, it could just as easily be modified to re-enable functions.
You can disable eval through https://github.com/mk-j/PHP_diseval_extension, which gets around the issue of Suhosin not being PHP 7 compatible/stable.
Add this line to your php.ini (search for 'disable_functions'):
disable_functions = exec,passthru,shell_exec,system,proc_open,popen,curl_exec,curl_multi_exec,parse_ini_file,show_source
Then restart your PHP service (Apache or PHP-FPM).
Related
I have users upload PHP files to my server (I know this is a security risk, but it must be done).
I might have to execute the PHP scripts on the server.
So I was wondering, is there a way I can deny those PHP scripts access to any files and any directories outside of their current folder? This would make it secure enough for me to use.
Thanks.
Sigh... "but it must be done" - says whom?
Some options might exist: Looking at the comments on the documentation for the (now-removed) PHP Safe Mode, I found a link to suPHP which "is a tool for executing PHP scripts with the permissions of their owners". This would require local UNIX accounts for each user though - which I'm not sure is possible in your situation.
A real solution would need to go much deeper. I was once on a website that allowed you to compile and run applications in just about any language, as part of an exam. By compiling some "interesting" programs, I was able to determine that I was actually running in a QEMU VM "jail", and they were somehow funneling IO to/from the VM via my HTTP connection.
But the right answer is probably, of course, don't do it. With more information as to what exactly you're designing, we might be able to offer more sane alternatives.
You could set up a chrooted environment for these scripts to run in. Not absolutely waterproof, but a lot better than potentially giving access to your entire filesystem.
This article contains lots of info on how to get certain services running correctly inside your chrooted environment, it also contains a link to a best practices document concerning the correct usage of chroot.
Now, PHP also has a chroot() function; it might be possible to put together some kind of "sandbox" that is "good enough" for your purposes by using it.
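A minimal sketch under heavy assumptions (the CLI SAPI running as root, a jail directory at /var/sandbox prepared in advance, and a hypothetical /untrusted.php inside it):
<?php
// chroot() is only available in the CLI/CGI/embed SAPIs and requires root privileges.
$jail = '/var/sandbox';

if (!function_exists('chroot') || !chroot($jail)) {
    exit("Could not chroot to $jail\n");
}
chdir('/');

// Drop root privileges after entering the jail (65534 is commonly "nobody").
posix_setgid(65534);
posix_setuid(65534);

// From here on, only files inside the jail are reachable.
include '/untrusted.php';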
Anyway, although chroot can help tremendously to protect your system during the execution of foreign code, you should remain very careful. The basic rule is to provide as few services and facilities inside the chrooted environment as possible. In that context, the SO answer pointed to by Emilio Gort contains a (very long) list of exploitable functions; probably most or all of these should be blocked using the disable_functions setting in php.ini.
While I do know that system calls and security don't go hand in hand, there is a project for which I do need it. I'm writing a small code checker and I need to compile and execute the user submitted code to test against my test cases.
Basically I want to run the code in a sandbox, so that it can't touch any files outside of the temporary directory and any files that it creates can't be accessed by the outside world.
Recently I came across an exploit with which a user could create a file, say shell.php, with the following contents:
<?php
echo system($_GET['x']);
?>
This gives the attacker a remote shell, and since the owner of the file is apache, the attacker can basically move around my entire /var/www, where MySQL passwords are stored along with other configuration information.
While I am aware of threats like SQL injection and have sanitized the user input before any operations that involve the DB, I have no idea how to set up the sandbox. What techniques can I use to disable system calls (right now I'm searching for the word 'system' in the user-submitted code and not executing those snippets where it is found) and restrict access to the files that the user-submitted code creates?
As of now my code checker only works for C and I plan to add support for other languages like C++, Java, Ruby and Python after I can secure it. Also I'd like to learn more about this problem that I've encountered so pointers to a place where I could learn more about web security would also be appreciated.
My development machine is running Mac OS X Lion and the deployment machine is a Linux server, so a cross-platform solution would be most appreciated, but one that deals with just the Linux machine would do too.
What you will probably want to do is set up a chroot to some temp directory on your filesystem for the user running your scripts. Here is some reading on setting up a chroot, and some security things to know.
I would suggest you also install a security module such as suExec or MPM-iTK for Apache. Then, within your Apache's VirtualHost (if you are not running a virtual host, do so!), assign a specific UserID to handle requests for this specific VirtualHost. This separates the request from the Apache default user, and adds a little security.
AssignUserID nonprivilegeduser nonprivilegeduser
Then, harden PHP a little by setting the following PHP options so the user cannot access files outside of the specified directories, and move your tmp_dir and session_save_path within this directory. This will prevent the user from accessing anything outside of their base directory.
php_admin_value open_basedir /var/www/
php_admin_value upload_tmp_dir /var/www/tmp
php_admin_value session.save_path /var/www/tmp
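A quick sanity check from PHP, assuming the open_basedir value above:
<?php
// With open_basedir restricted to /var/www/, reads outside that tree fail.
var_dump(ini_get('open_basedir'));                   // "/var/www/"
var_dump(@file_get_contents('/etc/passwd'));         // bool(false) - blocked by open_basedir
var_dump(@file_get_contents('/var/www/index.php'));  // the file contents, if the file exists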
Along the same lines, prevent access to specific PHP functions and classes, and read up on PHP's security documentation.
Also, for that user, I would look into disabling access to sudo and su, to prevent a script from attempting to gain root privileges. Learn more here.
All in all, you said it nicely and clearly: there is no way to fully prevent a determined user from accessing your system. The trick is to make it as difficult and as confusing as possible for them.
There is no way to make this work on a cross-platform basis, period. Sandboxing is inherently highly system-specific.
On Mac OS X, there is the Sandbox facility. It is poorly documented, but quite effective (Google Chrome relies heavily on it). Some enterprising souls have documented parts of it. However, it's only available on Mac OS X, so that probably rules it out.
On Linux, your options are considerably less developed. Some kernels support the seccomp mechanism to prevent processes from using any except a small "safe" set of system calls; however, not all do. Moreover, that "safe" subset doesn't include some calls that you are likely to need in code that hasn't been specifically written to run under seccomp -- for instance, mmap and sbrk are not permitted, so you can't allocate memory. Helper tools like seccomp-nurse may get you somewhere, though.
Here are the things I suggest doing.
1. Disable system classes and functions in php.ini with:
disable_functions="system,curl_init,fopen..."
disable_classes="DirectoryIterator,SplFileObject..."
2. Run in a read-only environment with no important data stored on it. In case anyone ever gets into your server, you don't want them to access anything. A good way to do this is to buy an Amazon AWS EC2 instance and use a jailed user to run your server and PHP.
3. Ask people to break it. The only way to find flaws and loopholes you are unaware of is to have someone find them. If necessary, get a temporary server with a "test" application that replicates the same type of application your "production" environment will run.
Here are some helpful resources.
List of functions to disable. It can definitely be expanded upon but it's a good start.
Information on how to avoid security issues.
The source and README in Viper-7's Codepad
I think what you are looking for is Mandatory Access Control. In Linux, it is available via SELinux. Using it, you can limit who can execute which commands. In your case, you can limit the PHP user (Apache) to executing only a limited set of commands such as gcc. Also look into AppArmor.
Also, look into the runkit PHP virtual environment.
You can try to run user-submitted code in a container (Docker); containers are very lightweight compared to VMs and start in less than a second.
I've got a sticky question in my mind: safe_mode has been removed in PHP 5.4, so what happens to the security it provided?
Does it mean that any application can execute any program?
What technique should be used instead to prevent such actions?
This article explains why safe_mode never made a single bit of sense and only provides a false sense of security.
safe_mode was trying to solve a security problem with the wrong tool. Since shared webhosts often host thousands of websites on one server, safe_mode was a convenient (and entirely inappropriate) method to restrict the damage one could do with PHP.
It was an illusion more than anything else. Though PHP may have been protected with safe_mode, what about other languages like Python and Ruby? The proper method is to use standard Linux file permissions and modules like suPHP, which run PHP as restricted users.
I've been testing security for some PHP scripts and have found that, among other things, Suhosin strips away a posted variable that is too large... this is fine and desirable, but I'd like my script to be able to tell that Suhosin changed the request.
Does Suhosin leave any fingerprints to indicate that some action was taken, in a way that the script can detect? I'm guessing it can't trigger something like an E_USER_WARNING, because that would be thrown before the script is running and could catch it. Maybe an environment variable or a special global variable?
I tried a few approaches myself, but didn't see anything... perhaps Suhosin needs to be configured to do this? I find the Suhosin documentation to be, um, difficult to understand.
Yes it does, not fingerprinting, but logging: Suhosin Logging Configuration.
Suhosin's input filter is designed to transparently filter out potentially dangerous payloads, e.g. overly large requests. If a script were able to detect this filter and change its program flow based on that information, it would be much easier for an attacker to circumvent the filter.
As a recommendation, filter limits should be set as strictly as possible, but as broadly as necessary. Your script is supposed to run without being able to detect Suhosin's presence.
I noticed that sometimes (especially where mod_rewrite is not available) this path scheme is used:
http://host/path/index.php/clean_url_here
--------------------------^
This seems to work, at least in Apache, where index.php is called and one can query the /clean_url_here part via $_SERVER['PATH_INFO']. PHP even kind of advertises this feature. Also, for example, the CodeIgniter framework uses this technique by default for its URLs.
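For illustration, a request to http://host/path/index.php/clean_url_here would give index.php something like this (assuming the server passes the extra path segment through, e.g. AcceptPathInfo On in Apache):
<?php
// index.php
echo isset($_SERVER['PATH_INFO'])
    ? $_SERVER['PATH_INFO']       // "/clean_url_here"
    : '(no PATH_INFO available)';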
The question: How reliable is the technique? Are there situations, where Apache doesn't call index.php but tries to resolve the path? What about lighttpd, nginx, IIS, AOLServer?
A ServerFault question? I think it's got more to do with using this feature inside PHP code. Therefore I ask here.
Addendum: As suggested by VolkerK, a reasonable extension to this question is: How can a programmer influence the existence of $_SERVER['PATH_INFO'] on various server types?
I think this is a question equally suited to Stack Overflow and Server Fault. As a developer, I can only tell you that PATH_INFO is as trustworthy as any other user input (which means it can contain virtually anything), and your script may or may not receive it depending on the web server version and configuration:
Apache: AcceptPathInfo
IIS: e.g. AllowPathInfoForScriptMappings and others
and so on and on...
But server admins can probably tell you which settings you can expect "in the real world" and why those settings are preferred.
So the question becomes: how much influence do you (or your expected user base) have on the server configuration?
AcceptPathInfo needs to be enabled in order for this to work.
From my experience, I'd say PATH_INFO is usually available in normal web hosting environments and server setups - even on IIS - but on rare occasions it is not. When building an application that is supposed to be deployable on as many platforms as possible, I would not hard-code a reliance on PATH_INFO.
Whenever I can, I try to build a wrapper function build_url() that, depending on a configuration setting, uses either
the raw URL www.example.com/index.php?clean_url=clean_url_here
the path_info mechanism www.example.com/index.php/clean_url
mod_rewrite www.example.com/clean_url
and use that in all URLs the application emits.
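A minimal sketch of such a wrapper; the configuration constant and mode names are assumptions:
<?php
// One place to decide which URL scheme the application emits.
define('URL_MODE', 'path_info'); // hypothetical setting: 'query', 'path_info' or 'rewrite'

function build_url($clean_url)
{
    $base = 'http://www.example.com';
    switch (URL_MODE) {
        case 'query':
            return $base . '/index.php?clean_url=' . urlencode($clean_url);
        case 'path_info':
            return $base . '/index.php/' . $clean_url;
        case 'rewrite':
        default:
            return $base . '/' . $clean_url;
    }
}

echo build_url('clean_url_here'); // http://www.example.com/index.php/clean_url_here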
There might be naive scripts (auto-linking, for example) that do not recognize this URL format, thereby decreasing the chance that links to your content will be created.
Since home-grown regular expression patterns are common for these tasks, the chance of failure is quite real.
Technically, those URLs are fine. SEO-wise, they are 'less perfect'.