Disable PHP execution in one directory - php

To prevent PHP local file inclusion attacks, I want to completely disable the execution of all PHP files in one directory. Using the line php_flag engine off in the .htaccess file causes a 500 error. According to another question, this is due to the way PHP is installed.
Is there any other way to prevent PHP execution if the PHP installation cannot be altered?
Update: The files don't necessarily have the .php extension.

Add this to your .htaccess file
<FilesMatch \.php$>
SetHandler None
</FilesMatch>
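Since the update says the files don't necessarily end in .php, a per-directory variant may help: a .htaccess placed in the directory itself that removes the PHP handler for every file in it. This is a sketch; which handler and type names need removing depends on how PHP is wired into your Apache:

```apache
# .htaccess inside the directory to neutralize — applies to ALL files in it
SetHandler none
RemoveHandler .php .phtml .php5
RemoveType .php .phtml .php5
```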

You're building your site with "allow all, then deny" logic. You should build it with "deny all, then allow" logic. For example, you're telling Apache to serve all files in a particular directory and then overriding that config to tell Apache not to serve some files in that directory. I.e., you probably have something like this:
<VirtualHost *:80>
ServerName foo.com
DocumentRoot "/path/to/files"
</VirtualHost>
With a directory layout like this:
/path/to/files
index.php
config.php
/path/to/files/lib
db.php
etc.php
other_thing.php
With this setup, anybody can request http://foo.com/config.php or http://foo.com/lib/etc.php directly, which is what you're trying to prevent. Rather than adding individual exceptions for everything you want to deny, start the other way around. I.e., if you have files that you don't want to be served, then don't put them in the document root.
<VirtualHost *:80>
ServerName foo.com
DocumentRoot "/path/to/files/public"
</VirtualHost>
Note the DocumentRoot is now set to a subdirectory within your project. Put only your public assets in this directory. All other files go outside (i.e., above) public, and thus cannot be served by Apache, while PHP can still include them.
/path/to/files
config.php
/path/to/files/lib
db.php
etc.php
other_thing.php
/path/to/files/public
index.php
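With this layout, public/index.php can still reach everything above it through the filesystem; only the URL path is gone. A minimal runnable sketch (the paths follow the example tree above):

```php
<?php
// public/index.php — the only PHP file Apache serves directly.
// __DIR__ is .../public, so dirname(__DIR__) is the project root.
$projectRoot = dirname(__DIR__);
$configPath  = $projectRoot . '/config.php';   // lives outside the DocumentRoot
echo basename($configPath), "\n";
// In a real project: require $configPath;
```

The file has no URL at all, so there is nothing to block; the filesystem path is the only way in.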

To block direct web access to PHP files, you need to create a .htaccess file and upload it to your site's desired directories.
Create a blank file named .htaccess and paste the following code inside it.
<Files *.php>
deny from all
</Files>

Add this directly below the opening <?php tag; it prevents direct execution of the script:
defined('BASEPATH') OR exit('No direct script access allowed');
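Both halves of that pattern, side by side in one runnable sketch (BASEPATH is the constant CodeIgniter uses; any constant name works):

```php
<?php
// index.php defines the flag before including anything:
define('BASEPATH', __DIR__);

// Every include-only file then starts with the guard. When such a file is
// requested directly, BASEPATH is undefined and the script exits.
defined('BASEPATH') OR exit('No direct script access allowed');
echo "reached only via the front controller\n";
```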


Execute a script, but not let users access it [duplicate]

This question already has answers here (closed 11 years ago).
Possible Duplicate:
How to hinder PHP files to Global Access
For a project I'm working on, I use require("xxxx.php"); to include certain parts of my website.
For example, I have header.php and my site just uses a simple require("header.php") to display to the world.
Now, how do I make it so the page includes header.php's content, but the user can't access it directly via
http://mywebsite.com/header.php
Put the file somewhere outside the DocumentRoot of the webserver.
Set up the webserver so that:
You have a public directory, where your accessible files reside — say static media plus index.php and so on
You have a resources directory that is outside your public folder
The webserver serves from the public directory
Include like this:
require("../private/header.php");
You could define a constant in your index file and then check in header.php that the constant is defined.
// check whether header.php is being accessed directly;
// index.php must run define('MY_SECRET_CONSTANT', true); before the include
if (!defined('MY_SECRET_CONSTANT')) {
exit();
}
You can work with .htaccess to ensure that specific files are not served; the same can be achieved through a VirtualHost entry.
A best practice is to create a /web folder which is "the root" of your application and store everything there that should be accessible. Other folders go elsewhere and are included by the scripts that are designed to be accessed by the user.
Your structure could look like this:
/var/www/mySite/inc/header.php
/var/www/mySite/web/index.php (including ../inc/header.php)
/var/www/mySite/web/css/style.css
Your apache virtual host would look like this:
<VirtualHost *:80>
ServerName www.mysite.com
# Basic stuff
DocumentRoot "/var/www/mySite/web"
DirectoryIndex index.php
<Directory "/var/www/mySite/web">
AllowOverride All
Allow from All
</Directory>
</VirtualHost>
Now the inc folder cannot be accessed through your domain, as your web folder is the root for everyone coming in via the URL. Your scripts, on the other hand, can of course reach outside the web root and include files from anywhere.
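One note on that vhost: Allow from All is Apache 2.2 syntax, and on Apache 2.4 it only works with mod_access_compat loaded. A sketch of the 2.4-native equivalent of the <Directory> block, same paths as above:

```apache
<Directory "/var/www/mySite/web">
    AllowOverride All
    Require all granted
</Directory>
```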
As you can see from the answers: there's a gazillion ways to do this, here's two pretty simple ones that do not require you to change your directory structure:
In index.php and other files that include header.php:
define('INCLUDED', true);
require 'header.php';
In header.php:
if(!defined('INCLUDED')){
die('Forbidden'); // or you could redirect to home ... whatever you want :)
}
Alternatively, you can forbid access via .htaccess. Thus, you don't even have to touch your code:
<Files header.php>
order allow,deny
deny from all
</Files>
If your includes are all in one directory, you can simply deny access to that directory.
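For that last option, the per-directory deny needs nothing more than a two-line .htaccess inside the includes directory (Apache 2.2 syntax, matching the snippet above):

```apache
# .htaccess inside the includes directory — blocks every HTTP request to it
order allow,deny
deny from all
```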

Allowing only localhost to access a folder where all inclusive php files exist

We all have PHP files like 'connect_db.php' that are meant for include purposes only.
Suppose I have all those inclusive .php files in "www/html/INC"
And I have index.php
I want index.php accessible from the browser to everyone, but I want to prevent users' direct access to the "www/html/INC" folder (e.g. typing 'www.domain.com/INC/' into the browser should give a 404 error, etc.)
How do I achieve this?
Preferably using a .htaccess file in the root directory, please.
Something like
<Directory "/www/html/INC">
Order deny,allow
Deny from all
Allow from 127.0.0.1
</Directory>
should work. Note that <Directory> blocks are only allowed in the server config (httpd.conf or a vhost), not in .htaccess; to do it from .htaccess instead, put the three Order/Deny/Allow lines in a .htaccess file inside INC.
How do I achieve this?
Don't. As per my previous answer, it's much more secure to put connect_db.php in /www/, not in /www/html/INC/. If you are going to keep it under the document root anyway, put a .htaccess file in /www/html/INC/ containing:
order allow,deny
deny from all
Google searches have brought me here, so I figured I'd post what I found in the Apache docs today. I'm not positive which versions of Apache it is available in, so check the docs for your version.
Now you can just use Require local. I'd recommend putting a .htaccess file containing just that line in the folder you want to restrict. However, if you must do it from the server config, then here's what it would be:
<Directory "/www/html/INC">
Require local
</Directory>
As of Apache 2.4, Require is the way to go.
E.g. the following denies access to the /www/html/INC directory to anyone except localhost:
<Directory "/www/html/INC">
Require ip 127.0.0.1
</Directory>
Move connect_db.php higher up the directory tree, above the public directory, and do the same with all scripts that should not be directly executable.
/home/user/project/incs/ -- your include-only scripts go here
/home/user/project/www/html -- your index.php and other executable scripts go here

.htaccess: Disallow PHP files to access anything outside their directory

Using .htaccess, how would I be able to disallow PHP files from accessing anything outside their directory, without using open_basedir?
Basically I'm going to generate .htaccess files into some dynamically created sub directories that cannot interact outside of themselves.
EDIT: Sorry I meant accessing, not moving.
.htaccess files are for Apache, not PHP.
What you want to do sounds more like the job of a VirtualHost.
PHP doesn't even know your .htaccess files exist, so they can't stop a script from accessing any file.
Something like this?
<FilesMatch "^php5?\.(ini|cgi)$">
Order Deny,Allow
Deny from All
# allow only the host your PHP runs on, e.g.:
Allow from 127.0.0.1
</FilesMatch>

How can I add one line into all php files' beginning?

So, OK. I have many PHP files and one index.php file. None of the files work without index.php, because I include them from index.php. For example, if somebody clicks Contact us, the URL becomes something like index.php?id=contact and I use $_GET['id'] to include the contacts.php file. But if somebody finds a file's path, for example /system/files/contacts.php, I don't want that file to be executed. So I figured I could add a line like $check_hacker = 1 before including any file in index.php, and add an if at the beginning of every file, like if($check_hacker <> 1) die();. How can I do this without opening all the files and adding the line to each of them? Is it possible? Because I actually have many .php files. Or maybe there is another way to disable viewing a file separately? Any ideas? Thank you.
You could put your index.php alone in your web directory and put all the files it includes in another, non-web directory.
Let's say your website http://www.example.com/index.php is in fact /path/to/your/home/www/index.php; you can put contact.php in /path/to/your/home/includes/contact.php. No .htaccess, rewrites, or auto-prepending. Just a good file structure and a server configured accordingly.
Edit, to detail my comment about using XAMPP:
In your httpd.conf file, add something like this:
<Directory "/path/to/your/site/root">
Options Indexes FollowSymLinks
AllowOverride all
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Directory>
<VirtualHost *:80>
DocumentRoot /path/to/your/site/root
ServerName www.example.com
</VirtualHost>
Then in your Windows hosts file (in C:\Windows\System32\drivers\etc), add this line:
127.0.0.1 www.example.com
I would highly recommend using the .htaccess file to reject all requests for files other than index.php, but I am not quite sure how to do that properly.
This might work (I can't test it now), but it will also block requests to css, js and so on:
order deny,allow
<FilesMatch "\.php">
deny from all
</FilesMatch>
<FilesMatch "(index.php)">
allow from all
</FilesMatch>
If someone knows the right solution, please edit my answer.
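On Apache 2.4, the same deny-everything-except-index.php idea could be written with Require instead of Order/Allow. A sketch, untested against the asker's setup:

```apache
<FilesMatch "\.php$">
    Require all denied
</FilesMatch>
<Files "index.php">
    Require all granted
</Files>
```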
You might check this question: Deny direct access to all .php files except index.php
So you might have a FilesMatch only for php files in addition to the index.php rule.
EDIT: The new version of the code seems to work.
In response to Kau-Boy:
Place all your PHP files (except index.php) in a new directory and put a .htaccess file there with the following contents:
deny from all
Make sure you don't put any images/css/jscript resources in this directory, because they will be blocked as well.
I'd use mod_rewrite in this case (if you are using Apache). It's a much cleaner solution than writing gazillions of useless ifs in PHP.
This way, if someone wanted to "hack it" and tried /system/files/contacts.php, it'd redirect them to index.php?id=contact or whatever other page.
In your php.ini or in your .htaccess, set the following variable:
auto_prepend_file="[path to some .php file]"
This will prepend a header file of your choice to every PHP script on the system.
The php.ini directive auto_append_file does the same at the end, creating a footer included after every PHP file.
Check out the technique at http://www.electrictoolbox.com/php-automatically-append-prepend/
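The prepended file could then hold the direct-access check once, instead of pasting it into every script. A sketch (guard.php and the single-entry-point rule are assumptions, not part of auto_prepend_file itself):

```php
<?php
// guard.php — loaded before every request via auto_prepend_file.
// Policy sketch: only index.php may be the directly requested script.
function allow_direct_access(string $script): bool {
    return basename($script) === 'index.php';
}

// In production:
// if (!allow_direct_access($_SERVER['SCRIPT_NAME'])) exit('Forbidden');
var_dump(allow_direct_access('/var/www/index.php'));
var_dump(allow_direct_access('/var/www/system/files/contacts.php'));
```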
RewriteEngine On
RewriteCond %{REQUEST_URI} system.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
This will redirect any attempt to reach the system folder back to the root!

Block direct access to a file over http but allow php script access

I'm loading my files (pdf, doc, flv, etc) into a buffer and serving them to my users with a script. I need my script to be able to access the file but not allow direct access to it. Whats the best way to achieve this? Should I be doing something with my permissions or locking out the directory with .htaccess?
The safest way is to put the files you want kept to yourself outside of the web root directory, like Damien suggested. This works because the web server follows local file system privileges, not its own privileges.
However, there are a lot of hosting companies that only give you access to the web root. To still prevent HTTP requests to the files, put them into a directory by themselves with a .htaccess file that blocks all communication. For example,
Order deny,allow
Deny from all
Your web server, and therefore your server side language, will still be able to read them because the directory's local permissions allow the web server to read and execute the files.
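The serving script mentioned above can then map a user-supplied name onto that protected directory. A sketch (the storage path and file names are assumptions); basename() strips directory components, defeating ../ traversal:

```php
<?php
// download.php sketch — resolve a requested name inside the private directory.
function safe_path(string $storage, string $requested): string {
    return $storage . '/' . basename($requested);   // basename() drops any ../
}

echo safe_path('/home/user/private', '../../etc/passwd'), "\n";
// A real script would then check is_file() and stream the file with readfile(),
// after sending Content-Type and Content-Length headers.
```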
This is how I prevented direct URL access to my .ini files. Paste the following code into the .htaccess file in the root (no need to create an extra folder).
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
My settings.ini file is in the root, and without this code it is accessible at www.mydomain.com/settings.ini
Put this in httpd.conf to block browser and wget access to include files, especially ones like db.inc or config.inc. Note that you cannot chain file types in the directive; instead, create multiple Files directives.
<Files ~ "\.inc$">
Order allow,deny
Deny from all
</Files>
To test your config before restarting Apache:
service httpd configtest
Then do a graceful restart:
service httpd graceful
Are the files on the same server as the PHP script? If so, just keep the files out of the web root and make sure your PHP script has read permissions for wherever they're stored.
If you have access to your httpd.conf file (in Ubuntu it is in the /etc/apache2 directory), you should add the same lines that you would add to the .htaccess file in the specific directory. That is (for example):
ServerName YOURSERVERNAMEHERE
<Directory /var/www/>
AllowOverride None
order deny,allow
Options -Indexes +FollowSymLinks
</Directory>
Do this for every directory whose access you want to control, and you will have one file in one spot to manage all access. In the example above, I did it for the root directory, /var/www.
This option may not be available with outsourced hosting, especially shared hosting. But it is a better option than adding many .htaccess files.
To prevent web access to .ini files, put the following into apache2.conf:
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
How about a module-conditional .htaccess snippet (like the one CodeIgniter uses)? I tried it and it worked well in CodeIgniter apps. Any ideas on using it in other apps?
<IfModule authz_core_module>
Require all denied
</IfModule>
<IfModule !authz_core_module>
Deny from all
</IfModule>
