I'm making a website which allows people to upload files, html pages, etc... Now I'm having a problem. I have a directory structure like this:
-/USERS
    -/DEMO1
    -/DEMO2
    -/DEMO3
    -/etc... (every user has his own directory here)
-index.php
-control_panel.php
-.htaccess
Now I want to disable PHP, but enable server-side includes, in the directories and subdirectories inside /USERS.
Can this be done (and if so, how)?
I use WAMP server
Try to disable the engine option in your .htaccess file:
php_flag engine off
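For the layout in the question, a minimal .htaccess sketch for the /USERS directory that combines this with server-side includes (assuming mod_include is loaded and AllowOverride permits Options and FileInfo):
# .htaccess in /USERS -- applies to every user subdirectory below it
php_flag engine off
# enable server-side includes for .shtml files
Options +Includes
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml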
To disable all access to the subdirectories (safest), use:
<Directory full-path-to/USERS>
Order Deny,Allow
Deny from All
</Directory>
If you want to block only PHP files from being served directly, then do:
1 - Make sure you know what file extensions the server recognizes as PHP (and don't allow people to override this in .htaccess). One of my servers is set to:
# Example of existing recognized extensions:
AddType application/x-httpd-php .php .phtml .php3
2 - Based on those extensions, add a regular expression to FilesMatch (or LocationMatch):
<Directory full-path-to/USERS>
<FilesMatch "(?i)\.(php|php3?|phtml)$">
Order Deny,Allow
Deny from All
</FilesMatch>
</Directory>
Or use LocationMatch to match PHP files (I prefer the files approach above):
<LocationMatch "/USERS/.*(?i)\.(php3?|phtml)$">
Order Deny,Allow
Deny from All
</LocationMatch>
If you're using mod_php, you could put (either in a .htaccess in /USERS or in your httpd.conf for the USERS directory)
RemoveHandler .php
or
RemoveType .php
(depending on whether PHP is enabled using AddHandler or AddType)
PHP files run from another directory will still be able to include files in /USERS (assuming there is no open_basedir restriction), because an include does not go through Apache. If a PHP file in /USERS is accessed through Apache, it will be served as plain text.
Edit
Lance Rushing's solution of just denying access to the files is probably better
<Directory /your/directorypath/>
php_admin_value engine Off
</Directory>
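Note that php_admin_value can only appear in the main server configuration or a virtual host (not in .htaccess); a sketch of where such a block would live, with placeholder paths:
<VirtualHost *:80>
DocumentRoot /var/www/example
<Directory /var/www/example/USERS>
php_admin_value engine Off
</Directory>
</VirtualHost>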
This will display the source code instead of executing it:
<VirtualHost *>
ServerName sourcecode.testserver.me
DocumentRoot /var/www/example
AddType text/plain php
</VirtualHost>
I used it once to give another co-worker read access to the source code from the local network (just a quick and dirty alternative).
WARNING:
As Dan pointed out some time ago, this method should never be used in production. Please follow the accepted answer, as it blocks any attempt to execute or display PHP files.
If you want users to share PHP files (and let others view the source code), there are better ways to do it, like git, a wiki, etc.
This method should be avoided! (You have been warned; I left it here for educational purposes.)
None of these answers worked for me (they either generated a 500 error or did nothing). That is probably because I'm working on a hosted server where I don't have access to the Apache configuration.
But this worked for me:
RewriteRule ^.*\.php$ - [F,L]
This line will generate a 403 Forbidden error for any URL that ends with .php and falls under this subdirectory.
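For the rule to take effect in a .htaccess, rewriting has to be switched on; a minimal sketch of the whole file (assuming mod_rewrite is available and overrides are allowed):
RewriteEngine On
RewriteRule ^.*\.php$ - [F,L]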
@Oussama led me in the right direction here, thanks to him.
If you use php-fpm, php_admin_value will NOT work and gives an Internal Server Error.
Instead, use this in your .htaccess. It disables the parser in that folder and all subfolders:
<FilesMatch ".+\.*$">
SetHandler !
</FilesMatch>
This might be overkill - but be careful doing anything which relies on the extension of PHP files being .php: what if someone comes along later and adds handlers for .php4, or even .html, so they're handled by PHP? You might be better off serving files out of those directories from a different instance of Apache, or something similar that only serves static content.
In production I prefer to redirect requests for .php files under the directories where PHP processing should be disabled to the home page or to a 404 page. This won't reveal any source code (why should search engines index uploaded malicious code?) and it looks friendlier to visitors, and even to the evil hackers trying to exploit the stuff.
Also, it can be implemented in almost any context - vhost or .htaccess.
Something like this:
<DirectoryMatch "^${docroot}/(image|cache|upload)/">
<FilesMatch "\.php$">
# use one of the redirections
#RedirectMatch temp "(.*)" "http://${servername}/404/"
RedirectMatch temp "(.*)" "http://${servername}"
</FilesMatch>
</DirectoryMatch>
Adjust the directives as you need.
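As a .htaccess-context variant (my addition, since <DirectoryMatch> itself is not allowed in .htaccess), the same idea can be dropped directly into the upload or cache directory, with www.example.com standing in for the real host:
# .htaccess in the upload/cache directory itself
RedirectMatch temp "\.php$" "http://www.example.com/"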
On CentOS 6.10, I use this for multiple folders in the virtual host .conf definition file:
<DirectoryMatch ^/var/www/mysite/htdocs/(nophpexecutefolder1|nophpexecutefolder2)>
php_admin_value engine Off
</DirectoryMatch>
However, even though it no longer parses the PHP code the usual way, requesting a .php file still outputs things from it such as variable declarations and echoed text, e.g.:
<?php
echo "<strong>PHP CODE EXECUTED!!";
$a=1;
$b=2;
echo $a+$b;
The above produces this in the web browser:
PHP CODE EXECUTED!!"; $a=1; $b=2; echo $a+$b;
This could potentially expose some code to users which isn't ideal.
Therefore, it's probably best to use the above in combination with the following in .htaccess:
<FilesMatch ".*.(php|php3|php4|php5|php6|php7|php8|phps|pl|py|pyc|pyo|jsp|asp|htm|html|shtml|phtml|sh|cgi)$">
Order Deny,Allow
Deny from all
#IPs to allow access to the above extensions in current folder
# Allow from XXX.XXX.XXX.XXX/32 XXX.XXX.XXX.XXX/32
</FilesMatch>
The above will prevent access to any of the above file extensions, but will allow other extensions such as images, CSS etc. to be accessed the usual way. The error when accessing a .php file:
Forbidden
You don't have permission to access /nophpexecutefolder1/somefile.php on this server.
<Files *.php>
Order deny,Allow
Deny from all
</Files>
Related
What can I do to stop Apache executing code in files that have .php before the final extension, e.g. .php.txt or .php.pdf? I do not know if this is related to the Webuzo admin panel and/or Apache in general.
Apache version 2.2.34
I opened a thread on the Webuzo forum; if anyone else has this issue, it might be related:
https://www.softaculous.com/board/index.php?tid=17642
This is reasonably standard behaviour - files can have multiple extensions on Apache. (As they can on other OS / filesystems.)
However, this behaviour can be avoided.
Whether files that end in .php.txt or .php.pdf are processed for PHP is dependent on how PHP is enabled on the server.
For example, if you simply use AddHandler then any file that contains a .php extension (like .php.txt) will be processed by the PHP handler:
AddHandler application/x-httpd-php .php
However, if you only call SetHandler on the specific file pattern, i.e. when .php occurs at the end of the filename, then this behaviour can be avoided:
<FilesMatch "\.php$">
SetHandler application/x-httpd-php
</FilesMatch>
NB: This is not a copy/pastable solution - it really depends on how PHP is implemented on your Apache web server.
Depending on your requirements you could potentially block requests to files that contain a .php extension, but not at the end of the URL-path. For example:
<FilesMatch "\.php\.">
Order Allow,Deny
Deny from all
</FilesMatch>
NB: This is Apache 2.2 syntax (as stated in the question). If you are on Apache 2.4 then you'd use Require all denied instead of the Order and Deny directives in the last block.
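For reference, a sketch of that last block in Apache 2.4 syntax:
<FilesMatch "\.php\.">
Require all denied
</FilesMatch>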
I have a Rackspace Cloud Sites account and I'm trying to enable PHP for a single html file in a specific directory under my root site folder.
I have a .htaccess file in my root folder, and my document root, from $_SERVER['DOCUMENT_ROOT'], is /mnt/stor1-wc2-dfw1/myaccnum/servernum/mydomain.com/web/content (I've obscured a few details but the basic structure is intact).
I then do the following (which I saw on another SO answer):
<Directory "/mnt/stor1-wc2-dfw1/myaccnum/servernum/mydomain.com/web/content/mydir">
<Files "index.html">
AddType application/x-httpd-php .html .htm
</Files>
</Directory>
However, I'm getting a "500 Internal Server Error". If I take away <Directory> and <Files>, everything works, but then it applies to all files.
How can I fix this?!
The 500 error comes from the <Directory> block: <Directory> (and <Location>) sections are not allowed in .htaccess files, so you cannot add types per directory or location that way. However, you can manipulate the engine ini setting of PHP: set it to off globally and enable it only for your index.html.
I would like to find out the most effective way to ban any executable files from one specific sub folder on my server. I allow file uploads by users into that folder, and would like to make that folder accessible from the web. I have the root folder pretty much locked down with mod_rewrite. In that one unprotected sub-folder I have .htaccess with:
Options +Indexes
IndexOptions +FancyIndexing +FoldersFirst +HTMLTable
RewriteEngine off
I know it is best to just restrict file uploads to certain allowable file types, and I am already doing this in PHP. I am checking the file extension and MIME type before allowing an upload, like this:
$allmime = array('image/gif', 'image/png', 'image/jpeg', 'application/msword', 'application/pdf');
$allext = array('png', 'jpg', 'gif', 'doc', 'pdf');
$path = pathinfo($_FILES['file']['name']);
// -b --mime-type returns just the type (e.g. "image/png"), without the charset suffix that -bi appends
$mime = trim(shell_exec("file -b --mime-type " . escapeshellarg($_FILES['file']['tmp_name'])));
if( !in_array( strtolower($path['extension']), $allext) || !in_array($mime, $allmime) ){
//ban
}else{
//allow
}
However I am not certain if there is some convoluted hack out there that will still allow a shell script to be uploaded and executed on the server, since all of the successfully uploaded files will be visible immediately.
I know there is another option in .htaccess to filter out files like this:
<FilesMatch "\.(sh|asp|cgi|php|php3|ph3|php4|ph4|php5|ph5|phtm|phtml)$">
Order Allow,Deny
deny from all
</FilesMatch>
However I am not certain that this list is all-inclusive, plus this is hard to maintain, as new extensions might be installed in the future.
To sum it all up: does anyone know a good way to disallow all server executables, with the exception of PHP scripts executed directly via the %{HTTP_HOST}?
You can do several things to absolutely lock down certain folders to ensure PHP is not able to execute in them, particularly useful if doing a PHP upload script and you don't want the world to be able to execute arbitrary code on your server by exploiting your upload code:
1 - Disable the PHP engine entirely in .htaccess for the folder in question:
php_flag engine off
2 - Force the Content-Disposition header to attachment for files that are not in a finite list of file types you are expecting, for example:
ForceType application/octet-stream
Header set Content-Disposition attachment
<FilesMatch "(?i)\.(gif|jpe?g|png)$">
ForceType none
Header unset Content-Disposition
</FilesMatch>
3 - In your uploader code, prevent the upload of files with any extension that can be executed directly by an Apache module such as PHP.
How about disabling the server-side handlers for that specific directory? Something like:
<Directory /path/to/restrict>
SetHandler None
Options None
AllowOverride None
</Directory>
This is untested, but seems like it might work.
UPDATE: Apparently, I was wrong ... but sticking AddHandler default-handler in an .htaccess does seem to work.
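A sketch of what that line could look like in the restricted folder's .htaccess (the extension list is my assumption; default-handler simply serves matching files as static content):
AddHandler default-handler .php .phtml .php3 .pl .py .cgi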
From Twitter's Bootstrap .htaccess, this works for me (I just added exe):
# Block access to backup and source files.
# These files may be left by some text editors and can pose a great security
# danger when anyone has access to them.
<FilesMatch "(^#.*#|\.(exe|bak|config|dist|fla|inc|ini|log|psd|sh|sql|sw[op])|~)$">
Order allow,deny
Deny from all
Satisfy All
</FilesMatch>
Results in:
Forbidden
You don't have permission to access /test/test.exe on this server.
The best way (IMO) is just to turn off the x bit on the subfolder (the executable permission in Linux). So I would change the permissions to 644 (logged in, you can read and write, but the world can only read). This can be done in cPanel. Make sure to apply that to subfolders as well.
Filtering by the uploaded filename is how malicious users will get bad things onto your server. The $_FILES name and type attributes are user-supplied data, and nothing says a user can't upload a PHP script but call it 'puppies.jpg'.
The proper way to filter is to use something like Fileinfo to check the actual MIME type and filter on that.
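A minimal PHP sketch of that check, assuming the fileinfo extension is enabled (the whitelist is only an example):
<?php
// Check the real MIME type of the uploaded temp file, not the user-supplied $_FILES['file']['type']
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime = $finfo->file($_FILES['file']['tmp_name']);
$allowed = array('image/gif', 'image/png', 'image/jpeg', 'application/pdf');
if (!in_array($mime, $allowed, true)) {
// reject the upload
}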
Deny complete access to the folder in a .htaccess file, and then use a download script to serve the files; that would save a lot of trouble.
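A rough sketch of such a download script (the file name, storage path and lookup logic are my assumptions, not part of the original answer):
<?php
// download.php -- serves files from a directory that direct HTTP access is denied to
$storage = __DIR__ . '/protected_uploads'; // folder whose .htaccess contains "Deny from all"
$name = basename($_GET['file']); // basename() strips any directory components from the request
$path = $storage . '/' . $name;
if (!is_file($path)) {
http_response_code(404);
exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);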
The following rule will forbid .exe (I added .bat) files from being downloaded from your server:
<Directory "/my/files">
Require all granted
RewriteEngine on
RewriteRule "(\.exe|\.bat)" "-" [F]
</Directory>
So, OK. I have many PHP files and one index.php file. None of the files work without index.php, because I include them from index.php. For example, if somebody clicks Contact us, the URL becomes something like index.php?id=contact and I use $_GET['id'] to include the contacts.php file. But if somebody finds a file's path, for example /system/files/contacts.php, I don't want that file to be executed. So I figured out that I can set $check_hacker = 1 before including any file in index.php, and add a check at the beginning of every file like this: if($check_hacker <> 1) die();. So, how can I do it without opening all the files and adding this line to each of them? Is it possible? Because I actually have many .php files. Or maybe there is another way to disable viewing a separate file directly? Any ideas? Thank you.
You could put your index.php alone in your web directory, and put all the files it includes in another, non-web directory.
Let's say your website http://www.example.com/index.php is in fact /path/to/your/home/www/index.php; you can then put contact.php in /path/to/your/home/includes/contact.php. No .htaccess, no rewriting, no auto-prepending. Just a good file structure and a server configured as needed.
Edit to detail my comment about using XAMPP:
In your httpd.conf file, add something like this:
<Directory "/path/to/your/site/root">
Options Indexes FollowSymLinks
AllowOverride all
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Directory>
<VirtualHost *:80>
DocumentRoot /path/to/your/site/root
ServerName www.example.com
</VirtualHost>
Then, in your Windows hosts file (C:\Windows\System32\drivers\etc\hosts), add this line:
127.0.0.1 www.example.com
I would highly recommend using the .htaccess file to reject all requests for files other than index.php, but I am not quite sure how to do that properly.
This might work (can't test it now) but it will also block requests to css, js and so on:
order deny,allow
<FilesMatch "\.php">
deny from all
</FilesMatch>
<FilesMatch "(index.php)">
allow from all
</FilesMatch>
If someone knows the right solution, please edit my answer.
You might check this question: Deny direct access to all .php files except index.php
So you might have a FilesMatch only for php files in addition to the index.php rule.
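For what it's worth, a hedged Apache 2.4 sketch of that combination (my addition, not from the original answers):
<FilesMatch "\.php$">
Require all denied
</FilesMatch>
<Files "index.php">
Require all granted
</Files>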
EDIT: The new version of the code seems to work.
In response to Kau-Boy:
Place all your PHP files (except index.php) in a new directory and put a .htaccess file there with the following contents:
deny from all
Make sure you don't put any images/css/jscript resources in this directory, because they will be blocked as well.
I'd use mod_rewrite in this case (if you are using Apache). It's a much cleaner solution than writing gazillions of useless ifs in PHP.
This way, if someone wanted to "hack it" and tried /system/files/contacts.php, it would redirect them to index.php?id=contact or wherever else you choose.
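A rough .htaccess sketch of that idea (the exact mapping from file name to the id parameter is my assumption):
RewriteEngine On
# send direct requests for /system/files/<name>.php back through the front controller
RewriteRule ^system/files/([a-zA-Z0-9_-]+)\.php$ index.php?id=$1 [L,QSA]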
In your php.ini or in your .htaccess, set the following directive:
auto_prepend_file="[path to some .php file]"
This will prepend a header file of your choice before every PHP script on the system.
The php.ini directive auto_append_file will similarly add a footer that is included at the end of all PHP files on the system.
Check out the technique at http://www.electrictoolbox.com/php-automatically-append-prepend/
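A minimal sketch of what such a prepended file could contain for the original question (the file name and logic are my assumptions):
<?php
// prepend.php -- runs once per request, before the main script, because of auto_prepend_file
// Only index.php may be requested directly; files included from index.php are unaffected,
// since the prepend is not re-run for includes.
if (basename($_SERVER['SCRIPT_FILENAME']) !== 'index.php') {
http_response_code(403);
die('Direct access is not allowed.');
}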
RewriteCond %{REQUEST_URI} system.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
This will redirect any attempt to reach the system folder back to the root!
I'm loading my files (pdf, doc, flv, etc.) into a buffer and serving them to my users with a script. I need my script to be able to access the file but not allow direct access to it. What's the best way to achieve this? Should I be doing something with my permissions or locking out the directory with .htaccess?
The safest way is to put the files you want kept to yourself outside of the web root directory, like Damien suggested. This works because the web server follows local file system privileges, not its own privileges.
However, there are a lot of hosting companies that only give you access to the web root. To still prevent HTTP requests to the files, put them into a directory by themselves with a .htaccess file that blocks all communication. For example,
Order deny,allow
Deny from all
Your web server, and therefore your server side language, will still be able to read them because the directory's local permissions allow the web server to read and execute the files.
This is how I prevented direct URL access to my ini files. Paste the following code into the .htaccess file in the root (no need to create an extra folder):
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
My settings.ini file is in the root, and without this code it is accessible at www.mydomain.com/settings.ini.
Use this in httpd.conf to block browser and wget access to include files, especially, say, db.inc or config.inc. Note you cannot chain file types in the directive; instead, create multiple Files directives.
<Files ~ "\.inc$">
Order allow,deny
Deny from all
</Files>
To test your config before restarting Apache:
service httpd configtest
then do a graceful restart:
service httpd graceful
Are the files on the same server as the PHP script? If so, just keep the files out of the web root and make sure your PHP script has read permissions for wherever they're stored.
If you have access to your httpd.conf file (on Ubuntu it is in the /etc/apache2 directory), you should add the same lines that you would add to the .htaccess file in the specific directory. That is (for example):
ServerName YOURSERVERNAMEHERE
<Directory /var/www/>
AllowOverride None
order deny,allow
Options -Indexes +FollowSymLinks
</Directory>
Do this for every directory whose access you want to control, and you will have one file in one spot to manage all access. In the example above, I did it for the root directory, /var/www.
This option may not be available with outsourced hosting, especially shared hosting. But it is a better option than adding many .htaccess files.
To prevent web access to .ini files, put the following into apache2.conf:
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
How about a module-based .htaccess snippet (like the one used in CodeIgniter)? I tried it and it worked well in CodeIgniter apps. Any ideas for using it in other apps?
<IfModule authz_core_module>
Require all denied
</IfModule>
<IfModule !authz_core_module>
Deny from all
</IfModule>