I would like to find out the most effective way to ban any executable files from one specific subfolder on my server. I allow file uploads by users into that folder, and would like to make that folder accessible from the web. I have the root folder pretty much locked down with mod_rewrite. In that one unprotected subfolder I have a .htaccess with:
Options +Indexes
IndexOptions +FancyIndexing +FoldersFirst +HTMLTable
RewriteEngine off
I know it is best to just restrict file uploads to certain allowable file types, and I am already doing this in PHP. I am checking the file extension and MIME type before allowing an upload, like this:
$allmime = array('image/gif', 'image/png', 'image/jpeg', 'application/msword', 'application/pdf');
$allext = array('png', 'jpg', 'gif', 'doc', 'pdf');
$path = pathinfo($_FILES['file']['name']);
// "file -b --mime-type" prints the bare MIME type; "-bi" appends a charset
// (e.g. "image/png; charset=binary") and would never match $allmime
$mime = trim(shell_exec("file -b --mime-type " . escapeshellarg($_FILES['file']['tmp_name'])));
$ext = isset($path['extension']) ? strtolower($path['extension']) : '';
if( !in_array($ext, $allext) || !in_array($mime, $allmime) ){
//ban
}else{
//allow
}
However, I am not certain whether there is some convoluted hack out there that would still allow a shell script to be uploaded and executed on the server, since all successfully uploaded files will be visible immediately.
I know there is another option in .htaccess to filter out files like this:
<FilesMatch "\.(sh|asp|cgi|php|php3|ph3|php4|ph4|php5|ph5|phtm|phtml)$">
Order allow,deny
Deny from all
</FilesMatch>
However, I am not certain that this list is all-inclusive, and it is hard to maintain, as new extensions might be installed in the future.
To sum it all up: does anyone know a good way to disallow all server executables, with the exception of PHP scripts executed directly by the %{HTTP_HOST}?
You can do several things to absolutely lock down certain folders and ensure PHP cannot execute in them. This is particularly useful for a PHP upload script, where you don't want the world to be able to execute arbitrary code on your server by exploiting your upload code:
disable the PHP engine entirely in .htaccess for the folder in question:
php_flag engine off
force the Content-Disposition header to attachment for files that are not in a finite list of file types you are expecting, for example:
ForceType application/octet-stream
Header set Content-Disposition attachment
<FilesMatch "(?i)\.(gif|jpe?g|png)$">
ForceType none
Header unset Content-Disposition
</FilesMatch>
in your uploader code, prevent the upload of files with any extension that an Apache module such as PHP can execute directly; a sketch follows below
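For example, a minimal sketch of that uploader-side check; the blacklist below is illustrative, not exhaustive, and should match the handlers actually configured on your server:
<?php
// Hypothetical extension blacklist; extend it to match your server's handlers.
$banned = array('php', 'php3', 'php4', 'php5', 'phtm', 'phtml',
                'pl', 'py', 'cgi', 'sh', 'asp');

$ext = strtolower(pathinfo($_FILES['file']['name'], PATHINFO_EXTENSION));

if (in_array($ext, $banned)) {
    die('Executable file types are not allowed.');
}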
How about disabling the server-side handlers for that specific directory? Something like:
<Directory /path/to/restrict>
SetHandler None
Options None
AllowOverride None
</Directory>
This is untested, but seems like it might work.
UPDATE: Apparently, I was wrong ... but sticking AddHandler default-handler in an .htaccess does seem to work.
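For example, a minimal .htaccess along these lines (the extension list is an assumption; adjust it to whatever handlers your server has configured):
# Serve these extensions with the static-file handler instead of executing them
AddHandler default-handler .php .phtml .php3 .pl .py .cgi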
From Twitter's Bootstrap .htaccess, this works for me (I just added exe):
# Block access to backup and source files.
# These files may be left by some text editors and can pose a great security
# danger when anyone has access to them.
<FilesMatch "(^#.*#|\.(exe|bak|config|dist|fla|inc|ini|log|psd|sh|sql|sw[op])|~)$">
Order allow,deny
Deny from all
Satisfy All
</FilesMatch>
Results in:
Forbidden
You don't have permission to access /test/test.exe on this server.
The best way (IMO) is just to turn off the x bit (the executable permission in Linux) on the subfolder. So I would change the permissions to 644 (the owner can read and write, but the world can only read). This can be done in cPanel. Make sure to apply it to the subfolders as well.
Filtering by the uploaded filename is how malicious users will get bad things onto your server. The $_FILES name and type attributes are user-supplied data, and nothing says a user can't upload a PHP script but call it 'puppies.jpg'.
The proper way to filter is to use something like Fileinfo to check the actual MIME type and filter on that.
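A minimal sketch using the Fileinfo extension (the allowed list is illustrative):
<?php
// Check the real MIME type of the uploaded file, ignoring the
// user-supplied name and type in $_FILES.
$allowed = array('image/gif', 'image/png', 'image/jpeg', 'application/pdf');

$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['file']['tmp_name']);

if (!in_array($mime, $allowed, true)) {
    // reject the upload
} else {
    // accept the upload
}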
Denying all access to the folder in an .htaccess file and then using a download script to serve the files would save a lot of trouble.
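For example, a minimal sketch of such a download script; the storage path and parameter name are hypothetical:
<?php
// The folder itself is blocked with "Deny from all"; this script is
// the only way to fetch a file from it.
$base = '/var/uploads'; // assumed storage location
$name = isset($_GET['file']) ? basename($_GET['file']) : ''; // strip path components
$path = $base . '/' . $name;

if ($name === '' || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);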
The following rule will forbid .exe files (I added .bat) from being downloaded from your server:
<Directory "/my/files">
Require all granted
RewriteEngine on
RewriteRule "(\.exe|\.bat)" "-" [F]
</Directory>
Related
Many people make a backup.zip of their website on their hosting server,
with the zip file placed in the same directory where index.php exists.
So if I use this link, my backup will download: http://www.example.com/backup.zip
If I don't share my backup filename/link, is it safe from hackers or robots?
Is there any function that would give them all my file and directory names?
How do I secure a backup.zip file on a hosting server?
I post this question here because I think developers know best about
hacking, robot attacks, and getting directory listings / files from another server.
There are many ways to protect your files from the eyes of the internet.
The simplest one is to have an index.html or index.php file in the directory that contains your backup.zip, but the file can still be accessed if someone guesses its name and calls it from a URL like this: www.example.com/backup.zip
To avoid this issue, most web servers provide a way to protect your files. If we assume you are running Apache2, you must create a rule in a .htaccess file (located in the same directory as your file) to prevent people from accessing your backup.zip.
e.g:
<Files backup.zip>
Order allow,deny
Deny from all
</Files>
If you are not running Apache2, you can find the answer by checking the documentation of your HTTP server.
Your file is not safe as it is: /backup.zip is the most obvious path a hacker would guess.
So to protect the zip file from unauthorised access, move it to a separate folder and create a .htaccess there with the following content:
Deny from all
# Turn off all options we don't need.
Options None
Options +FollowSymLinks
To make this work, Apache needs AllowOverride All (or at least Limit and Options) set for that folder so the .htaccess file is applied (which it usually is by default).
I have another question:
When I type localhost/folder/file.txt into the browser, it opens and shows the content of file.txt.
But I want to make this file readable only by PHP, not by the browser.
I tried everything with chmod, but it doesn't work. Is it possible to do that?
Thanks
Write the file somewhere outside the web root; then the web server won't make it available to clients. (There is no requirement for a file to be under the document root for PHP to read it.)
Other options include:
Using your web server's auth/authz systems to secure the file (not recommended for this problem, as it is more likely that a configuration error will break the security than it is that the file will be placed in the wrong place)
Using a database instead
You could refuse access to the .txt extension.
.htaccess
# prevent viewing of a specific file
<Files file.txt>
Order allow,deny
Deny from all
</Files>
# multiple file types
<FilesMatch ".(htaccess|htpasswd|ini|phps|fla|psd|log|sh|txt)$">
Order Allow,Deny
Deny from all
</FilesMatch>
You can put the text file into a MySQL database as a BLOB or TEXT column. Then it becomes impossible to read with the browser, and readable only by query (through PHP).
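A minimal sketch with PDO (the database credentials and table name are hypothetical):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// store the file contents in a TEXT/BLOB column
$stmt = $pdo->prepare('INSERT INTO files (name, contents) VALUES (?, ?)');
$stmt->execute(array('file.txt', file_get_contents('file.txt')));

// read it back later; this path only exists through PHP, never the browser
$stmt = $pdo->prepare('SELECT contents FROM files WHERE name = ?');
$stmt->execute(array('file.txt'));
$contents = $stmt->fetchColumn();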
Simplest solution:
$s=file_get_contents('test.txt');
If the file has some code to execute, you can eval it.
eval(file_get_contents('test.txt'));
Have you tried chmod'ing it to 660?
I just tried it on my web server, and the file is no longer available to the browser.
This is what I want a user to be able to do:
Upload ANY file (attachment) to the uploads folder on the server
Be Able to download it afterwards
So I have created this dir with the following .htaccess
Allow from all
DirectoryIndex .x
php_flag engine off
Options -Indexes
Options -ExecCGI
AddType text/plain .html .htm .shtml .php .php3 .php5 .phtml .phtm .pl .py .cgi
ForceType application/octet-stream
My question is, is this secure?
I would like to say: no.
It would be more secure if you denied access to everyone and managed the downloads via a script that delivers the files.
Furthermore, you should rename the files, so that, for example, nobody can place his own .htaccess or whatever.
The original filenames you can store in a DB; a sketch of this follows below.
Why: you never know what happens in the future; some files may later become executable, somewhere else you place an insecure script that allows users to include those uploaded files, and so on.
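A minimal sketch of the rename-and-record approach (the table name is hypothetical; random_bytes() needs PHP 7+):
<?php
// Assumes an existing PDO connection $pdo and a hypothetical `uploads` table.
// Store the upload under a random name that no Apache handler will execute,
// and keep the original filename in the database.
$stored = bin2hex(random_bytes(16));
move_uploaded_file($_FILES['file']['tmp_name'], '/var/uploads/' . $stored);

$stmt = $pdo->prepare('INSERT INTO uploads (stored_name, original_name) VALUES (?, ?)');
$stmt->execute(array($stored, $_FILES['file']['name']));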
I also agree with Dr.Molle that you should rename the files and send them dynamically.
But instead of sending them via a script, which will take up much more memory than necessary, I highly recommend using mod_xsendfile for Apache.
With mod_xsendfile, instead of outputting the file through PHP, you can simply send the XSendFile headers:
<?php
header('Content-Disposition: attachment;filename=originalname.txt');
header('X-Sendfile: /path/to/file.txt');
?>
This way, you can keep all the files OUTSIDE the web directory root and therefore completely inaccessible to the outside world. You won't have to worry about .htaccess at all.
If your host allows you to install new Apache modules, you'll need apxs installed (it probably will be). If it's not installed, you'll need to rebuild Apache with apxs enabled. In my experience, if you can manage it, it's worth it. XSendFile saves SO much trouble.
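For reference, a typical build-and-enable sequence (paths and directives as documented by mod_xsendfile; verify against your own setup):
# compile, install, and activate the module
apxs -cia mod_xsendfile.c

# then, in httpd.conf or the relevant vhost:
XSendFile On
XSendFilePath /path/to/file-storage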
I agree that it would be much better to download them via a special script. But if that's not possible, do two things:
If you wish users to be able to download the files, you can add the attachment HTTP response header
Header set Content-Disposition "attachment"
which will force the browser to download the file instead of rendering it.
Still, you have to make sure files won't be accessible through other potential vulnerabilities like File Inclusion.
Forbid execution in the upload directory with chmod -R a-x
I'm working on a site that allows users to purchase digital content and have implemented a method that attempts to serve secure downloads.
I'm using CodeIgniter to force downloads like this:
$file = file_get_contents($path);
force_download("my_file_name.zip", $file);
Of course, I make sure the user has access to the file using a database before serving the download, but I was wondering if there was a way to make these files more secure.
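For context, here is a minimal sketch of the kind of check I mean, in CodeIgniter (the table and column names are hypothetical, and $user_id / $file_id are assumed to be set):
<?php
// Look up the purchase record before serving the download.
$this->load->helper('download');

$row = $this->db->get_where('purchases', array(
    'user_id' => $user_id,
    'file_id' => $file_id,
))->row();

if ($row === null) {
    show_404(); // no purchase record: refuse to serve the file
}

force_download('my_file_name.zip', file_get_contents($row->path));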
I'm using some 8-10 character keys to create the file paths, so the URLs to the files aren't exactly easy to figure out... something like http://mysite.com/as67Hgr/asdo0980/uth89.zip in lieu of http://mysite.com/downloads/my_file.zip.
Also, I'm using .htaccess to deny directory browsing like so: Options All -Indexes.
Other than that... I have no idea what steps to take. I've seen articles suggesting username and password methods using .htaccess, but I don't understand how to bypass the password prompt that would occur using that method.
I was hoping there might be a method where I could send a username and password combination using headers and cUrl (or something similar), but I wouldn't know where to start.
Any help would be hugely appreciated. Thanks in advance!
Make it so the web server does not serve the files under any circumstances; otherwise all the checking is pretty moot. The best way to do that is to put them somewhere outside the webroot, i.e.:
/
    webroot/          <- root web directory, maybe named www or similar
        index.php     <- your app, served normally
        …other serve-able files…
    files/            <- not part of the serve-able webroot dir
        secret_file   <- web server has no access here
Then, if the only way to access them is through your script, it's as secure as you make your script.
Why not just Deny from all in the .htaccess? Or place the files above the webroot? That would be enough. But your current setup is pretty safe already. Why do you think you need any help?
The .htaccess should look like this if you want the files to be downloadable only from your localhost. It also removes some handlers that could try to access any of the files, just in case, so that only you have access. It's also a good idea to store an index.php file in there that checks for the existence of another file and, if it exists, sets the headers; if not, it exits. A sketch of that follows after the .htaccess.
.htaccess file:
<Files *>
Order Deny,Allow
Deny from all
Allow from localhost
</Files>
RemoveHandler .php .php3 .phtml .cgi .fcgi .pl .fpl .shtml
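A minimal sketch of that index.php gatekeeper (the file name is hypothetical):
<?php
// Serve the file only if it exists; otherwise just exit.
$file = 'somefile.zip';

if (!is_file($file)) {
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
readfile($file);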
I'm making a website which allows people to upload files, HTML pages, etc. Now I'm having a problem. I have a directory structure like this:
-/USERS
    -/DEMO1
    -/DEMO2
    -/DEMO3
    -/etc... (every user has his own directory here)
-index.php
-control_panel.php
-.htaccess
Now I want to disable PHP but enable server-side includes in the directories and subdirectories inside /USERS.
Can this be done (and if so, how)?
I use WAMP server
Try disabling the engine option in your .htaccess file:
php_flag engine off
To disable all access to sub dirs (safest) use:
<Directory full-path-to/USERS>
Order Deny,Allow
Deny from All
</Directory>
If you want to block only PHP files from being served directly, then do:
1 - Make sure you know which file extensions the server recognizes as PHP (and don't allow people to override this in .htaccess). One of my servers is set to:
# Example of existing recognized extensions:
AddType application/x-httpd-php .php .phtml .php3
2 - Based on those extensions, add a regular expression to FilesMatch (or LocationMatch):
<Directory full-path-to/USERS>
<FilesMatch "(?i)\.(php|php3?|phtml)$">
Order Deny,Allow
Deny from All
</FilesMatch>
</Directory>
Or use Location to match PHP files (I prefer the Files approach above):
<LocationMatch "/USERS/.*(?i)\.(php3?|phtml)$">
Order Deny,Allow
Deny from All
</LocationMatch>
If you're using mod_php, you could put (either in a .htaccess in /USERS or in your httpd.conf for the USERS directory)
RemoveHandler .php
or
RemoveType .php
(depending on whether PHP is enabled using AddHandler or AddType)
PHP files run from another directory will still be able to include files in /USERS (assuming there is no open_basedir restriction), because an include does not go through Apache. If a PHP file in /USERS is accessed through Apache, it will be served as plain text.
Edit
Lance Rushing's solution of just denying access to the files is probably better
<Directory /your/directorypath/>
php_admin_value engine Off
</Directory>
This will display the source code instead of executing it:
<VirtualHost *>
ServerName sourcecode.testserver.me
DocumentRoot /var/www/example
AddType text/plain php
</VirtualHost>
I used it once to give a co-worker read access to the source code from the local network (just a quick and dirty alternative).
WARNING!:
As Dan pointed out some time ago, this method should never be used in production. Please follow the accepted answer, as it blocks any attempt to execute or display PHP files.
If you want users to share PHP files (and let others display the source code), there are better ways to do it, like git, a wiki, etc.
This method should be avoided! (You have been warned; it is left here for educational purposes.)
None of those answers worked for me (they either generated a 500 error or did nothing). That is probably because I'm working on a hosted server where I don't have access to the Apache configuration.
But this worked for me:
RewriteRule ^.*\.php$ - [F,L]
This line generates a 403 Forbidden error for any URL that ends with .php under the directory containing this .htaccess.
@Oussama led me in the right direction here; thanks to him.
If you use php-fpm, php_admin_value will NOT work and gives an Internal Server Error.
Instead, use this in your .htaccess. It disables the parser in that folder and all subfolders:
<FilesMatch ".+\.*$">
SetHandler !
</FilesMatch>
This might be overkill, but be careful doing anything that relies on the extension of PHP files being .php: what if someone comes along later and adds handlers for .php4 or even .html so they're handled by PHP? You might be better off serving files out of those directories from a different instance of Apache or something that only serves static content.
In production I prefer to redirect requests for .php files under the directories where PHP processing should be disabled to the home page or to a 404 page. This won't reveal any source code (why should search engines index uploaded malicious code?) and looks friendlier to visitors, and even to evil hackers trying to exploit the stuff.
Also, it can be implemented in almost any context: vhost or .htaccess.
Something like this:
<DirectoryMatch "^${docroot}/(image|cache|upload)/">
<FilesMatch "\.php$">
# use one of the redirections
#RedirectMatch temp "(.*)" "http://${servername}/404/"
RedirectMatch temp "(.*)" "http://${servername}"
</FilesMatch>
</DirectoryMatch>
Adjust the directives as you need.
On CentOS 6.10, I use this for multiple folders in the virtual host .conf definition file:
<DirectoryMatch ^/var/www/mysite/htdocs/(nophpexecutefolder1|nophpexecutefolder2)>
php_admin_value engine Off
</DirectoryMatch>
However, even though it no longer parses the PHP code the usual way, it still outputs parts of the .php file, such as variable declarations and echoed text, e.g.:
<?php
echo "<strong>PHP CODE EXECUTED!!";
$a=1;
$b=2;
echo $a+$b;
The above produces this in a web browser:
PHP CODE EXECUTED!!"; $a=1; $b=2; echo $a+$b;
This could potentially expose some code to users, which isn't ideal.
Therefore, it's probably best to use the above in combination with the following in .htaccess:
<FilesMatch ".*.(php|php3|php4|php5|php6|php7|php8|phps|pl|py|pyc|pyo|jsp|asp|htm|html|shtml|phtml|sh|cgi)$">
Order Deny,Allow
Deny from all
#IPs to allow access to the above extensions in current folder
# Allow from XXX.XXX.XXX.XXX/32 XXX.XXX.XXX.XXX/32
</FilesMatch>
The above will prevent access to any of the above file extensions but will allow other extensions such as images, css etc. to be accessed the usual way. The error when accessing .php:
Forbidden
You don't have permission to access /nophpexecutefolder1/somefile.php on this server.
<Files *.php>
Order Deny,Allow
Deny from all
</Files>