I am creating a cloud storage project, and I want users to be able to upload any file. In particular, I want people to be able to upload .htaccess files, but I don't want Apache using these files as this is a security concern. How can I prevent Apache from using the user uploaded file, while still using my own .htaccess file in a parent folder?
This question is helpful reading. Directives near the www-root are applied first; directives in subfolders are applied later and may override earlier settings.
There are some things you can do:
Don't use .htaccess files at all, not even in other directories. If you have a dedicated server, you can edit the server config file instead, which is much more efficient. That lets you set AllowOverride None, which prevents Apache from using .htaccess files entirely, and you can put your rules in the server config file itself (see the sketch after this list). The trade-offs: you'll need to restart Apache every time you make a change, and an error in the server config file will prevent Apache from starting until it is fixed.
Store uploaded files under random names without an extension, make it impossible to access any file directly, and rely on a database to map the original filename to the file on disk. This lets you store files securely while not dumping everything in your database.
You cannot put anything in your own .htaccess file that would make Apache ignore .htaccess files in subdirectories, because AllowOverride only works in server config (directory) context, not in .htaccess context.
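A minimal sketch of that first option, in the server config rather than in a .htaccess file (the docroot path is an assumption):
# httpd.conf / vhost config, not .htaccess:
<Directory "/var/www/example.com/htdocs">
    # Ignore .htaccess files everywhere under the docroot
    AllowOverride None
</Directory>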
I need to know whether, after creating the .htaccess file, I have to call it from the index page, given that the main page is unique (only index.php) and all pages are pulled from a database.
The web server I use is Apache.
The file named .htaccess is an extension of the web server configuration (most commonly httpd.conf) that Apache reads automatically whenever a file or script is requested from the directory, or any child directory, in which the .htaccess file is placed.
Furthermore, PHP scripts (or any scripts, for that matter) have no knowledge at all of the existence of a .htaccess file, nor should they care. They can depend on configuration settings, however, e.g. rewrite rules that pipe all incoming requests through a so-called front controller (most commonly index.php); but the scripts don't know the .htaccess file exists, and the same configuration could just as well be placed somewhere else in the configuration tree.
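For illustration, such a front-controller rewrite in a .htaccess commonly looks like this (a generic pattern, not taken from your site):
RewriteEngine On
# Send every request that is not an existing file or directory to index.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]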
For further info I'd advise you to read about Apache, or web servers in general, and learn how a common (HTTP) request is fulfilled. It'll give you some understanding of what the .htaccess file does and does not do, and how it is related, or unrelated, to any scripts.
You don't need to explicitly call the .htaccess file from any of the PHP pages.
Its rules automatically apply to all files and subfolders within the folder where the .htaccess file is placed.
You cannot call the .htaccess file; it is a set of instructions telling the server how to handle requests (amongst other things). Place it in the root directory and Apache will look for it automatically whenever a request is made for any web page or other file.
You can also have a .htaccess file in each folder (directory) to control requests specific to it.
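For instance, a per-folder .htaccess can be as small as this (the directive is just an example):
# Applies to this folder and all of its subfolders
Options -Indexes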
Many people make a backup.zip of their website on their hosting server,
with the zip file placed in the same directory where index.php exists.
So using this link my backup can be downloaded: http://www.example.com/backup.zip
If I don't share my backup filename/link, is it safe from hackers or robots?
Is there any function that would give them all my file and directory names?
How can I secure the backup.zip file on my hosting server?
I post this question here because I think developers know best about
hacking, robot attacks, and ways to list directories and files on another server.
There are many ways to protect your files from the eyes of the internet.
The simplest one is to have an index.html or index.php file in the directory that contains your backup.zip, but the file can still be accessed if someone guesses its name and requests it by URL, like this: www.example.com/backup.zip
To avoid this issue, most web servers provide a way to protect your files. If we assume you are on Apache 2, you can create a rule in a .htaccess file (located in the same directory as your file) to prevent people from accessing your backup.zip.
E.g.:
<Files backup.zip>
    Order allow,deny
    Deny from all
</Files>
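Note that Order/Deny is Apache 2.2 syntax; on Apache 2.4 the equivalent rule would be:
<Files backup.zip>
    Require all denied
</Files>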
If you are not on Apache 2, you can find the equivalent by checking the documentation of your HTTP server.
Your file is not safe as it is: /backup.zip is the most obvious path a hacker would guess.
So to protect the zip file from unauthorised access, move it to a separate folder and create a .htaccess there with the following content:
Deny from all
# Turn off all options we don't need.
Options None
Options +FollowSymLinks
To make this work, Apache must be configured to honor .htaccess files for that folder, i.e. AllowOverride must permit these directives (AllowOverride All is usually the default on shared hosting).
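If the .htaccess appears to be ignored, the server config needs something along these lines (the path is an assumption):
<Directory "/var/www/example.com/backups">
    # Let .htaccess files in this folder take effect
    AllowOverride All
</Directory>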
I'm adding some database usage to a public-facing site, and I wanted input on what the most secure way to store MySQL connection information might be. I've come up with a few options:
First, I could store the config in another directory and just set the PHP include path to look in that directory.
Second, I know there are some files that Apache won't serve to browsers; I could use one of those file types.
Third, I could store encrypted files on the server and decrypt them with PHP.
Any help would be much appreciated.
Storing the config outside of Apache's document root is a must.
You can also configure Apache to refuse to serve those files, using a .htaccess file.
In the config folder, add a .htaccess with the following:
Order allow,deny
Deny from all
If you don't want to use .htaccess, as #johua k mentions, you can instead add
<Directory /home/www/public/config>
order allow,deny
deny from all
</Directory>
to your Apache config.
This will stop any files in that folder from being served to anyone. That's fine here, because PHP includes files straight from disk and doesn't care about .htaccess, so you can still just
include('config/db.php')
If you configure PHP properly, your scripts should never appear in plain text,
so a file like
define('mysql_password', 'pass');
would never display that text to a visitor.
If you are worried about a shared hosting environment and another user having access to read this file, then you should evaluate the security of the Linux installation and the host. Other users should not have any browsing access to your files, and from the web, files handled as PHP should never return their source.
You can explicitly tell Apache never to serve the files, so they would only be include()-able or require()-able.
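For example, a pattern-based rule along these lines (the extension list is just an assumption) refuses to serve matching files over HTTP while PHP can still read them from disk:
<FilesMatch "\.(inc|ini)$">
    Order allow,deny
    Deny from all
</FilesMatch>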
I have a file sharing website in the making where the visual and functional parts of the pages work. This runs into a problem when I want to allow server-side scripting, like PHP pages, to be uploaded: an uploaded PHP (etc.) page could easily reach back into the server and delete files, which I obviously would not want. I have changed the permissions many times to test, but this also stops my own PHP files from uploading and renaming files in these folders. I do want to allow these file types, but I'm not sure what I can do.
I was thinking I could do this through .htaccess but I wouldn't know how.
Any suggestions?
I'm not sure, but it sounds like you want to allow arbitrary file uploads (including .PHP scripts) but to prevent any of them from being executed on the server side.
I would recommend creating a file storage directory that is not web-accessible (e.g. put it outside your www-root or use a .htaccess file to limit direct access). Then have your PHP scripts upload to that directory. Create a download script and have download access to those files go through that script, so that e.g. PHP files cannot be invoked remotely.
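A sketch of the .htaccess for such a storage directory (Apache 2.2 syntax, matching the other examples in this thread):
# Block all direct HTTP access to uploaded files;
# only server-side scripts read them from disk
Order allow,deny
Deny from all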
If I understand correctly from reading comments:
You want users to be able to upload any file. Including code. Including .php, .asp etc.
You want the users to be able to execute this code, but to limit the code to a "sandbox" environment.
Seems to me you should write your files to a specific location, which has its own document root/vhost (http://exec.domain.tld).
On that vhost you could set security, e.g.:
# Disable rewriting, .htaccess overrides and such
AllowOverride None
# Disable dangerous functions (note: disable_functions can only be
# set in php.ini, so this line may have to move there)
php_admin_value disable_functions dl,exec,passthru,system,shell_exec,popen
And to top it off (important!) set open_basedir restrictions to the vhost's document root:
<Directory /srv/www/exec.domain.tld/docroot>
php_admin_value open_basedir /srv/www/exec.domain.tld/docroot
</Directory>
I haven't actually set up this environment, but I feel this is your best starting point. And I do think it'll work, if you fix the typos/parameter-name errors I might have made :)
I think it's not about permissions, but about PHP execution.
You can turn off the PHP engine for a directory using a .htaccess file, like this:
<IfModule mod_php5.c>
php_flag engine off
</IfModule>
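If the server runs PHP 7 under mod_php instead, the module test changes accordingly (module name assumed from the stock PHP build):
<IfModule mod_php7.c>
    php_flag engine off
</IfModule>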
For example, I have this URL: http://localhost/miSite/uploads/ and by requesting
http://localhost/miSite/uploads/../includes/ I get a directory listing of includes.
It'd be great if you could tell me a way to resolve this.
Directory Indexing
You can also use .htaccess to disable indexing, or Directory Browsing. By default, this option is turned on in the server's configuration files. To disable this, add this line to your .htaccess file:
Options -Indexes
The possibility of using relative references is not a real problem:
http://localhost/miSite/uploads/../includes/
resolves to
http://localhost/miSite/includes/
which can be addressed directly anyway. If you have sensitive files in there, you should move them outside the web root, or block the directory listing.
What would be a real problem is if the following would work:
http://localhost/../miSite/includes/
which would serve files outside the document root. But that will not happen with an up-to-date web server.
There are three things you can do, ranging from least secure to most secure:
Disable indexes, as proposed by #Lizard.
Make a rule in the .htaccess file to deny access to folders people aren't allowed to reach (see the sketch below).
Move the files that shouldn't be accessed outside of the DocumentRoot.
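For option 2, the rule can be a single line in a .htaccess placed inside the folder you want to protect (e.g. includes/); on Apache 2.4 that would be:
# Refuse all direct HTTP requests for this folder and its subfolders
Require all denied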