For example, I have this URL: http://localhost/miSite/uploads/ and by requesting
http://localhost/miSite/uploads/../includes/, I get a directory listing of includes.
It'd be great if you could tell me a way to resolve this.
Directory Indexing
You can also use .htaccess to disable indexing (directory browsing). By default, this option is turned on in the server's configuration files. To disable it, add this line to your .htaccess file:
Options -Indexes
The possibility of using relative references is not a real problem:
http://localhost/miSite/uploads/../includes/
resolves to
http://localhost/miSite/includes/
which can be addressed directly anyway. If you have sensitive files in there, you should move them outside the web root, or block the directory listing.
What would be a real problem is if the following would work:
http://localhost/../miSite/includes/
which would serve files outside the document root. But that will not happen with an up-to-date web server.
There are three things you can do, ranging from least secure to most secure:
Disable indexes as proposed by #Lizard
Make a rule in the .htaccess file to deny access to folders people aren't allowed to access (see the sketch after this list)
Move the files that shouldn't be accessed outside of the DocumentRoot.
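For option 2, a minimal sketch (assuming the same Apache 2.2-style directives used in the other answers here; the folder name is just an example): drop a .htaccess like this into the folder you want to block:
# .htaccess inside the protected folder, e.g. includes/
# Apache 2.2 syntax; Apache 2.4 would use "Require all denied" instead
Order allow,deny
Deny from all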
Related
I am creating a cloud storage project, and I want users to be able to upload any file. In particular, I want people to be able to upload .htaccess files, but I don't want Apache using these files as this is a security concern. How can I prevent Apache from using the user uploaded file, while still using my own .htaccess file in a parent folder?
This question is helpful reading. Directives near the www root are applied first; those in subfolders are applied later and may override earlier settings.
There are some things you can do:
Don't use .htaccess files at all, not even in other directories. If you have a dedicated server, you can edit the server config file, which is much more efficient. It will allow you to set AllowOverride None, which will prevent Apache from using .htaccess files at all. Instead, you can accomplish the same by putting your rules in the server config file. You'll need to restart Apache every time you make a change, and making an error in the server config file will prevent Apache from starting until it is fixed.
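As an illustration only (the path is a placeholder), the server-config equivalent could look roughly like this:
# In the vhost or main server config, not in a .htaccess file
<Directory /var/www/example.com/public_html>
AllowOverride None
Options -Indexes
Order allow,deny
Allow from all
</Directory>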
Store your files as random characters without an extension, make it impossible to access any files directly, and instead rely on a database to map a filename to a file. This allows you to store files securely, while not dumping everything in your database.
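A rough sketch of that idea in PHP (the files table, its columns, and the $pdo connection are all hypothetical; adapt them to your own schema):
<?php
// Upload side: store the file under a random name, with no extension,
// in a directory outside the document root ($pdo is an existing PDO connection).
$token      = bin2hex(random_bytes(16));
$storedName = '/var/private/uploads/' . $token;
move_uploaded_file($_FILES['upload']['tmp_name'], $storedName);
$stmt = $pdo->prepare('INSERT INTO files (token, original_name, stored_name) VALUES (?, ?, ?)');
$stmt->execute([$token, $_FILES['upload']['name'], $storedName]);

// Download side: map the token back to the file and stream it through PHP,
// so Apache never serves the stored file (or any uploaded .htaccess) directly.
$stmt = $pdo->prepare('SELECT original_name, stored_name FROM files WHERE token = ?');
$stmt->execute([$_GET['t']]);
if ($row = $stmt->fetch()) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($row['original_name']) . '"');
    readfile($row['stored_name']);
}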
You cannot put anything in your .htaccess file that would cause .htaccess files in subdirectories to be ignored, because AllowOverride only works in directory context, not in .htaccess context.
Many people make a backup.zip of their website on their hosting server.
The zip file is placed in the same directory where index.php exists.
So if I use this link, my backup will download: http://www.example.com/backup.zip
If I don't share my backup filename/link, is it safe from hackers or robots?
Is there any function that gives all my file and directory names?
How do I secure the backup.zip file on the hosting server?
I'm posting this question here because I think developers know best about
hacking / robot attacks / getting directory and file listings from another server.
There are many ways to protect your files from the eyes of the internet.
The simplest one is to have an index.html or index.php file in the directory that contains your backup.zip, but the file can still be accessed if someone guesses its name and calls it by URL, like this: www.example.com/backup.zip
To avoid this issue, most web servers provide a way to protect your files. Assuming you are under Apache2, you can create a rule in a .htaccess file (located in the same directory as your file) to prevent people from accessing your backup.zip.
e.g.:
<Files backup.zip>
Order allow,deny
Deny from all
</Files>
If you are not under Apache2, you can find the answer by checking the documentation of your HTTP server.
Your file is not safe as it is, as /backup.zip is the most obvious path that hackers can guess.
So to protect the zip file from unauthorised access, move it to a separate folder and create a .htaccess file there with the following content:
Deny from all
# Turn off all options we don't need.
Options None
Options +FollowSymLinks
To make this work, Apache needs AllowOverride enabled for that folder so the .htaccess file is processed (which it usually is by default).
I'm adding some database usage to a public facing site, and I wanted input on what the most secure way to store mysql connection information might be. I've come up with a few options:
First I could store the config in another directory, and just set the PHP include path to look for that dir.
Second, I know there are some file types that Apache won't serve to browsers; I could use one of those.
Third, I could store encrypted files on the server, and decrypt them with PHP.
Any help would be much appreciated.
Storing the config outside of Apache's document root is a must.
You can also configure Apache to deny access to any files there with .htaccess.
In the config folder, add a .htaccess with the following:
order allow,deny
deny from all
If you don't want to use .htaccess as #johua k mentions, you can instead add
<Directory /home/www/public/config>
order allow,deny
deny from all
</Directory>
to your apache config.
This will prevent any files in that folder from being served to anyone, which is fine since PHP doesn't care about .htaccess; you can still just
include('config/db.php')
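where config/db.php might look something like this (constant names and values are placeholders only):
<?php
// config/db.php -- example only; use your own names and credentials
define('DB_HOST', 'localhost');
define('DB_NAME', 'mydatabase');
define('DB_USER', 'dbuser');
define('DB_PASS', 'secret');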
If you properly configure your PHP scripts, they should never appear in plain text, so a file like
define('mysql_password', 'pass');
would never display that text.
If you are worried about a shared hosting environment and another user having access to read this file, then you should evaluate the security of the Linux installation and the host. Other users should not have any browsing access to your files. From the web, files served as PHP should never return their source.
You can explicitly tell Apache never to serve the files, so they would only be include()- or require()-able.
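For example (a sketch only; db.php is just the config file name used above), in httpd.conf or a .htaccess:
<Files db.php>
Order allow,deny
Deny from all
</Files>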
I have several folders in my website directory which contains textfiles called "secure.txt".
For security reasons, the URLs of these files are never shown in the web browser, but my website reads these files (from PHP code), which contain sensitive information.
How can I allow the PHP code to read these files but restrict access through the URL, so a potential hacker wouldn't be able to read their content?
Put them outside your document root folder and place the following .htaccess file in the folder you want to protect. Also, if you don't want to allow access from a particular IP range, remove the last line.
Order deny,allow
deny from all
allow from 192.168.0
[EDIT:]
To allow PHP scripts, etc., allow localhost (127.0.0.1):
Order deny,allow
deny from all
allow from 127.0.0.1
You should put them in another folder and make the .htaccess deny from all, allow from 127.0.0.1
Old trick for that: prefix the files with <?php die("I'm afraid I can't do that, Jim"); ?> and call them *.php. On parsing, ignore the prefix.
Edit
Why do this? The rationale behind it is that you avoid a dependency on some special webserver configuration, which can be forgotten (when moving to a new server), unavailable (many cheap hosts don't give you .htaccess), not applicable to some webserver software (IIS), etc.
So the reasoning is to trade some computational overhead against flexibility, portability and platform independence.
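A minimal sketch of the trick (the file name and the $sensitiveData variable are placeholders):
<?php
// Writing: prepend the guard so a direct HTTP request only runs die()
// and never reveals the real content.
$guard = '<?php die("I\'m afraid I can\'t do that, Jim"); ?>';
file_put_contents('secure.txt.php', $guard . "\n" . $sensitiveData);

// Reading: load the file and strip the guard line before using the content.
$raw  = file_get_contents('secure.txt.php');
$data = substr($raw, strpos($raw, "\n") + 1);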
Can you move them out of your website directory altogether? If so, then make sure PHP has access to that directory! (The open_basedir value will need to include it.)
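For example (a sketch; the paths are placeholders), with mod_php you could set this in the vhost:
# both the web root and the private directory must be listed
php_admin_value open_basedir "/var/www/html:/var/www/private"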
I'd suggest moving the files out of the webroot to be on the safe side
If you use Apache, deny access to all files named secure.txt from within httpd.conf:
<Files secure.txt>
deny from all
</Files>
You may do the same via .htaccess files as well (if your server is configured to allow overriding access rights from .htaccess).
But a better solution would be to include the sensitive information into your PHP scripts themselves, or to store it in a database.
Through researching, I discovered two common techniques to prevent clients from accessing libraries directly with a browser:
Use .htaccess to keep them out
Define a constant and pass it to included files; the included files then check whether the constant exists (a minimal sketch follows below).
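For reference, a minimal sketch of that constant-check pattern (the file and constant names are hypothetical):
<?php
// index.php -- the only intended entry point
define('IN_APP', true);
require __DIR__ . '/includes/library.php';

<?php
// includes/library.php -- refuses to run when requested directly
if (!defined('IN_APP')) {
    die('Direct access not permitted.');
}
// ... actual library code follows ...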
However, just keeping those files out of the document root seems sensible. Is there anything wrong with this approach?
The best thing to do is keep it outside of your docroot. There is no reason to put includes in a directly HTTP-accessible place.
Some shared web hosts are poorly configured and don't have this option, but most do, and you definitely have this choice on your own server or VPS.
To complete Brad's answer, here is how you could organize your folders:
/path/to/project/
    public_html/
        index.php
    includes/
        includes.php
Your webserver's root folder would be public_html.
If you can't modify this structure, the only acceptable way is to use a .htaccess file (or equivalent) to prevent the includes from being accessed publicly.
If you use an Apache webserver, you can deny access to all .inc.php files. You only have to add the following to your Apache vhost config:
<FilesMatch "\.inc\.php$">
Order allow,deny
Deny from all
</FilesMatch>
You can still include these files in your php code.
If the library files just define classes / functions / whatever, and your server isn't configured in a way that makes it possible to view the source, then nothing would be achieved by requesting the scripts via the web server anyway. That being said, if you can, you might as well store them outside of the document root.