securing outward-facing website db configs - php

I'm adding some database usage to a public facing site, and I wanted input on what the most secure way to store mysql connection information might be. I've come up with a few options:
First I could store the config in another directory, and just set the PHP include path to look for that dir.
Second, I know there are some file types that Apache won't serve to browsers; I could use one of those.
Third, I could store encrypted files on the server, and decrypt them with PHP.
Any help would be much appreciated.

Storing the config outside of apache's document root is a must

You can configure Apache to disallow access to any files using .htaccess.
In the config folder, add a .htaccess with the following:
order allow,deny
deny from all
If you don't want to use .htaccess, as @johua k mentions, instead add
<Directory /home/www/public/config>
order allow,deny
deny from all
</Directory>
to your apache config.
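Note that order allow,deny is Apache 2.2 syntax; if you are on Apache 2.4 or later (a detail the question doesn't specify), the equivalent is:
Require all denied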
This will prevent any files in that folder from being served to anyone. That's fine, because PHP doesn't go through Apache's access rules when reading files, so you can still just
include('config/db.php')
If you properly configure your PHP scripts, their source should never appear in plain text.
so a file like
define('mysql_password', 'pass');
would never display that text.
If you are worried about a shared hosting environment and another user having access to read this file, then you should evaluate the security of the Linux installation and the host. Other users should not have any browsing access to your files. From the web, files marked .php should never return their source.
You can explicitly tell Apache never to serve the files, so they would only be include()- or require()-able.
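To make the layout concrete, here's a minimal sketch of that setup; the paths, constant names, and database details are illustrative assumptions, not from the question:
<?php
// /home/www/config/db.php -- outside the document root, never reachable by URL
define('DB_HOST', 'localhost');
define('DB_USER', 'app');
define('DB_PASS', 'secret');
define('DB_NAME', 'appdb');

<?php
// /home/www/public/index.php -- inside the document root
require '/home/www/config/db.php';
$mysqli = new mysqli(DB_HOST, DB_USER, DB_PASS, DB_NAME);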

Related

How can I make .htaccess files in a subfolder not execute?

I am creating a cloud storage project, and I want users to be able to upload any file. In particular, I want people to be able to upload .htaccess files, but I don't want Apache using these files as this is a security concern. How can I prevent Apache from using the user uploaded file, while still using my own .htaccess file in a parent folder?
This question is helpful reading. Directives near the www-root are applied first; those in subfolders are applied later and may override earlier settings.
There are some things you can do:
Don't use .htaccess files at all, not even in other directories. If you have a dedicated server, you can edit the server config file, which is much more efficient. It will allow you to set AllowOverride None, which will prevent Apache from using .htaccess files at all. Instead, you can accomplish the same by putting your rules in the server config file. You'll need to restart Apache every time you make a change, and making an error in the server config file will prevent Apache from starting until it is fixed.
Store your files under random names without an extension, make it impossible to access any file directly, and instead rely on a database to map a filename to a file. This allows you to store files securely without dumping everything in your database (see the sketch after this list).
You cannot put anything in your .htaccess file that would cause .htaccess files in subdirectories to be ignored, because AllowOverride only works in directory context, not in .htaccess context.
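Here is a minimal sketch of the second approach; the storage directory, table layout, and function names are all illustrative assumptions:
<?php
// Save an upload under a random, extensionless name; keep the real
// name in the database so the file itself is never addressable or
// interpretable by Apache.
function store_upload(PDO $db, string $tmpPath, string $origName): string {
    $id = bin2hex(random_bytes(16));               // random storage name
    move_uploaded_file($tmpPath, '/srv/blobs/' . $id);
    $stmt = $db->prepare('INSERT INTO files (id, name) VALUES (?, ?)');
    $stmt->execute([$id, $origName]);
    return $id;
}

// Stream a file back through PHP instead of serving it directly:
function send_file(PDO $db, string $id): void {
    $stmt = $db->prepare('SELECT name FROM files WHERE id = ?');
    $stmt->execute([$id]);
    $name = $stmt->fetchColumn();
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $name . '"');
    readfile('/srv/blobs/' . $id);
}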

Allow access to file ONLY FROM specific PHP file/ dir on same server

I've been pulling my hair out trying to find a solution for denying access to a file for all AJAX/PHP/in-browser requests EXCEPT when they come from one (other) directory/file on the same server.
I have a .htaccess in the directory of the JSON file. The file is fetched by a CMS plugin as follows (pretty standard):
Javascript makes an AJAX call to a PHP file on the server, called handler.php.
handler.php then retrieves the contents of the file and returns it to JS.
I can't use rewrite rules, as mod_rewrite is not a prerequisite for the CMS. I also can't set a known domain name, as it is dynamic. Ideally I would do Allow from SELF.
I have tried the following .htaccess configs without luck:
Order Deny, Allow
Deny from all
Allow From <ownDomainName/subpath>
Or:
Order Deny, Allow
Deny from all
Allow From <ownDomainName/subpath/fileName.ext>
I tried with the SetEnv directive too, and a mixture of others involving <Files>, <Directory> and <Location>. If I just Deny from all, the PHP cannot access it either. Is there any solution for restricting access to a JSON file?
Also, does it have anything to do with the fact that I am testing on localhost?
There are two types of access to the file:
From the web, where .htaccess rules or server (Apache) config apply.
By an app or script running on the server machine, where filesystem permissions apply.
So you can deny all requests in the .htaccess file, and a PHP script can still access the file through the filesystem (depending on filesystem permissions).
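A minimal sketch of that split; the JSON path is an illustrative assumption (handler.php is the name used in the question):
# .htaccess in the JSON file's directory: block all web access
Deny from all

<?php
// handler.php -- reads the file through the filesystem, so the
// .htaccess rule above does not apply to this access path
$json = file_get_contents('/var/www/site/data/settings.json'); // illustrative path
header('Content-Type: application/json');
echo $json;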

Is a website backup.zip file on a hosting server safe from hackers or robots?

Many people make a backup.zip of their website on their hosting server.
The zip file is placed in the same directory where index.php exists.
So if someone uses this link, my backup will download: http://www.example.com/backup.zip
If I don't share my backup filename/link, is it safe from hackers or robots?
Is there any function that gives all file and directory names?
How can I secure a backup.zip file on a hosting server?
I post this question here because I think developers know best about
hacking, robot attacks, and getting directory or file listings from another server.
There are many ways to protect your files from the eyes of the internet.
The simplest one is to put an index.html or index.php file into the directory that contains your backup.zip, but the file can still be accessed if someone guesses its name and calls it from a URL like this: www.example.com/backup.zip
To avoid this issue, most web servers provide a way to protect your files. If we assume you are under Apache2, you must create a rule in a .htaccess file (located in the same directory as your file) to prevent people from accessing your backup.zip.
e.g:
<Files backup.zip>
Order allow,deny
Deny from all
</Files>
If you are not under Apache2, you can find the answer by checking the documentation of your HTTP server.
Your file is not safe as it is, as /backup.zip is the most obvious path that hackers can guess.
So to protect the zip file from unauthorised access, move it to a separate folder and create a .htaccess file with the following content:
Deny from all
# Turn off all options we don't need.
Options None
Options +FollowSymLinks
To make this work, Apache needs AllowOverride enabled for that folder (e.g. AllowOverride All) so the .htaccess file is honored, which it usually is by default. (mod_rewrite is not needed for these directives.)
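An alternative worth sketching (not part of the answers above, so treat the names and the token scheme as illustrative): move backup.zip outside the document root entirely and stream it through a PHP script that checks a secret:
<?php
// download.php -- assumes the backup lives outside the docroot
$backup = '/home/user/backups/backup.zip';           // not web-accessible
// hypothetical shared-secret check; use real authentication in practice
if (!isset($_GET['token']) || !hash_equals('long-random-secret', (string)$_GET['token'])) {
    http_response_code(403);
    exit('Forbidden');
}
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="backup.zip"');
header('Content-Length: ' . filesize($backup));
readfile($backup);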

Securing a text file in a website folder

I have several folders in my website directory which contain text files called "secure.txt".
For security reasons the URLs of these files are never shown in the web browser, but my website's PHP code reads these files, which contain sensitive information.
How can I let the PHP code read these files while blocking access through the URL, so a potential hacker wouldn't be able to read their content?
Put them outside your document root folder, or place the following .htaccess file in the folder you want to protect. If you don't want to allow access from a particular IP, remove the last line.
order deny,allow
deny from all
allow from 192.168.0
[EDIT:]
To allow PHP scripts, etc., allow localhost (127.0.0.1):
order deny,allow
deny from all
allow from 127.0.0.1
You should put them in another folder and make the .htaccess deny from all, allow from 127.0.0.1
Old trick for that: Prefix the files with <?php die("I'm afraid I can't do that, Jim"); ?>, and call them *.php. On parsing, ignore the prefix.
Edit
Why do this? The rationale is that you avoid a dependency on some special webserver configuration, which can be forgotten (on moving to a new server), unavailable (many cheap hosts don't give you .htaccess), not applicable to some webserver software (IIS), etc.
So the reasoning is to trade some computational overhead against flexibility, portability and platform independence.
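A minimal sketch of reading such a file while ignoring the prefix; the file name is illustrative:
<?php
// secure data stored in secure.txt.php, guarded by the die() prefix
$prefix = '<?php die("I\'m afraid I can\'t do that, Jim"); ?>';
$raw    = file_get_contents(__DIR__ . '/secure.txt.php');
$data   = substr($raw, strlen($prefix));   // drop the guard before parsing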
Can you move them out of your website directory altogether? If so, then make sure PHP has access to that directory! (The open_basedir value will need to include it.)
I'd suggest moving the files out of the webroot to be on the safe side
If you use Apache, deny access to all files named secure.txt from within httpd.conf:
<Files secure.txt>
deny from all
</Files>
You may do the same via .htaccess files as well (if your server is configured to allow overriding access rights from .htaccess).
But a better solution would be to include the sensitive information into your PHP scripts themselves, or to store it in a database.

Disabling download of php files if PHP is not installed

My university has multiple servers which have the same data mirrored across them, so I can access for instance
foo.uni.edu/file.php
bar.uni.edu/file.php
The thing is, not all servers have PHP installed, so anyone could possibly download my php files if they made the connection through a server which didn't have PHP installed.
Is there a way, possibly with .htaccess to avoid this? As in, only allow opening PHP files if PHP server is installed?
If it's possible to store files outside of the document root, you could work around the problem by storing all sensitive data outside the docroot. You would then have your publicly accessible scripts use include to access those files.
So, if you upload to /username/public_html, and public_html is your document root (e.g., foo.uni.edu/file.php is /username/public_html/file.php), then you would upload to /username/file.php instead and place another script in /username/public_html which merely contains something like include('../file.php');
This is good practice in any case, in case a configuration error on the server ever stops PHP from being parsed.
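For instance, the public stub could be as small as this (paths taken from the example above):
<?php
// /username/public_html/file.php -- public stub inside the docroot
include '../file.php';   // the real logic lives outside the docroot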
You could also try using IfModule and FilesMatch to deny access to PHP files if mod_php isn't enabled:
<IfModule !mod_php.c>
<FilesMatch "\.php$">
Order Deny,Allow
Deny from All
</FilesMatch>
</IfModule>
If this doesn't work, try !mod_php5.c or !mod_php7.c instead, depending on the PHP version. (Note this only checks for mod_php; it won't help on setups where PHP runs via CGI or FPM rather than as an Apache module.)
