Right way to secure a web server through .htaccess - php

Hi, I'm new to web programming. I'm developing a web site with a PHP backend application. I'm using Ubuntu 14.04 Server, Apache, PHP 5.5 and MySQL.
Currently this is my directory structure under /var/www/html:
example.com/
    app/ # this dir contains the backend
    src/ # this contains some common stuff shared between the front and back ends
    web/ # this is the 'public directory' which serves the frontend
I have searched a lot about .htaccess, but I can't pin down a definitive solution to secure all the .php files that are not inside the web/ directory. I would also like to "hide" the .php files through URL rewriting (for example, instead of serving mysite.org/accounts.php I would serve mysite.org/accounts, and not by just removing the .php extension, but rather by mapping mysite.org/accounts to a file called, say, youwillneverknowthis.php).
Thanks in advance.
J.

To protect your PHP files, do not put them in the public html directory, because everything in it is accessible over the internet. Files that contain private code should be taken out of that directory and put in private directories such as /var/www/php/src and /var/www/php/app.
Now you want users to be able to get "page.php" without exposing the sensitive PHP code. One solution is to have a script in the public directory /var/www/html, called gateway.php, that looks in the private directory dir for a file named file and executes its contents. The values of $_GET come from the rewrite rule shown further below. Before requiring the file, be sure to sanitize the input to prevent a malicious actor from making gateway.php run an arbitrary script. For instance, you can compare the requested file against a whitelist of allowed files and stop execution if there is no match.
<?php #gateway.php
//whitelist of allowed files (absolute paths); adjust to your own scripts
$safe_scripts = ["/var/www/php/src/a.php", "/var/www/php/src/b.php"];
if (empty($_GET['dir']) || empty($_GET['file'])):
    //the dir or file argument is missing
    http_response_code(400); //bad request
    exit;
else:
    //basename() strips any path components, so "../" traversal is neutralised
    $file = realpath("/var/www/php/" . basename($_GET['dir']) . "/" . basename($_GET['file']));
    if ($file === false || !in_array($file, $safe_scripts, true)):
        //script is not in the whitelist
        http_response_code(403); //Forbidden
        trigger_error("Forbidden script: " . $_GET['file']);
        die("Access to this resource is forbidden");
    endif;
    require $file;
endif;
Your next issue was about URL rewriting, so that mysite.org/accounts can be served by youwillneverknowthis.php. Place a text file named .htaccess in the public document root /var/www/html and add the following directive (one directive per line):
RewriteEngine on
This turns on URL rewriting.
To rewrite a URL, add the following rule (a single line):
RewriteRule ^accounts$ gateway.php?dir=src&file=youwillneverknowthis.php [QSA,L]
Adjust the dir and file arguments as required. This sends the request to gateway.php, which will then read and execute the private code, all behind the scenes. The Query String Append [QSA] flag ensures that any pre-existing query arguments are passed on to gateway.php along with the dir and file arguments. The [L] flag tells the rewrite engine to stop processing rules once it finds a match.
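Putting it together, the complete .htaccess might look like the following minimal sketch (the second rule and its target file name are made up, only to show that you add one RewriteRule per public-facing page):
RewriteEngine on
# each clean URL maps to a privately stored script via gateway.php
RewriteRule ^accounts$ gateway.php?dir=src&file=youwillneverknowthis.php [QSA,L]
RewriteRule ^profile$ gateway.php?dir=src&file=anothersecretname.php [QSA,L]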
Hope that helps.
Edit: improve security by sanitizing input

Related

How to protect my php files on the server from being requested

I'm very new to PHP and the web. Right now I'm learning about OOP in PHP and how to divide my program into classes, each in its own .php file. Until now, all I knew about a PHP program was that I might have files like these in my root folder:
home.php
about.php
products.php
contact.php
So, whenever the client requests any of them in the browser:
http://www.example.com/home.php
http://www.example.com/about.php
http://www.example.com/products.php
http://www.example.com/contact.php
No problem, the files will output the proper page to the client.
Now, I have a problem. I also have files like these in the root folder
class1.php
class2.php
resources/myFunctions.php
resources/otherFunctions.php
How do I prevent the user from requesting these files by typing something like this in the browser?
http://www.example.com/resources/myFunctions.php
The way I have been thinking of is adding this line at the top of every one of those files: exit;
Or, I know there is something called .htaccess, an Apache configuration file that affects the way Apache works.
What do real-life applications do to solve this problem?
You would indeed use whatever server-side configuration options are available to you.
Depending on how your hosting is set up, you could either modify the include path for PHP (http://php.net/manual/en/ini.core.php#ini.include-path) or restrict the various documents/directories to specific hosts/subnets/no access in the Apache site configuration (https://httpd.apache.org/docs/2.4/howto/access.html).
If you are on shared hosting, this level of lockdown isn't usually possible, so you are stuck with the Apache rewrite rules: a combination of an easy-to-handle file naming convention (i.e. classFoo.inc.php and classBar.inc.php), the .htaccess file and the FilesMatch directive to block access to *.inc.php - http://www.askapache.com/htaccess/using-filesmatch-and-files-in-htaccess/
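For instance, a minimal .htaccess sketch along those lines (using the hypothetical *.inc.php naming convention above) could be:
# deny direct HTTP access to any *.inc.php file (Apache 2.4 syntax)
<FilesMatch "\.inc\.php$">
    Require all denied
</FilesMatch>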
FWIW, all else being equal, the Apache foundation says it is better/more efficient to do this in the server-side config rather than in .htaccess, IF that option is available to you.
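If you do control the server config, the same restriction can go into the vhost instead of .htaccess; a sketch, with an illustrative path:
# in httpd.conf / the vhost file, not .htaccess
<Directory "/var/www/example.com/includes">
    Require all denied
</Directory>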
A real-life application often uses a so-called public/ or webroot/ folder in the root of the project, in which all files that may be requested over the web reside.
An .htaccess file in the project root then forwards all HTTP requests to this folder with internal URL rewrites like the following:
# match either nothing (www.mydomain.com)
RewriteRule ^$ webroot/ [L]
# or anything else (www.mydomain.com/home.php)
RewriteRule ^(.*)$ webroot/$1 [L]
.htaccess uses regular expressions to match the request URI (everything in the URL after the hostname) and prepends that with webroot/, in this example.
www.mydomain.com/home.php becomes www.mydomain.com/webroot/home.php,
www.mydomain.com/folder/file.php becomes www.mydomain.com/webroot/folder/file.php
Note: this will not be visible in the url in the browser.
When configured properly, all files placed outside of this folder cannot be accessed by a regular HTTP request. Your application (your PHP scripts), however, can still access those private files, because PHP runs on your server and therefore has filesystem access to them.
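As a small illustration (both the private/ directory and the function name below are hypothetical), a script inside webroot/ can still pull in code that lives outside the web root:
<?php // webroot/home.php
// __DIR__ is the webroot/ folder, so one level up is the project root,
// which is not reachable over HTTP
require __DIR__ . '/../private/functions.php';
echo render_home_page(); // assumed to be defined in private/functions.php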

Is it safe? Overwrite apache2 index.php for security

I want safe links and to hide files from people who don't know the exact file name (like a Dropbox link to a file).
I have a question about Apache2 security related to this:
If I want to have access to files when I know the filename, but don't want other people to get at them by "browsing", is it safe to create an empty index.php in every folder and set Apache2 to show index.php as the default index page?
Without the empty index.php, if I browse www.mytestpage.com/secretfolder/ I get a list of all the secret files there, such as secretfiles12345.zip.
With index.php in place, Apache2 shows an empty page, but I can still access mytestpage.com/secretfolder/secretfiles12345.zip.
Does this guarantee that only people who know the exact filename of secretfiles12345.zip have access to the file?
(very sorry for my bad english :) )
You need to set
Options -Indexes
in Apache2 configuration to prevent directory listings. You can do this in the global configuration file or in .htaccess.
Background: an empty index file (it need not be a PHP file, a .html works just as well) only hides the listing because, when the directory itself is requested, Apache serves the default index document instead. If no index document exists, Apache assumes that you really want to know what files are in the directory and generates the list - unless you instruct it otherwise (with Options -Indexes, see above).
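A minimal .htaccess sketch combining the two (the DirectoryIndex line is optional and shown only for completeness):
# never auto-generate directory listings
Options -Indexes
# which file to serve when the directory itself is requested
DirectoryIndex index.php index.html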

Chicken & Egg: How to keep PHP out of public_html if .htaccess needs to link to it?

I'm trying to build good security habits, so I'd like to move my sensitive PHP files outside the public_html directory. My setup is LAMP (Apache 2.4, PHP 5.5) with the following directory structure:
/home/username/public_html
/home/username/src
The sensitive files are in /src. My previous setup had that folder at /home/username/public_html/src and I had some .htaccess rules directing users there, such as
RewriteRule ^myaccount$ src/account.php
Now that the script is outside public_html, I'm not sure how to point to it from .htaccess (all paths in .htaccess are relative to /home/username/public_html).
I read about the Alias directive in the Apache docs, which would let me set something like Alias /src /home/username/src, but that doesn't work in .htaccess, and I would rather not change httpd[-vhosts].conf unless that's the best way to solve my problem.
SOLUTION
credit: #Idealcastle (below) inspired a solution which allows me to keep my sensitive files out of public_html while still allowing htaccess to link to them.
I changed the .htaccess directive from ^myaccount$ /src/account.php to ^myaccount$ redirect.php?file=account.php. I then created a gateway script that includes the script file I am trying to protect. redirect.php sits in the public_html folder and has the contents below:
<?php #redirect.php
if (empty($_GET['file'])):
    //the file argument is missing
    http_response_code(400); //bad request
    exit;
else:
    //basename() strips any path components, blocking "../" traversal
    $file = basename($_GET['file']);
    $path = "/home/username/src/$file";
    //check the file against a whitelist of approved files; if it's not
    //on the list, exit with http_response_code(403): Access Forbidden
    $whitelist = ["account.php"]; //add every approved script here
    if (!in_array($file, $whitelist, true)):
        http_response_code(403);
        exit("Access to this resource is forbidden");
    endif;
    //pull the file in from its private location
    require $path;
endif;
This code will include the right file based on the value of the file argument set in .htaccess.
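For completeness, the corresponding rule in the public_html .htaccess would look something like this (the [QSA,L] flags are optional and shown only as an example):
RewriteEngine on
RewriteRule ^myaccount$ redirect.php?file=account.php [QSA,L]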
You wouldn't want to redirect to the /src directory or make it publicly accessible, as that defeats the purpose of what you are doing.
If I understand you correctly, how about linking specific files in /public_html to the ones in /src for script execution?
Such as /home/username/public_html/account.php
inside this file you would have
<?php include('/home/username/src/account.php'); ?>
Now the public can only access the files you allow it to, and /src is protected from direct script execution.
And then for your .htaccess, do the same thing as before, but point the rule at the public stub instead of the private script:
RewriteRule ^myaccount$ account.php
(the .htaccess file lives in public_html, so the target resolves to the stub there).

How to prevent http request from displaying resources in a PHP MVC / LAMP

I am restyling my own PHP MVC framework.
Every request is handled by /index.php by default, which triggers the MVC process of routing, executing the request and returning a proper view. Each request is routed according to a single 'q' GET parameter, Drupal-style, like
/index.php?q=anApplication/aController/theAction/arg1/.../moreArguments
This works pretty well, and makes clean URLs easy via mod_rewrite. OK.
I have a directory tree like this:
/public
|----themeName
|--------page.tpl
|--------content.tpl
|--------etc.
/private
|----sourceDirectory
|----moreSources
What I don't want is for files stored in the private and public directories to be served directly in response to an HTTP request: I don't want something like
mySrv/public/themeName/page.tpl
to show a dead template, or any other resource that is not an image, bypassing my core handler - index.php.
I think I could achieve something with a rewrite configuration like this:
RewriteEngine on
RewriteBase /
RewriteRule ^(.*)$ index.php?q=$1 [QSA,L]
but then the framework would only work on mod_rewrite-enabled sites, because this will rewrite all existing and non-existing resources.
What I am asking is: is there another way to have EVERY request, for existing and non-existing resources alike, served by a handler of my choice (such as index.php)?
Thank you
Just store all your templates etc. outside of the public_html folder.
PHP will still have access, but the outside world cannot get to the files.
The easiest and most portable way would be to pull everything except your index.php out of the document root. PHP can still include those files and do everything else.
I have not tried this, but if you put an index.php outside the old document tree
/app
|----new-index.php
/public
/private
|----...
index.php
and then add at the beginning of new-index.php
<?php
chdir('..');
require 'index.php';
?>
and finally reconfigure Apache so that the DocumentRoot actually becomes /app, then everything should work as before -- except that any URL other than '/' stops making sense for Apache, and they can all be made to land on a suitable ErrorDocument 404.
Note: "everything should work", except HTTP redirections. You can still read a non-interpreted PHP file from the filesystem, but you can no longer fetch its interpreted output from, say, Apache on localhost. Also, you ought to verify that the existing code does not make use of the DOCUMENT_ROOT server variable; if necessary you may overwrite it.
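For reference, a minimal sketch of that vhost change (server name and paths are illustrative) could be:
<VirtualHost *:80>
    ServerName example.com
    # the new document root contains only new-index.php
    DocumentRoot /var/www/project/app
    <Directory /var/www/project/app>
        Require all granted
    </Directory>
</VirtualHost>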

Protect files in directory using authentication script in php/apache

I'm looking for a way to tell Apache that if there is a request for a file from a certain directory, it should first run a PHP script to verify that the user is logged in.
I know I could put the directory outside of the docroot and let a PHP script handle the authentication and file downloads, but because these are Flash files that try to open other Flash files, the directory has to be inside the docroot, and the files should not have to be sent by the PHP script.
In the old setup we were using mod_auth_script (http://sourceforge.net/projects/mod-auth-script/), but as that is a rather obscure Apache module I'd rather have a more common solution if possible.
You can use .htaccess and mod_rewrite to redirect requests to a PHP script. Try some googling and you will find lots of examples.
Example .htaccess contents:
Options +FollowSymLinks
RewriteEngine on
RewriteRule ([0-9a-z_-]+)\.swf$ checkForAuth.php?file=$1 [L]
This will call checkForAuth.php when someone tries to access a *.swf file. In checkForAuth.php you need to check your session, read the requested file name from $_GET['file'], set the correct headers (the Flash content type) and output the contents of the requested SWF file.
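A minimal sketch of such a checkForAuth.php; the session key and the directory holding the SWF files are assumptions, adapt them to your own setup:
<?php // checkForAuth.php
session_start();
if (empty($_SESSION['user_id'])):
    // not logged in according to our (assumed) session variable
    http_response_code(403);
    exit('Not authorised');
endif;
// the rewrite rule captured the name without ".swf", so re-append it;
// basename() blocks "../" path tricks
$file = basename($_GET['file']) . '.swf';
$path = __DIR__ . "/flash/$file"; // "flash" directory name is illustrative
if (!is_file($path)):
    http_response_code(404);
    exit('File not found');
endif;
header('Content-Type: application/x-shockwave-flash');
header('Content-Length: ' . filesize($path));
readfile($path);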
