Why is per-file security needed in PHP files when .htaccess already provides it? - php

In the Kohana framework, the .htaccess file contains:
# Protect application and system files from being viewed
RewriteRule ^(?:application|modules|system)\b.* index.php/$0 [L]
OK, but why is this check also needed in each PHP file:
<?php defined('SYSPATH') or die('No direct access allowed.');
An attacker already cannot open any .php file directly, right? (because the .htaccess RewriteRule protects them)

Simply: it's a fallback. That's all.

It seems the developers wanted to make sure no files can be accessed regardless of whether the .htaccess works (e.g. if mod_rewrite is disabled).
But for files that only contain class definitions or return/define configuration arrays it is pretty useless anyway, since they don't output anything.

From a framework developer's point of view, you can't be sure that the web server your product will run on is correctly set up (e.g. .htaccess/RewriteEngine not permitted by AllowOverride, or no mod_rewrite ...).
This kind of "second check" is there to ensure that the framework won't leak sensitive data even on badly set-up hosting.

.htaccess works only on Apache with mod_rewrite enabled. If the server does not meet either of these conditions, those SYSPATH checks come in handy.
Note: not every user can use Apache as the web server, and not every user has access to .htaccess.
There are other alternatives nowadays, such as nginx.
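To make the fallback concrete, here is a minimal sketch of how the guard works (the file name, path, and values are hypothetical, not taken from Kohana itself). The front controller, index.php, defines SYSPATH before including anything, so only requests that enter through it have the constant set; a file requested directly dies on the first line:
<?php
// system/config/database.php -- hypothetical protected config file
defined('SYSPATH') or die('No direct access allowed.');

// Reached only when included via index.php, which does something like
// define('SYSPATH', __DIR__.'/system/') before loading anything else.
// A direct request (when the RewriteRule is not in effect) stops above,
// so these placeholder credentials are never executed or exposed.
return array(
    'hostname' => 'localhost',
    'username' => 'app',
    'password' => 'secret',
);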

Related

How to protect my php files on the server from being requested

I'm very new to PHP and the web. I'm now learning about OOP in PHP and how to divide my program into classes, each in its own .php file. Until now, all I knew about a PHP program was that I might have these files in my root folder:
home.php
about.php
products.php
contact.php
So, whenever the client requests any of these in the browser
http://www.example.com/home.php
http://www.example.com/about.php
http://www.example.com/products.php
http://www.example.com/contact.php
No problem, the files will output the proper page to the client.
Now I have a problem. I also have files like these in the root folder:
class1.php
class2.php
resources/myFunctions.php
resources/otherFunctions.php
How do I prevent the user from requesting these files by typing something like this in the browser?
http://www.example.com/resources/myFunctions.php
One way I have been thinking of is adding this line at the top of every one of those files: exit;
Or, I know there is something called .htaccess, an Apache configuration file that affects the way Apache works.
What do real-life applications do to solve this problem?
You would indeed use whatever server side configuration options are available to you.
Depending on how your hosting is set up, you could either modify the include path for PHP (http://php.net/manual/en/ini.core.php#ini.include-path) or restrict the various documents/directories to specific hosts/subnets/no access in the Apache site configuration (https://httpd.apache.org/docs/2.4/howto/access.html).
If you are on shared hosting, this level of lockdown usually isn't possible, so you are stuck with the Apache rewrite rules: a combination of an easy-to-handle file naming convention (e.g. classFoo.inc.php and classBar.inc.php), the .htaccess file, and the FilesMatch directive to block access to *.inc.php - http://www.askapache.com/htaccess/using-filesmatch-and-files-in-htaccess/
FWIW, all else being equal, the Apache Foundation says it is better/more efficient to do this in the server-side config rather than in .htaccess, if that option is available to you.
A real-life application often uses a so-called public/ or webroot/ folder in the root of the project, in which all files to be requested over the web reside.
This .htaccess file then forwards all HTTP requests to this folder with internal URL rewrites like the following:
# match either nothing (www.mydomain.com)
RewriteRule ^$ webroot/ [L]
# or anything else (www.mydomain.com/home.php)
RewriteRule ^(.*)$ webroot/$1 [L]
These rules use regular expressions to match the request URI (everything in the URL after the hostname) and prepend webroot/ to it, in this example.
www.mydomain.com/home.php becomes www.mydomain.com/webroot/home.php,
www.mydomain.com/folder/file.php becomes www.mydomain.com/webroot/folder/file.php
Note: this will not be visible in the url in the browser.
When configured properly, all files placed outside of this folder cannot be accessed by a regular HTTP request. Your application (your PHP scripts), however, can still access those private files, because PHP runs on your server and therefore has filesystem access to them.
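As a minimal sketch of that layout (paths and class names are hypothetical, assuming the private files from the question are moved next to webroot/ rather than inside it), the front controller in webroot/ is the only script reachable over HTTP, yet it can still include everything else:
<?php
// webroot/index.php -- the only entry point reachable over HTTP
require __DIR__ . '/../resources/myFunctions.php'; // outside webroot, not requestable
require __DIR__ . '/../class1.php';                // hypothetical class file

$page = new Class1();        // class defined in ../class1.php
echo $page->render();        // only what is echoed here reaches the browser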

Trying to set up homepage with .htaccess

So I recently purchased a domain. I know how to make websites, so I uploaded a website that I made onto that domain. The only problem is that it sends me to my index, and then I have to click through some folders to actually see my website on my screen. I know what .htaccess is and does, but I'm not sure how to use it.
What I want is that when I go to www.mydomain.com it should open up my home.php file from the website that I made. This is my file order:
project/PHP/home.php
I'm not sure if I've given enough information, but I hope someone can help me out here.
As correctly written by @JimL in the comments above, we would recommend that you simply rename home.php to index.php, since that is the default name virtually all HTTP servers are configured to use for the index document.
You can however change that, even if you can only use .htaccess style files:
DirectoryIndex home.php
That said, I would still recommend renaming the index file instead. .htaccess style files should be avoided whenever possible: they are notoriously error prone, hard to debug, and they really slow the HTTP server down, often without reason. They are only offered for cases where you really need to make some configuration tweaks but do not have control over the HTTP server's host configuration. That is, for example, often the case when using a cheap web space provider.
Considering the additional information you gave in the comment below, you could also try rewriting all requests to point to the PHP scripts inside that project/PHP folder. For that you can place rewriting rules like these inside a .htaccess style file:
RewriteEngine on
RewriteCond %{REQUEST_URI} !^/project/php
RewriteRule ^(.*)$ project/php/$1 [L,QSA]
If you also have to handle requests that require different rewriting then obviously you need additional rules.
But as already said in the comments, this is painful, slows the server down, and makes things harder to debug.
Put your files in the public_html folder, or in /var/www; you don't need .htaccess to do this.

mod_xsendfile won't work with CGI and mod_rewrite

I am trying to use the Apache module mod_xsendfile to get better performance when streaming files.
The problem is that it only works if I do NOT use the CGI version of PHP AND if I do NOT use rewrite rules for my URLs.
Problem 1: mod_rewrite
Calling this one in the browser works:
http://subdomain.domain.de/image.php
This one will give me a 404:
http://subdomain.domain.de/rewrittenImageUrl
The rewrite rules are working correctly. The 404 error is triggered by the xsendfile module.
If I add an "R" flag to the rule in the .htaccess (as suggested in this question) it works again, because I am redirected to the first address given above. But redirecting is not what I want.
I also looked at this post about symlinks, but I don't think that can be the solution for my case, since I use absolute paths generated with getenv('document_root'). These paths shouldn't involve any symbolic links, should they? Or am I missing something at this point?
Problem 2: CGI
If I switch PHP to the CGI version I get a 0-byte file. This seems to be the same behaviour as if xsendfile were not installed at all.
I have already updated the module to the latest version.
I also tested absolute and relative paths without any success.
In addition, deactivating output compression didn't help.
To complete the given information here is the PHP code I am using:
ini_set('zlib.output_compression', 'Off');
$realImagePath = getenv('document_root')."fixedRelativeImagePathNoParamsNeeded.jpg";
$imageInfos = @getimagesize($realImagePath);
header('Content-Type: '.$imageInfos['mime']);
header("X-Sendfile: $realImagePath");
exit();
Does anyone have a clue?
Update (2014-01-06)
The second problem is solved:
I don't know why, but it works after turning on xsendfile in the Apache config instead of using the .htaccess file. (I will add an answer below as soon as the other problem is solved, too.)
Regarding the first problem:
At first I did not add any options to the httpd.conf, as it should work with the standard configuration. Anyway, I have now asked my provider to add the absolute project path to the XSendFilePath whitelist as a global setting. This temporarily solved the first problem with mod_rewrite. But this doesn't seem to be a real solution for my situation, because I am running many different projects on the server, each with a separate download path. So I would need to ask my provider to add a new path to the config each time I start a new project.
I still just can't use X-Sendfile with mod_rewrite, although I should have access to the document root without any extra settings.
mod_xsendfile does construct absolute paths from relative paths based on the request URI, which might in fact be a sub-request in some cases. This does not always produce the results you expect.
Therefore, I would always use absolute URIs in the X-SENDFILE header.
The second consequence of the above is that the constructed path, not necessarily being what you expect, may auto-whitelist a directory you did not expect, while of course not whitelisting the directory you would have expected.
Therefore, always white-list URIs.
In short, Apache might not consider the same directory to be the current working directory of a request as you (or a backend *CGI). Rewriting only adds to the confusion.
I actually considered dropping the relative paths/auto-whitelist stuff altogether for these reasons, because admittedly it can be very confusing, and it is also under-documented.
I guess it is a little late to drop it entirely, but I should at least mark it deprecated.
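Following that advice, here is a minimal sketch of sending an absolute, symlink-free path (the file name and directory are hypothetical; the base directory must match whatever you whitelisted with XSendFilePath):
<?php
// Resolve to an absolute path with symlinks removed.
$base = realpath($_SERVER['DOCUMENT_ROOT'] . '/downloads');              // hypothetical directory
$file = ($base !== false) ? realpath($base . '/report.pdf') : false;     // hypothetical file

// Refuse anything that does not resolve to a real file inside $base.
if ($file === false || strpos($file, $base . '/') !== 0) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/pdf');
header('X-Sendfile: ' . $file);   // absolute path, matching the whitelist
exit;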
In the .htaccess file of your main website, try using [L,R=301] in place of just [QSA,L] or [L].
For instance, instead of:
RewriteRule ^download/(.*)/$ download.php?x=$1 [QSA,L]
Use:
RewriteRule ^download/(.*)/$ download.php?x=$1 [L,R=301]
Tell me if this resolves your problem!

Protecting images from direct access by checking current PHP session with mod_rewrite

I'm working on a solution to a problem where users could potentially access images (in this case PDF files) stored in a folder off the server root. Normally, my application validates users through PHP scripts and sessions. What isn't happening right now is preventing non-logged in users from potentially accessing the PDFs.
The solution I'm looking for would (I think) need to be tied in with Apache. I saw an interesting solution using RewriteMap & RewriteRule, however the example involved putting this in an .htaccess file in the PDF directory. Can't do that with Apache (error: RewriteMap not allowed here). I believe the rewrite directives need to go in my httpd.conf, which I have access to.
So the example I found (that resulted in 'rewritemap not allowed here') is here:
RewriteEngine On
RewriteMap auth prg:auth.php
RewriteRule (.*) ${auth:$1}
auth.php just checks PHP session and redirects to a login script if needed.
I'm reading that I have to place this in my httpd.conf. How would I specify that the RewriteMap should only occur on a specific directory (including subdirectories)?
First, check whether you really have to put this directly in httpd.conf. On a Debian system, for instance, you have one file per virtual host (a virtual host usually is a website).
RewriteMap itself is only allowed at the server or virtual-host level, so define it there and scope the RewriteRule that uses it with a <Directory> block like this:
<Directory /full/path/to/your/pdfs>
RewriteEngine on
...
</Directory>
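The original auth.php is not shown, so here is only a hypothetical sketch of what a prg: RewriteMap program can look like: Apache starts it once, writes one lookup key per line to its stdin, and reads exactly one answer line from its stdout. Note that a prg: map only sees the key you pass it, so for a real session check the session cookie would have to be included in the key by the RewriteRule (for example ${auth:$1|%{HTTP:Cookie}}); the helper below is a placeholder, not a complete session lookup.
#!/usr/bin/env php
<?php
// Hypothetical prg: RewriteMap program (long-running, started by Apache).
function isLoggedIn($cookie)
{
    // Placeholder: a real check would extract PHPSESSID from $cookie and
    // verify that the corresponding session exists and is authenticated.
    return strpos($cookie, 'PHPSESSID=') !== false;
}

while (($key = fgets(STDIN)) !== false) {
    // Key format assumed here: "<requested-path>|<cookie-header>"
    list($path, $cookie) = array_pad(explode('|', rtrim($key, "\r\n"), 2), 2, '');

    if (isLoggedIn($cookie)) {
        echo $path, "\n";          // let the request through to the PDF
    } else {
        echo '/login.php', "\n";   // hypothetical login script
    }
    fflush(STDOUT);                // Apache waits for exactly one line per key
}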

Enable mod_rewrite On Shared Hosting Apache Server

I have made some changes to a client's website.
The client was continually being attacked using SQL injection, and at the moment the URL contains variables that the website needs (e.g. index.php?filename=home.php).
So, after securing the site as best I could using mysql_real_escape_string() and stripslashes(), I then came to do URL rewriting in Apache.
At the moment, the server the website is currently on doesn't support mod_rewrite (I've checked using phpinfo), and it's not a server belonging to us. Is there anything I can do in my .htaccess file that would enable mod_rewrite for this website?
If mod_rewrite is installed, you can configure it in your local .htaccess file.
Create a file called .htaccess in your site's root folder.
First line should be RewriteEngine On.
Second line should be RewriteBase /.
After that, put in your rewrite rules as required.
If it isn't installed, you're out of luck - no web host will install extra software on a shared hosting box just for one client.
Mick, the best solution for you is to change your code. I'm guessing that in your code you then include the filename specified, e.g.
include $_GET['filename'];
In short, there is no way using mod_rewrite that you can make this secure.
However, you can make it more secure very easily by checking that the filename is valid, e.g.
$valid_filenames = array('home.php', 'foo.php', 'bar.php', /* etc... */);
// Reject anything that is not explicitly whitelisted; the strict flag avoids
// type-juggling surprises and isset() guards against a missing parameter.
if (!isset($_GET['filename']) || !in_array($_GET['filename'], $valid_filenames, true)) {
    echo "Invalid request.";
    exit;
}
include $_GET['filename'];
Just make sure that you validate the requested filename before including it and you'll be much better off.
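A variation on the same idea, sketched with hypothetical page names: map short route keys to files, so the request never carries a filename at all and there is nothing path-like to tamper with.
<?php
// Hypothetical route map: the URL carries a key (index.php?page=home),
// never a filename.
$pages = array(
    'home'     => 'home.php',
    'about'    => 'about.php',
    'products' => 'products.php',
);

$key = isset($_GET['page']) ? $_GET['page'] : 'home';
if (!isset($pages[$key])) {
    http_response_code(404);
    exit('Invalid request.');
}
include __DIR__ . '/' . $pages[$key];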
No, you cannot dynamically load mod_rewrite. Most hosting providers have mod_rewrite enabled on Apache servers. If yours does not, you could ask them to enable it. Otherwise, if you really need mod_rewrite, consider switching hosting providers.
As an alternative, you can rewrite URLs in PHP.
$_SERVER['QUERY_STRING'] can be used for getting the part after the question mark (http://example.com/file.php?this_part).
Split it by your preferred parameter separator (e.g. /, ;) using explode('/', $_SERVER['QUERY_STRING'])
Loop through the values, and split those using a preferred value separator (e.g. '=', ':')
Overwrite $_GET with an empty array, and put the newly generated values in it.
Note: filter_input and related functions do not operate on $_GET. Thus, this method will not work for filter_input.
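A minimal sketch of those steps (the separators and parameter names here are just examples):
<?php
// Example request: http://example.com/file.php?page=home/lang=en
$pairs = explode('/', $_SERVER['QUERY_STRING']);   // ['page=home', 'lang=en']

$_GET = array();                                    // overwrite with an empty array
foreach ($pairs as $pair) {
    if ($pair === '') {
        continue;                                   // ignore empty segments
    }
    list($name, $value) = array_pad(explode('=', $pair, 2), 2, '');
    $_GET[urldecode($name)] = urldecode($value);
}
// $_GET is now array('page' => 'home', 'lang' => 'en').
// Remember: filter_input(INPUT_GET, ...) still reads the original query
// string, so it will not see these values.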
