I'm using PHP's virtual() function to perform a sub-request to Apache in order to initiate a file download (the files can be heavy, so I can't use readfile()). The files in question are stored in a non-public directory, since the user's permissions need to be checked before allowing a download.
My project is structured as follows:
project/
.htaccess -----> rewrite everything to index.php
index.php -----> framework + application logic + permissions check
private/
.htaccess -----> deny from all
... -----> private files
The first problem I encountered was that the sub-request generated by virtual() was getting rewritten by the first .htaccess, so the download never started. This was easy to fix, since the [NS] (no sub-request) rewrite flag keeps a rule from being applied to sub-requests:
...
RewriteRule ^ index.php [NS]
...
But I still can't make it work because of the other .htaccess (deny from all), which simply rejects all requests and sub-requests.
The question: is there any way to configure .htaccess to deny access from all, except when the request is actually a sub-request coming from the server itself?
First of all, this is a very interesting problem; I had to dig a bit to figure it out.
You can use the apache_setenv() function here.
Have this PHP code before the virtual() call:
apache_setenv('internal', '1'); // sets an Apache environment variable named "internal"
virtual("/private/file.txt");   // example sub-request
exit;
Now inside /private/.htaccess have this snippet:
Order deny,allow
Deny from all
Allow from env=internal
This will deny all requests except those where the environment variable internal is set. That variable is only set in your PHP code, hence only sub-requests will be allowed and all others will be denied.
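If you are running Apache 2.4 with mod_authz_core (an assumption; the snippet above uses the 2.2-style Order/Deny/Allow directives), the equivalent would be:
Require env internal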
I'm very new to PHP and the web. I'm currently learning about OOP in PHP and how to divide my program into classes, each in its own .php file. Until now, all I knew about a PHP program was that I could have files like these in my root folder:
home.php
about.php
products.php
contact.php
So, whenever the client requests any of these in the browser
http://www.example.com/home.php
http://www.example.com/about.php
http://www.example.com/products.php
http://www.example.com/contact.php
No problem, the files will output the proper page to the client.
Now I have a problem: I also have files like these in the root folder
class1.php
class2.php
resources/myFunctions.php
resources/otherFunctions.php
How do I prevent the user from requesting these files by typing something like this in the browser?
http://www.example.com/resources/myFunctions.php
One way I have been thinking of is adding this line at the top of every one of those files: exit;
I also know there is something called .htaccess, an Apache configuration file that affects the way Apache works.
What do real-life applications do to solve this problem?
You would indeed use whatever server side configuration options are available to you.
Depending on how your hosting is set up, you could either modify the include path for PHP (http://php.net/manual/en/ini.core.php#ini.include-path) or restrict the various documents/directories to specific hosts/subnets/no access in the Apache site configuration (https://httpd.apache.org/docs/2.4/howto/access.html).
If you are on shared hosting, this level of lockdown usually isn't possible, so you are stuck with Apache rewrite rules: a combination of an easy-to-handle file naming convention (e.g., classFoo.inc.php and classBar.inc.php), the .htaccess file, and the FilesMatch directive to block access to *.inc.php - http://www.askapache.com/htaccess/using-filesmatch-and-files-in-htaccess/
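A minimal sketch of such a FilesMatch block in .htaccess, assuming the *.inc.php naming convention above (Apache 2.2 syntax, as used elsewhere on this page):
<FilesMatch "\.inc\.php$">
Order allow,deny
Deny from all
</FilesMatch>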
FWIW, all else being equal, the Apache Foundation says it is better/more efficient to do this in the server-side config rather than in .htaccess, if that option is available to you.
A real-life application often uses a so-called public/ or webroot/ folder in the root of the project, in which all files meant to be requested over the web reside.
An .htaccess file in the project root then forwards all HTTP requests to this folder with internal URL rewrites like the following:
# match nothing (www.mydomain.com)
RewriteRule ^$ webroot/ [L]
# or anything else (www.mydomain.com/home.php) that is not already under webroot/
RewriteCond %{REQUEST_URI} !^/webroot/
RewriteRule ^(.*)$ webroot/$1 [L]
.htaccess uses regular expressions to match the request URI (everything in the URL after the hostname) and prefixes it with webroot/, in this example.
www.mydomain.com/home.php becomes www.mydomain.com/webroot/home.php,
www.mydomain.com/folder/file.php becomes www.mydomain.com/webroot/folder/file.php
Note: this will not be visible in the URL in the browser.
When configured properly, all files placed outside of this folder cannot be accessed by a regular HTTP request. Your application (your PHP scripts), however, can still access those private files, because PHP runs on the server and has filesystem access to them.
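For instance, a script inside webroot/ could still read and serve a file that sits outside it; this is just an illustrative sketch with made-up paths and no permission check:
<?php
// webroot/download.php - hypothetical example
// ../private/report.pdf lives outside the webroot, so it can't be requested
// over HTTP directly, but PHP can still read it from the filesystem.
$path = __DIR__ . '/../private/report.pdf';

header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));
readfile($path);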
I'm working on a PHP project using Apache 2.2.22 and PHP 5.3.10 and I'm running into an issue where index and index.php are being treated as the same file.
I have an admin/index.php that redirects to admin/index to allow my mod_rewrite rules in .htaccess to take over and reroute the request into a custom framework. The problem is that when the browser goes to admin/index, it enters an infinite redirect loop, because the request is sent to admin/index.php, which redirects to admin/index again.
I've tried removing the .htaccess file to see if there was a problem with my mod_rewrite rules that was causing it, but it didn't change anything. It just redirects to admin/index endlessly.
I've never heard of this behavior before; skimming some Google results and looking through the Apache configuration files didn't show anything obvious. Has anyone seen this before and know how to fix it?
EDIT:
Below is the code being used by the index.php to redirect to index.
<?php
header("Location: index");
die();
This may be due to MultiViews being enabled:
The effect of MultiViews is as follows: if the server receives a request for /some/dir/foo, if /some/dir has MultiViews enabled, and /some/dir/foo does not exist, then the server reads the directory looking for files named foo.*, and effectively fakes up a type map which names all those files, assigning them the same media types and content-encodings it would have if the client had asked for one of them by name. It then chooses the best match to the client's requirements.
— https://httpd.apache.org/docs/2.2/content-negotiation.html#multiviews
Try adding Options -MultiViews to your .htaccess.
Enable rewrite logging inside Apache and raise the log level. That way Apache will tell you exactly, step by step, which request is rewritten how, in which order, and why.
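On Apache 2.2 (the version mentioned in the question) that would look something like the following in the server or virtual host configuration; the log path is only an example. In Apache 2.4 these directives were replaced by per-module log levels such as LogLevel alert rewrite:trace6.
RewriteLog "/var/log/apache2/rewrite.log"
RewriteLogLevel 9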
I'm working on a solution to a problem where users could potentially access images (in this case PDF files) stored in a folder off the server root. Normally, my application validates users through PHP scripts and sessions. What isn't happening right now is preventing non-logged in users from potentially accessing the PDFs.
The solution I'm looking for would (I think) need to be tied in with Apache. I saw an interesting solution using RewriteMap and RewriteRule; however, the example involved putting this in an .htaccess file in the PDF directory. You can't do that with Apache (error: RewriteMap not allowed here). I believe the rewrite directives need to go in my httpd.conf, which I have access to.
So the example I found (that resulted in 'rewritemap not allowed here') is here:
RewriteEngine On
RewriteMap auth prg:auth.php
RewriteRule (.*) ${auth:$1}
auth.php just checks PHP session and redirects to a login script if needed.
I'm reading that I have to place this in my httpd.conf. How would I specify that the RewriteMap should only apply to a specific directory (including subdirectories)?
First, make sure you really have to put that directly in httpd.conf. On a Debian system, for instance, you have one file per virtual host (a virtual host usually is one website).
So you have to put your RewriteMap setup in a <Directory> section like this:
<Directory /full/path/to/your/pdfs>
RewriteEngine on
...
</Directory>
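Note that the Apache documentation lists RewriteMap's context as server config and virtual host only, so one way to sketch this (paths are placeholders based on the question's example) is to define the map at the virtual host level and reference it from the directory-scoped rule:
RewriteMap auth prg:/full/path/to/auth.php

<Directory /full/path/to/your/pdfs>
RewriteEngine on
RewriteRule (.*) ${auth:$1}
</Directory>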
I am creating a PHP application and I'm having a bit of trouble finding a solution to a problem. I need to completely deny access to anyone trying to access files on a web server via HTTP (likely by returning a 403 Forbidden error or something) unless a certain condition is true; that condition would be checked on a per-connection basis. I need to find the best way to do this. I'm guessing I need to set some special settings in Apache that I can modify with PHP, and these settings must obviously be configurable via PHP. As you can guess, I can write PHP well but have little experience with advanced Apache configurations.
I was thinking that maybe if I used chmod via PHP to change the file's permissions for a validated user and changed them back when the connection closed, it would work. But if there are concurrent connections, the users connecting afterwards would have full access regardless of whether or not they are valid; they could actually just bypass the validation. Maybe there is a better variation of this approach, however.
Thanks very much for the help!
Put your files into a directory and deactivate HTTP access to it via .htaccess. Then write a PHP script that checks the condition and, if it is true, returns the requested file via PHP like this:
<?php
// folder (relative to this script) where the protected files are stored
define("DIR", "save_folder/");

$filename = 'the_file_to_show.pdf';
$fileextension = explode(".", $filename);

header("Content-Type: application/" . $fileextension[1]);
header("Content-Disposition: attachment; filename=" . $filename);
header("Content-Length: " . filesize(DIR . $filename));
readfile(DIR . $filename);
?>
Put an .htaccess file inside save_folder/ containing the following (a <Directory> container is only allowed in the main server configuration, not in .htaccess, so here the directives are used directly):
Order allow,deny
Deny from all
It really depends on the "conditions" that you're checking; however, you won't need to mess with chmod. If the conditions are all related to the HTTP request itself (i.e., send the file based on the file requested, the query string, the IP address accessing it, etc.), then you can do this strictly with .htaccess:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond {...}
RewriteRule (.*) - [F]
</IfModule>
This will return a 403 Forbidden response if the request matches the conditions specified in {...}. See this resource for some examples.
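As a concrete (and entirely hypothetical) illustration, denying everyone outside a given subnet access to PDFs in a protected/ folder might look like this:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
# hypothetical example: only clients from 192.168.0.x may fetch the protected PDFs
RewriteCond %{REMOTE_ADDR} !^192\.168\.0\.
RewriteRule ^protected/.*\.pdf$ - [F]
</IfModule>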
If you need more control or if you want to deny files based on something more specific (for instance- send a 403 error if they are not logged in) then you'll want to redirect to a PHP script.
.htaccess:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule (.*) parse.php
</IfModule>
parse.php:
if({conditions}){
header("HTTP/1.0 403 Forbidden"); // 403 error!
} else {
/* include() the file if it's PHP, otherwise just echo the file contents */
}
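A rough sketch of what such a parse.php could look like for the "not logged in" case; the session key, folder name, and file handling here are made up for illustration:
<?php
// hypothetical parse.php: only logged-in users may fetch the protected files
session_start();

if (empty($_SESSION['logged_in'])) {
    header("HTTP/1.0 403 Forbidden");
    exit;
}

// The rewrite rule above sends every request here; recover the file name
// from the original request and serve it from a protected folder.
$requested = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));
$path = __DIR__ . '/protected/' . $requested;

if (!is_file($path)) {
    header("HTTP/1.0 404 Not Found");
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);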
Last night I made some admin changes to my web server. I use PHP. The PHP processor failed after the update, and if someone went to my homepage, the PHP page would simply download and show the proprietary code and passwords to anyone visiting. So I was wondering if there is a way to prevent any form of download for PHP files using .htaccess, while still allowing normal viewing of the files.
A good pattern to follow during development is to use a minimal initialization file, which invokes the actual application which resides outside the webroot. That way only a minimal stub with no critical information is exposed in a case like this.
Simplified example:
/
/app
critical_code.php
/webroot
.htaccess <- rewrites all requests to index.php
index.php <- invokes ../app/critical_code.php (or other files as requested)
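The stub itself could be as small as this (file names follow the simplified example above):
<?php
// webroot/index.php - minimal stub; all real logic lives outside the webroot
require __DIR__ . '/../app/critical_code.php';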
The trouble here is that either .htaccess is serving your files to the user or it's not. You can't tell it to deny access to the .php files, because then access will be denied during normal use as well. There is no fallback behavior for the PHP processor simply not running correctly.
Maybe it's worth temporarily moving the web root to point to an "under maintenance" site when doing big things like that, to minimize risk as much as possible.
Assuming you're using Apache, your .htaccess file would look something like this.
<FilesMatch ".*\.php">
Order allow,deny
Deny from all
Satisfy All
</FilesMatch>
<IfModule php5_module>
<FilesMatch ".*\.php">
Allow from all
Satisfy All
</FilesMatch>
</IfModule>
The first rule denies access to all .php files. By default, the user will see a 403 (Forbidden) error.
If the PHP5 module loads successfully, the second rule will take effect, which grants access.
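If you are on Apache 2.4 instead (an assumption; the snippet above uses 2.2-style Order/Allow directives), the same idea could be sketched with mod_authz_core; the module name in IfModule depends on the PHP version actually loaded:
<FilesMatch ".*\.php">
Require all denied
</FilesMatch>
<IfModule php7_module>
<FilesMatch ".*\.php">
Require all granted
</FilesMatch>
</IfModule>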