I'm working on a PHP project using Apache 2.2.22 and PHP 5.3.10 and I'm running into an issue where index and index.php are being treated as the same file.
I have an admin/index.php that redirects to admin/index so that my mod_rewrite rules in .htaccess can take over and route the request into a custom framework. The problem is that when the browser goes to admin/index, it enters an infinite redirect loop: the request is sent to admin/index.php, which redirects back to admin/index.
I've tried removing the .htaccess file to see whether my mod_rewrite rules were causing it, and it didn't change anything; it still redirects to admin/index endlessly.
I've never heard of this behavior before, and skimming some Google results and the Apache configuration files didn't turn up anything obvious. Has anyone seen this before, and do you know how to fix it?
EDIT:
Below is the code used by index.php to redirect to index.
<?php
header("Location: index"); // relative redirect to the extensionless URL
die();
This may be due to MultiViews being enabled:
The effect of MultiViews is as follows: if the server receives a
request for /some/dir/foo, if /some/dir has MultiViews enabled, and
/some/dir/foo does not exist, then the server reads the directory
looking for files named foo.*, and effectively fakes up a type map
which names all those files, assigning them the same media types and
content-encodings it would have if the client had asked for one of
them by name. It then chooses the best match to the client's
requirements.
— https://httpd.apache.org/docs/2.2/content-negotiation.html#multiviews
Try adding Options -MultiViews to your .htaccess.
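For example, at the top of admin/.htaccess (a minimal sketch; the rewrite lines are only illustrative, your existing rules stay as they are):

Options -MultiViews
RewriteEngine On
# illustrative front-controller rule; keep your own rules here
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]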
Enable rewrite logging inside Apache and raise the log level. That way Apache will tell you exactly, step by step, how each request is rewritten, in which order, and why.
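On Apache 2.2 that means something like this in the server or virtual-host configuration (RewriteLog is not allowed in .htaccess; the log path is only an example):

RewriteLog "/var/log/apache2/rewrite.log"
RewriteLogLevel 3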
Related
I'm using PHP's virtual() function to perform a sub-request to Apache, in order to initiate a file download (the files can be large, so I can't use readfile()). The files in question are stored in a non-public directory, since the user's permissions need to be checked before allowing a download.
My project is structured as follows:
project/
.htaccess -----> rewrite everything to index.php
index.php -----> framework + application logic + permissions check
private/
.htaccess -----> deny from all
... -----> private files
The first problem I encountered was that the sub-request generated by virtual() was getting rewritten by the first .htaccess, so the download never started. This was easy to fix, since the [NS] (no sub-request) rewrite flag keeps a rule from being applied to sub-requests:
...
RewriteRule ^ index.php [NS]
...
But I still can't make it work because of the other .htaccess (deny from all), which simply rejects all requests and sub-requests.
The question: is there any way to configure .htaccess to deny access from all, except when the request is actually a sub-request coming from the server itself?
First of all, a very interesting problem. Had to dig a bit to figure it out.
You can utilize the apache_setenv() function here.
Have this PHP code before the virtual() call:
apache_setenv('internal', '1'); // sets an Apache environment variable named internal
virtual("/private/file.txt");   // example sub-request
exit;
Now inside /private/.htaccess have this snippet:
Order deny,allow
Deny from all
Allow from env=internal
This will deny all requests except those where the environment variable internal is set to 1. That variable is only set in your PHP code, hence only sub-requests will be allowed and all others will be denied.
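For completeness, the calling script might look roughly like this; the permission check is a placeholder for whatever your application does, not part of the original answer:

<?php
// Placeholder permission check -- substitute your application's real logic.
function user_may_download($file)
{
    return isset($_SESSION['user_id']);
}

session_start();

if (!user_may_download('file.txt')) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}

apache_setenv('internal', '1'); // mark the request so /private/.htaccess lets the sub-request through
virtual('/private/file.txt');   // Apache streams the file via a sub-request
exit;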
I have Apache 2.2 set up to accept PUTs and funnel them to a specific handler script, /put.php, as shown below in the Directory directive in httpd.conf:
<Directory />
Options FollowSymLinks
AllowOverride All
Script PUT put.php
</Directory>
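(For reference, put.php itself isn't shown here; a minimal handler along these lines just reads the raw request body. The target directory and naming scheme below are assumptions, not my actual code.)

<?php
// Minimal PUT handler sketch: copy the raw request body into a file.
$name = basename($_SERVER['REQUEST_URI']);
if ($name === '' || $name === '/') {
    $name = 'upload.dat'; // fall back when the PUT is sent to "/"
}
$in  = fopen('php://input', 'rb');
$out = fopen('/var/www/uploads/' . $name, 'wb'); // assumed target directory
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
header('HTTP/1.1 201 Created');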
This configuration has always worked in the past, no matter the request, as long as the method was PUT. I used curl to validate it with a request URL of "/", which pointed to index.html.
I recently found a need to convert index.html to index.php to do some session handling, and suddenly my PUT requests stopped being handled by /put.php as soon as the file became index.php.
I realize that one solution is to point all PUT requests to /put.php, but we have an app that is hard-coded to send them to /, which no longer works since the change to index.php.
It'd be nice if PUT requests aimed at index.php could still be handed off to /put.php, but I haven't been able to find a way.
The Apache logs show the PUT requests being handled (a 201 response and no error), but the request is never passed to /put.php as it used to be.
I also tried leaving the page as HTML and adding the following line to httpd.conf before the "Script PUT /put.php" directive:
AddType application/x-httpd-php .html
which parsed the HTML page with the PHP parser, but I got the same effect (no hand-off to put.php) as when the page was named index.php and parsed by PHP.
Has anyone encountered this before or have any ideas? It is as if, once index.html becomes index.php and is sent to the PHP parser, it can no longer dispatch using the "Script PUT" directive.
I have always understood (unless I'm mistaken) that Apache's mod_rewrite engine requires
Options +FollowSymLinks
in order to work.
We have used mod_rewrite to hide the .php extension in addresses on a particular system, so as not to reveal the chosen technology (PHP). We understand that one can still discover the server technology, but you'd at least need to know how web servers work.
The problem is that the server techs have raised the risk of using +FollowSymLinks, which I completely understand and agree with.
https://serverfault.com/questions/195570/htaccess-security
Aaron Copley: Symlinks aren't necessarily bad but you have to have a clear understanding of your implementation of Apache. To a non-chrooted
Apache, symlinks certainly pose a significant risk to exposing files
outside of your document root.
At the moment the system parses REQUEST_URI as follows:
All rewrite rules route to index.php
URL: domain.com/request
REQUEST_URI = /request (trimmed to "request")
Using a PHP switch() we check case 'request': include xyz.php; exit; (roughly as in the sketch below)
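A minimal sketch of that dispatcher (the case names and included files are placeholders, not the real application code):

<?php
// index.php -- every request is rewritten here by mod_rewrite.
$request = trim($_SERVER['REQUEST_URI'], '/'); // e.g. "request"

switch ($request) {
    case 'request':
        include 'xyz.php';
        exit;
    default:
        header('HTTP/1.0 404 Not Found');
        exit;
}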
This is a fairly common technique, but how would I implement the same result without the need for +FollowSymLinks, and without having to go through every script in the system and change navigation links?
mod_rewrite will also work if you enable the following:
Options +SymLinksIfOwnerMatch
This causes Apache to check the owner of the link and the target, and only follows the link if the owners match.
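For example, the .htaccess that hides the .php extension could request the option explicitly (a sketch; your actual rewrite rules will differ):

Options +SymLinksIfOwnerMatch
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ index.php [L,QSA]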
Perhaps your server guys would accept that as a reduced risk?
More info here: http://onlamp.com/pub/a/apache/2004/02/19/apache_ckbk.html
The Apache documentation states:
If your administrator has disabled override of FollowSymLinks for a user's directory, then you cannot use the rewrite engine. This restriction is required for security reasons.
Check this link:
http://httpd.apache.org/docs/current/mod/mod_rewrite.html
OK, I know I'm answering my own question, but I'm going out on a limb...
I should probably have mentioned before that the site will NOT be public, as it is an administrative system, so we don't care about search engines.
Would I be able to do this instead of the existing mod_rewrite implementation:
.htaccess file:
ErrorDocument 404 /index.php
index.php:
<?php
header("Status: 200 OK");       // for CGI/FastCGI-style environments
header("HTTP/1.0 200 OK");      // override the 404 status Apache would otherwise send
I know this is messy, but we do not have time and the server techs will not budge. $_SERVER['REQUEST_URI'] should still contain the same info, right?
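Put together, the idea is roughly this (a sketch, not the full dispatcher):

<?php
// index.php, invoked by Apache as the 404 handler.
header("Status: 200 OK");
header("HTTP/1.0 200 OK"); // override the 404 status

// REQUEST_URI still holds the path the client actually asked for,
// so the existing switch()-based dispatch can stay unchanged.
$request = trim($_SERVER['REQUEST_URI'], '/');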
Please feel free to comment and down/upvote, but please remember I know this is extremely cowboy and it's merely a temporary workaround.
Important Note
POST requests do NOT work this way, because Apache redirects to index.php (losing the POST data); you could still use GET data.
The problem is the following:
There is one server that I deploy to, and for some reason it does not respond to URLs as usual. When I have a file called somefile.php uploaded to mysite.com/ and I type mysite.com/somefile in the browser, somefile.php gets called instead of returning a 404. This is weird, and it prevents my .htaccess rewrites from working correctly: somefile.php gets called, and if there is anything after mysite.com/somefile, such as mysite.com/somefile/someotherfile, someotherfile gets ignored and somefile.php is displayed. I have deleted all other .htaccess files, even in the server's parent directories, and still get the same result. I hope you can help me.
On localhost this problem is not observed. I get 404 not found as I should...
Sounds like you have MultiViews currently enabled. Try disabling it.
MultiViews
MultiViews is a per-directory option, meaning it can be set
with an Options directive within a <Directory>, <Location>, or <Files>
section in httpd.conf, or (if AllowOverride is properly set) in
.htaccess files. Note that Options All does not set MultiViews; you
have to ask for it by name.
The effect of MultiViews is as follows: if the server receives a
request for /some/dir/foo, if /some/dir has MultiViews enabled, and
/some/dir/foo does not exist, then the server reads the directory
looking for files named foo.*, and effectively fakes up a type map
which names all those files, assigning them the same media types and
content-encodings it would have if the client had asked for one of
them by name. It then chooses the best match to the client's
requirements.
http://httpd.apache.org/docs/2.2/content-negotiation.html#multiviews
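If that is the case, disabling it in the document root's .htaccess (assuming the host's AllowOverride permits Options) should bring the normal 404 behaviour back:

Options -MultiViews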
I am having what I believe is a strange problem. I have several sites developed on the same hosting platform. All sites seem to be fine except for one of them. The website is set up around one page (index.php) that retrieves the correct data to display from the database based on the path_info; this has worked for years, but on one site it has stopped working. By "stopped working" I mean the page below now returns a 404 error. I was under the impression that it should see index.php as the script to use.
I believe this is an issue with the host's Apache configuration or another file I don't have access to being misconfigured on their end. Perhaps someone can shed light on where I might direct them. My own .htaccess file is completely empty:
wwww.testsite.com/index.php/page1
The above used to go to index.php, which then used $_SERVER's path_info to retrieve page1 and display the contents associated with page1 from the database. Can someone confirm I am not going mad, and that the above should go to index.php? And perhaps explain why the URL is now treated as non-existent, since it doesn't seem to be going to index.php but to page1. Thanks in advance for any advice.
Can someone confirm I am not going mad - that the above [wwww.testsite.com/index.php/page1] should go to index.php please?
Nope. That should look for a file called page1 in the directory index.php in the document root for www.testsite.com.
I think you used to have an .htaccess file that looked something like this:
RewriteEngine on
RewriteRule ^index.php(.*)$ index.php
Another possibility is that MultiViews was previously enabled and now isn't anymore. With MultiViews you also get the behavior you describe. If your host allows it, you can enable it by simply creating an .htaccess file containing:
Options MultiViews
If you put an .htaccess file with either of the abovementioned solutions in your document root, you can verify this.
In Apache, if you have AcceptPathInfo on anywhere relevant in the Apache config (including in .htaccess, if the server config allows it) and there's a file /index.php, then /index.php/stuff should indeed go to /index.php, and should set $_SERVER['PATH_INFO'] to "/stuff". The CGI script handler and mod_php* even do this by default, so it should just work unless it's explicitly turned off.
Either way, if it's currently off, you can turn it back on by adding AcceptPathInfo on to your .htaccess file, if AllowOverride FileInfo is set for the site.
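A minimal way to check, assuming AllowOverride FileInfo is indeed allowed: put this in .htaccess

AcceptPathInfo On

and this in index.php

<?php
// For a request to /index.php/page1 this should print "/page1".
echo isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '(no PATH_INFO)';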
I make no promises about other web servers, but PATH_INFO is part of the CGI spec, so I'd think most servers have a similar setting.