I have always understood (unless I'm mistaken) that Apache's mod_rewrite engine requires
Options +FollowSymLinks
in order to work.
We have used mod_rewrite to hide the .php extension in addresses on a particular system, in order not to reveal the chosen technology (PHP). We understand that one can still work out the server technology, but you'd at least need to know how web servers work, etc.
The problem is, the server techs have raised the risk of using +FollowSymLinks, which I completely understand and agree with.
https://serverfault.com/questions/195570/htaccess-security
Aaron Copley: Symlinks aren't necessarily bad, but you have to have a clear understanding of your implementation of Apache. To a non-chrooted Apache, symlinks certainly pose a significant risk of exposing files outside of your document root.
At the moment the system parses REQUEST_URI as follows:
All rewrite rules are written to index.php
URL: domain.com/request
REQUEST_URI = /request (trimmed to "request")
Using a PHP switch() we check case 'request': include 'xyz.php'; exit; (sketched below)
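A minimal sketch of that front controller (the page script names and the extra case here are hypothetical):

<?php
// index.php - every request is rewritten here by mod_rewrite
// Strip the query string and the leading slash from the request.
$request = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');

switch ($request) {
    case 'request':
        include 'xyz.php';      // hypothetical page script
        exit;
    case 'reports':
        include 'reports.php';  // hypothetical page script
        exit;
    default:
        header('HTTP/1.0 404 Not Found');
        exit;
}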
This is a fairly common technique, but how would I implement the same result without the need for +FollowSymLinks, and without having to go through every script in the system and change navigation links?
mod_rewrite will also work if you enable the following:
Options +SymLinksIfOwnerMatch
This causes Apache to check the owner of the link and the target, and only follow the link if the owners match.
Perhaps your server guys would accept that as a reduced risk?
More info here: http://onlamp.com/pub/a/apache/2004/02/19/apache_ckbk.html
The Apache documentation states:
If your administrator has disabled override of FollowSymLinks for a user's directory, then you cannot use the rewrite engine. This restriction is required for security reasons.
Check this link:
http://httpd.apache.org/docs/current/mod/mod_rewrite.html
OK, I know I'm answering my own question, but I'm going out on a limb...
I should probably have mentioned before that the site will NOT be public, as it is an administrative system, so we don't care about search engines.
Would I be able to do this instead of the existing mod_rewrite implementation:
.htaccess file:
ErrorDocument 404 /index.php
index.php
header("Status: 200 OK");
header("HTTP/1.0 200 OK");
I know this is messy, but we do not have time and the server tech guys will not budge. $_SERVER['REQUEST_URI'] should still contain the same info, right?
Please feel free to comment and down/upvote, but please remember I know this is extremely cowboy and it's merely a temporary workaround.
Important Note
POST requests do NOT work this way, because Apache redirects to index.php (losing the POST data); you could still use GET info.
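To illustrate, a rough sketch of how index.php could combine the status override with the existing routing (the dispatch itself stays whatever it was before; only GET data survives, as noted above):

<?php
// index.php, reached via "ErrorDocument 404 /index.php".
// Apache marked this response as a 404, so override the status first.
header("HTTP/1.0 200 OK");
header("Status: 200 OK");

// REQUEST_URI still holds the path the visitor originally asked for.
$request = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');

// ... existing switch ($request) dispatch continues as before ...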
Related
In the Kohana framework, the .htaccess file contains:
# Protect application and system files from being viewed
RewriteRule ^(?:application|modules|system)\b.* index.php/$0 [L]
OK, but why is this security check needed in each PHP file:
<?php defined('SYSPATH') or die('No direct access allowed.');
An attacker already cannot open any .php file directly, right? (Because it's protected by the .htaccess RewriteRule above.)
Simpy: It's a fallback. That's all.
It seems like the developers wanted to make sure no files can be accessed, whether or not the .htaccess works (e.g. mod_rewrite disabled).
But for files that only contain class definitions or return/define configuration arrays it is pretty useless anyway, since they don't output anything.
You can't be sure - from a framework developer's point of view - that the web server your product will run on is correctly set up (e.g. .htaccess/RewriteEngine not permitted by AllowOverride, or no mod_rewrite ...).
This kind of "second check" is there to ensure that the framework won't leak sensitive data even on a badly set up hosting.
.htaccess works only on Apache with mod_rewrite enabled. If the server does not meet either of these conditions, those SYSPATH checks come in handy.
Note: not every user can use Apache as the web server, and not every user has access to .htaccess.
There are alternatives nowadays, like nginx.
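To make the fallback concrete, a simplified sketch of the convention (not Kohana's actual bootstrap code; paths are illustrative):

<?php
// index.php (front controller): define SYSPATH before including
// any framework file.
define('SYSPATH', realpath(__DIR__ . '/system') . DIRECTORY_SEPARATOR);
require SYSPATH . 'bootstrap.php';

// system/bootstrap.php (and every other framework file) then starts with:
//   <?php defined('SYSPATH') or die('No direct access allowed.');
// Requested directly, SYSPATH is undefined and die() halts the script
// before anything sensitive can be output.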
I am having what I believe is a strange problem. I have several sites developed on the same hosting platform. All sites seem to be fine except for one of them. The website is set up around one page (index.php) that retrieves the correct data to display from the database based on the path info - this has worked for years - but now, on one site, it has stopped working. By "stopped working" I mean the page below now goes to a 404 error - I was under the impression that it should see index.php as the script to use.
I believe this is an issue with the Apache configuration or another file I don't have access to being misconfigured on the host's end. Perhaps someone can shed light on where I might direct them. My own .htaccess file is completely empty:
www.testsite.com/index.php/page1
The above used to go to index.php and then, using $_SERVER['PATH_INFO'], retrieve page1 and get the contents associated with page1 from the database and display them on the page. Can someone confirm I am not going mad - that the above should go to index.php, please? And perhaps also explain why the URL is now seen as non-existent, since it doesn't seem to be going to index.php but to page1. Thanks in advance for any advice.
Can someone confirm I am not going mad - that the above [www.testsite.com/index.php/page1] should go to index.php, please?
Nope. That should look for a file called page1 in the directory index.php in the document root for www.testsite.com.
I think you used to have an .htaccess file that looked something like this:
RewriteEngine on
RewriteRule ^index\.php(/.*)?$ index.php [L]
Another possibility is that MultiViews was previously enabled and now isn't anymore. With MultiViews you also get the behaviour you described. If it's allowed by the host, you can enable it by simply creating an .htaccess file containing:
Options MultiViews
If you put an .htaccess file with either one of the above-mentioned solutions in your document root, you can verify this.
In Apache, if you have AcceptPathInfo on anywhere relevant in the Apache config (including in .htaccess, if the server config allows it) and there's a file /index.php, then /index.php/stuff should indeed go to /index.php, and should set $_SERVER['PATH_INFO'] to "/stuff". The CGI script handler and mod_php* even do this by default, so it should just work unless it's explicitly turned off.
Either way, if it's currently off, you can turn it back on by adding AcceptPathInfo on to your .htaccess file, if AllowOverride FileInfo is set for the site.
I make no promises about other web servers, but PATH_INFO is part of the CGI spec, so I'd think most servers would have a similar setting.
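A quick way to confirm the behaviour from PHP (a sketch; the database lookup is only hinted at):

<?php
// index.php - requested as /index.php/page1
$info = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';

// "/page1" when AcceptPathInfo is honoured; strip the leading slash.
$slug = ltrim($info, '/');

// ... fetch the content for $slug from the database and render it ...
echo 'Requested page: ' . htmlspecialchars($slug);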
I use Minify to minify and cache all my script requests. I only want my users to be able to access the minified versions of the JavaScript files.
Minify lies at www.example.com/min and my scripts are at www.example.com/scripts. How can I block direct access to doc_root/scripts, which is where my unminified JavaScript files lie? I'd rather not put them outside the document root, but it's an option.
Please note that I'm using Zend Framework, so the actual root of my application is shifted to www.example.com/public. An htaccess file handles the rewrite.
Can't you just use an .htaccess file inside doc_root/scripts to prevent all access to .js files over HTTP?
It won't stop minify, since that provides indirect access.
So in doc_root/scripts/.htaccess, something along the lines of
<Files ~ "\.js$">
order allow,deny
deny from all
</Files>
Note that the location of the .htaccess file matters in this case.
You effectively can't block end-user facing code. Even if you served it with PHP or another server-side language and blocked direct requests, it's of course still possible to read it directly with a number of tools.
You should code with this in mind and be mindful of JavaScript comments, business knowledge, etc.
UPDATE:
However, if you're talking about code that doesn't ever need to be accessed by an end user, you could, as you mentioned, move it out of the server root, or you can block individual files (or an entire directory). It's easy with Apache's .htaccess.
Order deny,allow
Deny from all
You could also redirect the source files to the minified versions with mod_rewrite in your .htaccess file.
RewriteEngine On
RewriteRule ^scripts/(.*)$ /min/$1 [L,NC]
Depends on the server you're using. Assuming it's Apache, you can add this to the server configuration (<Directory> sections aren't allowed inside .htaccess files; there, use a <Files> block or a separate .htaccess in the scripts directory instead):
<Directory ~ "/scripts">
Order allow,deny
Deny from all
</Directory>
Or something to that effect...
The only way is to check referrers, and not everyone sends them, or sends a real one. In other words, you can't block direct access by anyone who really wants the file. It's impossible to determine with 100% accuracy whether a request is a direct one or is being made via a <script src=....> type request.
For your JavaScript to actually run, the user's browser must ultimately be able to read it.
As such, there's no real way to "block" access to your scripts folder (well, to be precise you can, but that would break your website, since the browser would no longer see the files in order to run them).
One solution could be obfuscation, which makes the JavaScript code harder to read and understand, but ultimately the user will still see the code, and with a bit of persevering reverse engineering it can be de-obfuscated.
Another thing I've seen someone do is create an "empty" js.html page, insert all their JavaScript into embedded (not external) script tags in that page, and from the main page make an AJAX request to js.html and embed it at the bottom of the page. A roundabout way, but the user will not see the JS when viewing the source, unless they use developer tools such as Firebug.
Note that the last option might also cause some delay depending on the amount of code you are loading. But here the key is not blocking access to your scripts, just making them harder to obtain / read / copy.
Edit: oops, misread as well. I think the best solution in this case would be to go with an .htaccess file in your scripts folder denying all access.
This answer is a little bit newer than the question (only by several years, that's nothing).
You cannot simply deny access to JavaScript files, because then they won't be accessible from <script> tags either.
But I found a workaround:
RewriteEngine On
RewriteRule ^.*\.js$ /invalid.html [R=301,L]
Place it in the .htaccess file in your web root folder (under htdocs or public_html).
This will automatically redirect everyone away from the scripts, so they don't see them.
What is the best way to dynamically redirect a friendly-looking URL on Linux to a dynamic one on Windows?
e.g.
domainone.com/dir/one/two
redirects to
domaintwo.com/index.aspx?a=one&b=two
and
domainone.com/dir/three/four
redirects to
domaintwo.com/index.aspx?a=three&b=four
The HTTP header called Location is how you redirect a user across hosts/domains.
Depending on your server configuration and the mechanism used to generate the HTTP headers, the specific implementation will vary. An example in PHP (as your question appears to be tagged) is to include the following code:
header('Location: http://domaintwo.com/index.aspx?a=one&b=two');
The string above is like any other string, so apply the appropriate logic to produce the desired redirection URL.
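For instance, a small PHP script on domainone.com could map the path segments onto the query string (a sketch; the /dir/ prefix and the a/b parameter names follow the example above):

<?php
// Expects URLs like /dir/one/two
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

// Match "/dir/<a>/<b>" and capture the two segments.
if (preg_match('#^/dir/([^/]+)/([^/]+)#', $path, $m)) {
    header('Location: http://domaintwo.com/index.aspx?a=' . urlencode($m[1])
        . '&b=' . urlencode($m[2]), true, 301);
    exit;
}

header('HTTP/1.0 404 Not Found');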
The same effect is also possible in the domain configuration files (the precise path differs across server software and operating system) or more conventionally in .htaccess files. If you provide more information about your hosting environment, someone will be able to help you devise the rewrite rule you need. I prefer to put this level of smart rewriting in a PHP script, since I think .htaccess files tend to be harder to manage and "read".
From within Apache:
Either in a server configuration file, or more likely in an .htaccess file.
You can use mod_rewrite to do this, but as you want a redirect, it would be more appropriate to use mod_alias and the RedirectMatch statement.
RedirectMatch 301 ^/dir/([^/]+)/([^/]+)$ http://domaintwo.com/index.aspx?a=$1&b=$2
Rewrite variant:
RewriteEngine On
RewriteBase /
RewriteRule ^dir/([^/]+)/([^/]+)$ http://domaintwo.com/index.aspx?a=$1&b=$2 [R=301,L]
Note the use of 301: that is a permanent redirect. Use 302 for a temporary one, or when you always want clients to be redirected rather than going directly to the target on future accesses.
A pretty standard way to approach this on Linux is to use Apache with mod_rewrite (how to install and configure Apache and mod_rewrite, if it's not already set up, will depend on your Linux distribution).
Then, in your Apache configuration, you can add a line like:
RewriteEngine On
RewriteRule ^/dir/([^/]+)/([^/]+) http://www.domaintwo.com/index.aspx?a=$1&b=$2
Without the possibility to access .htaccess, I find myself at a creative impasse. There is no mod_rewriting for me. Nevertheless, I want to be able to do the nice stuff like:
http://www.example.com/Blog/2009/12/10/
http://www.example.com/Title_Of_This_Page
What are my alternatives?
In response to the answers:
I'm building with php5
I don't have access to .htaccess
http://www.example.com/index.php/Blog/ is a known technique, but I'd rather not use it. It shows the PHP, so to speak.
How would I create extensionless PHP files? Would this do the trick?
How much would using the custom 404 technique hurt performance?
If you have permission to set custom error documents for your server, you could use this to redirect 404 requests.
E.g. for Apache (http://httpd.apache.org/docs/2.0/mod/core.html#errordocument)
ErrorDocument 404 /index.php
In index.php you can then process the request using data from the $_SERVER array.
You can also have URLs like
http://domain.com/index.php/Blog/Hello_World
out of the box with PHP5. You can then read the URL parameters using
echo $_SERVER['PATH_INFO'];
Remember to validate/filter the PATH_INFO and all other request variables before using them in your application.
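For instance, a minimal sketch of such filtering:

<?php
// index.php - requested as /index.php/Blog/Hello_World
$info = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';

// Split into segments and keep only harmless characters.
$segments = array();
foreach (explode('/', trim($info, '/')) as $part) {
    if (preg_match('/^[A-Za-z0-9_-]+$/', $part)) {
        $segments[] = $part;
    }
}
// $segments is now e.g. array('Blog', 'Hello_World')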
I know this question is very old but I didn't see anyone else suggest this possible solution...
You can get very close to what you want just by adding a question mark after the domain part of the URL, i.e.:
http://www.example.com/?Blog/2009/12/10/
http://www.example.com/?Title_Of_This_Page
Both of the above HTTP requests will now be handled by the same PHP script:
www.example.com/index.php
and in the index.php script, $_SERVER['QUERY_STRING'] for the two pages above will be, respectively:
Blog/2009/12/10/
Title_Of_This_Page
so you can handle them however you want.
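A sketch of how index.php could pick the route up; note that $_SERVER['QUERY_STRING'] holds exactly the part after the question mark, and is more reliable here than $_GET (PHP would collapse the whole route into a single, partially rewritten array key):

<?php
// index.php - handles /?Blog/2009/12/10/ style URLs
$route = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';
$route = trim(urldecode($route), '/');       // e.g. "Blog/2009/12/10"

$parts = ($route === '') ? array() : explode('/', $route);
if (!empty($parts) && $parts[0] === 'Blog') {
    // $parts is array('Blog', '2009', '12', '10')
    // ... render the blog archive for that date ...
}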
A quite simple way is to:
declare a 404 ErrorDocument (e.g. a PHP script) in .htaccess
parse the request using $_SERVER and see if it corresponds to any known page
if so, replace the HTTP status 404 with status 200 using header() and include index.php (see the sketch below)
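A compact sketch of those three steps (the routing table and the included script are hypothetical):

<?php
// index.php, declared as: ErrorDocument 404 /index.php
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

// Hypothetical routing table: clean path => script to include.
$pages = array('/Title_Of_This_Page' => 'title.php');

if (isset($pages[$path])) {
    header('HTTP/1.0 200 OK');   // replace the 404 status Apache set
    header('Status: 200 OK');
    include $pages[$path];
    exit;
}
// Otherwise fall through: the response keeps its 404 status.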
If you omit the file extension, Apache will serve the first file [alphabetically] which matches that name, regardless of the extension, at least on the two servers I have access to.
I don't know how you might use this to solve your problem, but it may be useful at some point.
For example if
http://www.somesite.com/abc.html and http://www.somesite.com/abc.php both exist and http://www.somesite.com/abc is requested, http://www.somesite.com/abc.html will be served.
The only way is to use a custom 404 page. You have no way to get extensionless files interpreted by PHP without reconfiguring the web server's MIME types. But you say that you can't even edit .htaccess, so there's no other way.
You can write a URI class which parses the user-friendly URL you defined.
If the MultiViews option is enabled, or you can convince whoever holds the keys to enable it, you can make a script called Blog.php that will be passed requests to example.com/Blog/foo and get '/foo' in $_SERVER['PATH_INFO'].