PHP Security - GET Variables, URL safe? - php

I am creating a very basic iPhone simulator and what I want to do is just have it in one location, and then any site that we have and want to put it on, we would just call it using: http://www.example.com/iphone-test.php?url=http://www.example.com/mobile/
Is there anything I need to look out for that could be un-safe? There is no database involved or anything, but just in case someone wanted to mess around and put some stuff in the URL, what are some things I can do to help make this a little more safe?
Here is my code:
<?php
if(isset($_GET['url'])) {
$url = $_GET['url'];
?>
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>iPhone Test</title>
<style type="text/css">
#iphone {
background:url(iPhone.png) no-repeat;
width:368px; height:706px;
position:relative;
overflow:hidden;
}
#iphone iframe {
position:absolute;
left:30px;
top:143px;
border:0;overflow:hidden;
}
</style>
</head>
<body>
<div id="iphone">
<iframe src="<?=$url;?>" width="307" height="443"><p>Your Browser does not support iFrames.</p></iframe>
</div>
</body>
</html>
<?php
}
?>
Edit: Thanks for all of your help. I did some research and here is what I have so far:
<?php
include_once 'filter.php';
$filter = new InputFilter();
if(isset($_GET['url'])) {
if (filter_var($_GET['url'], FILTER_VALIDATE_URL)) {
$url = $filter->process($_GET['url']);
?>
Source: http://oozman.com/php-tutorials/avoid-cross-site-scripting-attacks-in-php/
Class: http://www.phpclasses.org/browse/file/8941.html
What do you think?

You should use PHP's filter_var to check it's valid...
if (isset($_GET['url'])) {
if (filter_var($_GET['url'], FILTER_VALIDATE_URL)) {
$url = $_GET['url'];
}
}

If this page is accessible to anyone then you are opening yourself up to XSS and phishing redirects. For example, try adding this to your URL params:
?url="></iframe><script>alert(123)</script>
In Firefox 6.0.2 that fires off the alert, which means that arbitrary JS could be fired and used to redirect users who think they are visiting your site. Or it could be used to steal cookies that are not marked HttpOnly.
This can be mitigated by encoding for HTML attributes, which OWASP describes as follows:
Except for alphanumeric characters, escape all characters with ASCII values less than 256 with the &#xHH; format (or a named entity if available) to prevent switching out of the attribute. The reason this rule is so broad is that developers frequently leave attributes unquoted. Properly quoted attributes can only be escaped with the corresponding quote. Unquoted attributes can be broken out of with many characters, including [space] % * + , - / ; < = > ^ and |.
Reference: https://www.owasp.org/index.php/XSS_(Cross_Site_Scripting)_Prevention_Cheat_Sheet#RULE_.232_-_Attribute_Escape_Before_Inserting_Untrusted_Data_into_HTML_Common_Attributes
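In PHP, a minimal sketch of this rule applied to the iframe's src attribute (assuming the attribute stays double-quoted, as in the code above) is htmlspecialchars() with ENT_QUOTES:

```php
<?php
// Escape the user-supplied URL for the HTML attribute context.
// ENT_QUOTES encodes both double and single quotes, so the value
// cannot terminate a quoted attribute early.
$url = isset($_GET['url']) ? $_GET['url'] : '';
$safeUrl = htmlspecialchars($url, ENT_QUOTES, 'UTF-8');
?>
<iframe src="<?php echo $safeUrl; ?>" width="307" height="443"></iframe>
```

Note that this only stops breaking out of the attribute; it does nothing about someone supplying a javascript: or phishing URL, which is the separate problem discussed next.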
Now, for your other issue, which the above will not address: if you allow just any arbitrary URL to be entered, then there is nothing stopping someone from doing something like this:
?url=http://myevilsite.com/redirect.php
And have that page redirect the user:
window.top.location.href = "http://www.site.com";
The only thing you can do about that is to use a white list of acceptable URLs.
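A sketch of such a whitelist in PHP, with example hostnames (adjust to the domains you actually serve):

```php
<?php
// Allow only http/https URLs whose host appears on an explicit whitelist.
// The host names used in the tests below are illustrative placeholders.
function isAllowedUrl($url, array $allowedHosts) {
    if (filter_var($url, FILTER_VALIDATE_URL) === false) {
        return false;
    }
    $parts = parse_url($url);
    $schemeOk = isset($parts['scheme'])
        && in_array(strtolower($parts['scheme']), array('http', 'https'), true);
    $hostOk = isset($parts['host'])
        && in_array(strtolower($parts['host']), $allowedHosts, true);
    return $schemeOk && $hostOk;
}
```

An exact host comparison like this also rejects lookalikes such as www.example.com.evil.net, which a simple prefix check would let through.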

The request method does not provide any security features; please see this answer.
You need to be 100% sure beyond doubt that the framed page will not hijack that frame with JavaScript, lead the person to a malicious domain, or lead to a request form on your page that does something they don't want.
You can see some examples out here of that:
<IFRAME SRC="javascript:alert('XSS');"></IFRAME>
<iframe src=http://ha.ckers.org/scriptlet.html"></IFRAME>
You could have the person led to post something as well...
Anyway, I can steal all your cookies, or trick the user into thinking they're still on your site while really being on mine, where I get all their precious information.
Just stop - before you go any further you need to know what you're doing, and you can start by reading this.

You are vulnerable to cross site scripting.
All someone has to do is include "></iframe><script>EEEEVVvvvillll!!!! in the URL, or use a non-HTTP/HTTPS URI.

Here is a list for the security concerns involved here..
IFrames security summary
Wednesday, 24 October 2007
I've decided to collect the various proofs of concept I've done and summarise why iframes are a security risk. Here are the top reasons:-
Browser cross domain exploits
Description:- Because you can embed another web site inside your page, you can exploit that page and perform actions as that user, doing anything on the chosen web site.
Proof of concept:- Safari beta 3.03 zero day
XSS/CSRF reflection attacks
Description:- Using iframes embedded in a compromised site, an attacker can then reflect attacks to other servers, making the attacks difficult to trace while keeping a focal point from which to conduct them.
Proof of concept:- None available for this type of attack as it would
be difficult to show the method without actually conducting an attack.
CSS and iframes can scan your LAN from the internet!
Description:- By exploiting features in CSS and using iframes to check if the default IP address exists, it's possible to get your network address range quite easily, provided the network device uses the default out-of-the-box IP address.
Proof of concept:- CSS LAN scanner
LAN scanning with Javascript and iframes
Description:- Using a similar method as above it is possible to gain
your LAN information using Javascript.
Proof of concept:- Javascript LAN scanner
CSS iframe overlays
Description:- Iframes can be embedded inside each other in Firefox, and you can alter their appearance to create seamless overlays with any site. This makes it very difficult for a user to know which site they are interacting with, and can fool them into performing an action.
Proof of concept:- Verisign OpenID exploit (now fixed)
URL redirection
Description:- Iframes also allow you to perform redirection, so you can gain access to URLs which normally wouldn't be accessible. In the Delicious example, the POC redirects from delicious/home to your account bookmarks and then uses CSS overlays to display your first bookmark. Firefox and a Delicious account are required for the POC.
Proof of concept:- Delicious CSS overlay/Redirection
Original here
You could make it a lot safer by acting like a proxy: load the requested URL in PHP, strip everything dangerous out of the HTML (JavaScript etc.), and then point the iframe at a page on your own server that displays the result.
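A very rough sketch of that stripping step (the helper name is made up, and regex-based stripping is genuinely crude - for production use, a real sanitizer such as HTML Purifier would be the safer choice):

```php
<?php
// Crude, illustrative-only removal of active content from fetched HTML.
function stripActiveContent($html) {
    // Drop <script>...</script> blocks entirely.
    $html = preg_replace('#<script\b[^>]*>.*?</script>#is', '', $html);
    // Drop inline event handler attributes such as onclick="..." .
    $html = preg_replace('/\son\w+\s*=\s*("[^"]*"|\'[^\']*\'|[^\s>]+)/i', '', $html);
    return $html;
}

// Hypothetical usage: fetch the requested page server-side, strip it,
// and point the iframe at the stripped copy hosted on your own domain.
// $page = file_get_contents($requestedUrl);
// echo stripActiveContent($page);
```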

Simply use strip_tags() to avoid any evil script entry:
$url = strip_tags($_GET['url']);


Detect SSL when proxy *always* claims a secure connection

I want to detect whether or not a user is viewing a secure page and redirect if not (for logging in).
However, my site travels through a proxy before I see the server variables and the proxy (right now) is telling me that $_SERVER['HTTPS'] is 'on' when the URI clearly indicates otherwise. It also shows 'on' when the user is navigating 'securely'.
Navigating through http:// and https:// both output that $_SERVER['SERVER_PORT'] = 443.
I don't have the ability to make any changes to the proxy so I want to know:
Does PHP have any other options for me to detect the truth or...
Am I stuck to resort to JavaScript's mechanisms for detection and redirection.
I mined this question for ideas but they mostly revolve around the $_SERVER['HTTPS'] variable being trustworthy. Bah!
It appears that this question is experiencing at least something similar, but s/he was able to resolve it by adapting an Apache solution.
Are there any other PHP SERVER variables or tricks available to detect what the user's URI begins with? The only difference between the $_SERVER variables when my site is viewed http versus https are the following:
_FCGI_X_PIPE_ (appears random)
HTTP_COOKIE (sto-id-47873 is included in the non-secure version but I did not put it there)
REMOTE_ADDR (This and the next two keep changing inexplicably!)
REMOTE_HOST
REMOTE_PORT ('proxy people', why are you continually changing this?)
Are any of these items strong enough to put one's weight upon without it splintering and causing pain later? Perhaps I shouldn't trust anything as filtered through the proxy since it could change at any given time.
Here is my plan to use JavaScript for this purpose; is it the best I have?
function confirmSSL() {
if(location.protocol != "https:") {
var locale = location.href;
locale = locale.replace(/http:\/\//,"https://");
location.replace(locale);
}
}
<body onLoad="confirmSSL()">...
I think if the user has JavaScript disabled in my community, then they hopefully know what they are doing. They should be able to manually get themselves into a secure zone. What sort of <noscript> suggestions would be commonplace / good practice? Something like this, perhaps?:
<noscript>Navigate using https://blah.more.egg/fake to protect your information.</noscript>
PHP solutions that work (with good explanation) will be given preference for the correct answer. Feel free to submit a better JavaScript implementation or link to one.
Many thanks!
Although already partially discussed in the question's comments, I'll summarize some suggestions concerning the redirection logic in JavaScript:
Generally using window.location instead of location is advisable, an explanation can be found here.
A regex seems like overkill for a simple replacement of the protocol.
The redirection logic should be executed as soon as possible, because in the event of redirection, every additional document processing is unnecessary.
Browsers with JavaScript disabled should at least show a notification prompting the user to switch to https.
I suggest using the following code (adopted from here), which is short and efficient:
<head>
<script type="text/javascript">
if (window.location.protocol != "https:") {
window.location.href = "https:" + window.location.href.substring(window.location.protocol.length);
}
</script>
...
</head>
<body>
...
<noscript>Please click here to use a secure connection!</noscript>
...
Just use the client-side approach. If your proxy is non-configurable, then that option is out. Detecting and redirecting via js is fine.
There is also a way to achieve the redirection without client side javascript. This method can be especially helpful, if JavaScript is disabled in the client's browser.
The steps are pure PHP and pretty simple:
Start a session
If it's a fresh session, redirect to the https location
If the session isn't new, it can be assumed, that the user has been redirected
Example code:
<?php
session_start();
if( !isset($_SESSION['runningOnHttps']) ) {
$_SESSION['runningOnHttps'] = true;
header('Location: https://my-cool-secure-site.com');
}
?>
Naturally, you could restrain this functionality to those browsers with JavaScript disabled in order to create a kind of 'hybrid mode': Whenever there is a new session with a non-JS browser, make a request to some kind of callback script that notifies the server to send a location header:
some_landingpage.php sends an initial <noscript> containing a hidden iframe, which will load redirect.php:
if( !isset($_SESSION['checkWasMade']) ) {
$_SESSION['checkWasMade'] = true;
echo '<noscript>
<iframe src="redirect.php" style="visibility: hidden; position: absolute; left: -9000px;"></iframe>
</noscript>';
}
A request to redirect.php will let you know, that JavaScript is disabled and give you the chance to force redirection by sending a Location header (see above) with the next actual request.
As a matter of course, this method will only work reliably, if the protocol won't change (magically?) during one session.
UPDATE:
All of the above-mentioned methods for handling non-JavaScript user agents can be rendered moot by an even neater approach:
I just learned, that <noscript> can also be included inside the <head>, which allows one to just redirect via <meta> tags.
Hence, some_landingpage.php could send an initial meta-refresh inside <noscript>:
// The following echo must appear inside the html head
if( !isset($_SESSION['checkWasMade']) ) {
$_SESSION['checkWasMade'] = true;
echo '<noscript>
<meta HTTP-EQUIV="REFRESH" content="0; url=https://my-cool-secure-site.com">
</noscript>';
}

Redirect a user to an external site while linking to an internal page?

How can I redirect a user to an external site while linking to an internal page ?
I have seen examples like:
example.com/go/ksdjfksjdhfls
example.com/?go=http://www.new-example.com
... And many more...
How this is achieved in php ?
Does this have any pros/cons with regards to SEO ?
I don't see any benefit in this approach, but there are a few ways to achieve it. To do it with the GET query, you would simply need the following code:
HTML:
<a href="link.php?site=http://www.google.com">Google!</a>
PHP:
if (filter_var($_GET['site'], FILTER_VALIDATE_URL)) {
header('Location: ' . $_GET['site']);
}
With the above example, it will actually take the user to that location, not to:
http://example.com/link.php?site=http://www.google.com
To achieve the url being "local" while pulling up a remote site, you'd either have to:
Mess with URL rewriting, which can get messy and I'm not sure will let you do the above
Retrieve the remote page via curl and display it, which may screw up links on the "remote" page
Use iframes and set the iframe to be the size of the page. Note that this last method, while least offensive, is recognized as a potential security breach known as 'clickjacking', since it's used to trick users into clicking on a link for one page which is hiding a malicious link to another site. Many servers and browsers are taking steps to avoid this (for instance, Google does not allow iframing of its home page), so this may also reach dead ends.
So of the three server-side methods I can think up, one may or may not be possible, and is a pain. One will be crippled and put a heavy load on the server. The last is a known bad guy and is likely not to work for many cases.
So I'd just go with a redirect, and really, if you don't need the address bar to show the local URL, then I'd just have a direct link.
All of this raises the question: what are you hoping to accomplish?
Put this at the beginning, before any output to the browser:
<?php
header('Location: http://example.com/index.php');
?>
Set up an index php file which sets the header location to the url in the get parameter.
example.com/?go=http://www.new-example.com :
// example.com/index.php
<?php
if (isset($_GET['go'])) {
$go = $_GET['go'];
header('Location: ' . $go);
} // else if other commands
// else (no command) load regular page
?>
example.com/go/ksdjfksjdhfls :
// example.com/go/ksdjfksjdhfls/index.php
<?php
header('Location: http://someurl.com');
?>
I use .htaccess rules for this. No PHP needed.
i.e.
Redirect 307 /go/somewhere-else http://www.my-affiliate-link.com/
So visiting http://www.mywebsite.com/go/somewhere-else will redirect to http://www.my-affiliate-link.com/.
On my site, I use "nofollow" to tell the search engines not to follow the link. The 307 status code means "temporary redirect".
<a href="/go/somewhere-else" rel="nofollow">Click here!</a>
example.com/?go=http://www.new-example.com
You can use an iframe and set its src attribute to http://www.new-example.com:
<!DOCTYPE HTML>
<html>
<head>
</head>
<body>
<iframe src="http://www.new-example.com" width="100%" height="100%"></iframe>
</body>
</html>

HTML5 domain locking?

I've got a project where we're creating a dynamic html5 based video player with a bunch of Ajax controls and features. The intention is that this player will be used by other domains. Our old player used flash and could easily domain-lock, but now is there any method at all to do domain locking in HTML5?
Keep in mind that's its not just the video itself, we're also wanting to load html content for our ajax based controls. It seems like iframe is the obvious choice for this but then there's no way to do domain locking.
Any ideas?
You could use the function above, but it's pretty obvious what it's doing, so anyone can just remove the domain lock.
There are services out there that will lock your page to a domain name, I know of two off the top of my head.
jscrambler.com - this is a paid tool, but it might be a bit of overkill if all you want to do is lock your domain.
DomainLock JS - this is a free domain locking tool.
I came here looking for the same thing. But I think I have an answer worked out.
The best way I've found so far is to strip location.href of its http:// and then check the first few characters against a whitelist of domains. So:
if(checkAllowedDomains())
{
initApplication();
}
else
{
// it's not the right domain, so redirect them!
top.location.href="http://www.snoep.at";
}
function checkAllowedDomains()
{
    var allowed_domains = new Array();
    allowed_domains.push("www.snoep.at");
    allowed_domains.push("www.makinggames.nl");
    allowed_domains.push("www.google.com");
    // add whatever domain here!

    // Note: replace() returns a new string, so the result must be
    // assigned back - the original code dropped the return value.
    var domain = top.location.href;
    domain = domain.replace('http://', '').replace('https://', '');

    var pass = false;
    for (var i = 0; i < allowed_domains.length; i++)
    {
        var shortened_domain = domain.substr(0, allowed_domains[i].length);
        if (shortened_domain.indexOf(allowed_domains[i]) != -1)
        {
            pass = true;
        }
    }
    return pass;
}
This bit of code checks several allowed_domains, you can easily extend the array.
The problem with this code is that it's very readable. So I'd advise you to put it through a JS minimizer to make it less obvious, and to include it in EVERY js file on your page. initApplication() is a function that starts your page or application.
Because you strip http:// from the location (which may or may not be there) and then check only the first characters, up to the length of the allowed domain (including the www!), you rule out lookalike subdomains such as google.com.mydomain.com that would otherwise throw the check off.
Hope this helps.
Try reading the Referer header, and if the site isn't on your allowed list, don't display the player.
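A sketch of that check in PHP - keeping in mind that the Referer header is client-controlled and often absent, so this is a speed bump rather than real protection (the host names are examples):

```php
<?php
// Serve the player only when the embedding page's referer host is allowed.
function refererAllowed($referer, array $allowedHosts) {
    $host = parse_url($referer, PHP_URL_HOST);
    return !empty($host) && in_array(strtolower($host), $allowedHosts, true);
}

// Hypothetical usage in the player script:
// $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
// if (!refererAllowed($ref, array('www.snoep.at', 'www.makinggames.nl'))) {
//     exit; // refuse to serve the player
// }
```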

a little question about XSS vulnerabilities

I want to use a CSS style switcher on a live server. I've been asking on IRC and a guy said there is a big XSS hole. I don't know much about XSS vulnerabilities; I hope someone can help me out to secure it!
i used this tutorial for the css switcher:
http://net.tutsplus.com/tutorials/javascript-ajax/jquery-style-switcher/
and the guy said the vulnerability is on the PHP side of the script!
Any tips would be highly appreciated!
Also, if someone could re-make the whole script in a more secure way, please do - many people from net.tutsplus.com who use it would thank you!
The problem I see in this script is that a value that comes from the client, $_COOKIE['style'], is output directly into the page.
<link id="stylesheet" type="text/css" href="css/<?php echo $style ?>.css" rel="stylesheet" />
The cookie value can be hijacked by an external website under certain conditions. That cookie value is also set in style-switcher.php, based on a GET parameter, without any filtering.
What I recommend is to limit the selection to the styles you want to make available.
Example :
<?php
if(!empty($_COOKIE['style'])) $style = $_COOKIE['style'];
else $style = 'day';
// List of possible themes //
$listStyle = array('day', 'night', 'foobar');
if (!in_array($style, $listStyle)) {
$style = $listStyle[0];
}
?>
href="css/<?php echo $style ?>.css"
Every time you output a text string to HTML, you must use htmlspecialchars() on the value, otherwise out-of-band characters like <, & or in this case " will break out of the context (attribute value) and allow an attacker to insert arbitrary HTML into the page, including JavaScript content.
This is an HTML-injection hole, and you should fix it. However it is not yet directly an XSS vulnerability, because the source of $style is $_COOKIE. That cookie must have been set by your scripts, or by the user himself; unlike $_GET and $_POST values, it cannot be set on a user's browser by a third party. It only becomes a vulnerability if you allow a third party to cause the user's cookies to be set to arbitrary values.
However, style-switcher.php does exactly that:
$style = $_GET['style'];
setcookie("style", $style, time()+604800);
with no checking to see that $style is an approved value, and no checking that the request to switch styles has been made by the user himself and not an arbitrary link on a third-party site: that is, it is vulnerable to XSRF. Say an attacker included a link or img src in a page visited by the user, pointing to:
http://www.example.com/style-switcher.php?style="><script>steal(document.cookie);</script>
that script will now be run each time the user loads a page from your site. (Technically, the ", ; and </> characters in this example need to be URL-encoded with %, but I've left it raw here for readability.)
(Also, you should really be using a POST form for this, since it has a side-effect action.)
What's worse, the same script contains a direct echoed injection which most definitely is XSS-vulnerable:
if(isset($_GET['js'])) {
echo $style;
}
This hole allows arbitrary content injection. Although normally you will be expecting this script with js to be called by an XMLHttpRequest (via jQuery get()), there is nothing stopping an attacker calling it by linking a user to a URL like:
http://www.example.com/style-switcher.php?style=<script>steal(document.cookie);</script>&js=on
and in response they'll get their <script> echoed back in a page that, with no Content-Type header explicitly set, will default to text/html, causing JavaScript of the attacker's choosing to be executed in your site's security context.
header("Location: ".$_SERVER['HTTP_REFERER']);
This is not a good idea for other reasons. The Referer header is not guaranteed to make its way to your server. To ensure this works, pass the URL-to-return-to in a parameter to the script.
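A minimal sketch of that parameter-based approach, restricting the return target to a site-relative path so the redirect can't be pointed off-site (the parameter name return is an assumption):

```php
<?php
// Accept only site-relative return paths like "/page.php"; reject
// absolute URLs and protocol-relative "//host" forms.
function safeReturnPath($path) {
    if (is_string($path) && $path !== ''
            && $path[0] === '/' && substr($path, 0, 2) !== '//') {
        return $path;
    }
    return '/'; // fall back to the home page
}

// Hypothetical usage:
// $target = isset($_GET['return']) ? $_GET['return'] : '/';
// header('Location: ' . safeReturnPath($target));
```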
I probably wouldn't bother with the PHP side to this. I'd just set the relevant cookie from JavaScript and location.reload() or manipulate the stylesheet DOM to update. The PHP script is an attempt to make the switcher work without JavaScript, but unless you can be bothered to implement it properly and securely with XSRF protection, it is a liability.
If you include your alternative style sheets as <link rel="alternate stylesheet"> you'll be giving stylesheet switching support to users without JavaScript anyway.
Subject your app to a comprehensive security test to see where the holes are. Try Skipfish.

In php, can parse_url and http_build_url be used to detect malformed urls and prevent xss attacks? Is there something better?

I want to allow users of my site to post urls. These urls would then be rendered on the site in the href attributes of <a> tags. Basically, user A posts a url, my site displays it on the page as an <a> tag, then user B clicks it to see pictures of kittens.
I want to prevent javascript execution and xss attacks, and ensure there are no malformed urls in the output I generate.
Example: User A posts a malformed url, supposedly to pictures of kittens. My site tries to generate an <a> tag from user A's data, then user B clicks the resulting link. User A has actually posted a malformed url which adds a JavaScript "onclick" event in the <a> tag to send the victim's cookies to another site.
So I want to only allow correctly formed urls, and block out anything other than http/https protocols. Since I'm not allowing anything here which doesn't look like a url, and the user is not providing me html, it should be pretty simple to check by parsing and reforming the url.
My thinking is that parse_url should fail with an error on malformed urls, or it replaces illegal characters with '_'. I can check the separated parts of the url for allowed protocols as well. Then by constructing a url using http_build_url, I take the parts separated by parse_url and put them back together into a url which is known to be correctly formed. So by breaking them down this way first, I can give the user an error message when it fails instead of putting a sanitized broken url in my page.
The question is, will this prevent xss attacks from doing evil if a user clicks the link? Does the parsed and rebuilt url need further escaping? Is there a better way to do this? Shouldn't this be a solved problem by now with functions in the standard php libraries?
I really don't want to write a parser myself and I'm not going to even consider regular expressions.
Thanks!
What you need to do is just escape content properly when building your HTML. This means that when a value has a " in it, you build your HTML with &quot;
Protecting against XSS isn't primarily about validating URLs; it's about proper escaping (although you probably also want to be sure that it's an http: or https: link).
For a more detailed list of what to escape when building html strings (ie: the href attribute) see HTML, URL and Javascript Escaping
No, parse_url is not meant to be a URL validator.
You can use filter_var for this:
filter_var($someURL, FILTER_VALIDATE_URL);
So, in PHP, you would use something like:
<?php
$userlink = "http://google.com";
$newlink = htmlentities($userlink);
$link = "<a href='$newlink'>$newlink</a>";
?>
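Combining the ideas above - validate with filter_var(), restrict the scheme, then escape for the attribute context - a hedged sketch (buildLink() is an illustrative helper name, not a standard function):

```php
<?php
// Validate a user-posted URL, allow only http/https, then escape it
// before building the <a> tag. Returns null for anything rejected.
function buildLink($url) {
    if (filter_var($url, FILTER_VALIDATE_URL) === false) {
        return null;
    }
    $scheme = parse_url($url, PHP_URL_SCHEME);
    if (!in_array(strtolower((string) $scheme), array('http', 'https'), true)) {
        return null;
    }
    $safe = htmlspecialchars($url, ENT_QUOTES, 'UTF-8');
    return '<a href="' . $safe . '">' . $safe . '</a>';
}
```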
Depending on a few other things, you might just validate the URL by checking if it points to any content. Here is an example:
<?php
// URL to test
// $url = "";
$content = file_get_contents($url);
if(!empty($content)){
echo "Success:<br /><iframe src=\"$url\" style=\"height:400px; width:400px; margin:0px auto;\"></iframe>";
}else{
echo "Failed: Nothing exists at this url.";
}
?>
cURL is another option. With cURL you can fetch just the HTTP headers and then check the status code returned, e.g. 404 = page not found, 200 = OK, 201 = Created, 202 = Accepted, etc.
Good luck!
~John
http://iluvjohn.com/
