Sanitize file path in PHP

I'm hoping to make my tiny program secure so that potential malicious users cannot view sensitive files on the server.
$path = "/home/gsmcms/public_html/central/app/webroot/{$_GET['file']}";
if(file_exists($path)) {
echo file_get_contents($path);
} else {
header('HTTP/1.1 404 Not Found');
}
Off the top of my head I know that input such as '../../../../../../etc/passwd' would be trouble, but I'm wondering what other malicious inputs I should expect and how to prevent them.

realpath() will let you convert any path that may contain relative components into an absolute path. You can then ensure that the resolved path is under the specific subdirectory you want to allow downloads from.

Use basename rather than trying to anticipate all the insecure paths a user could provide.
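For illustration, here is a minimal sketch of the basename() approach (the webroot path is taken from the question; the exact error handling is an assumption):

<?php
// basename() discards every directory component, so an input such as
// "../../etc/passwd" collapses to just "passwd" and can never leave
// the base directory.
$baseDir  = "/home/gsmcms/public_html/central/app/webroot/";
$fileName = basename($_GET['file'] ?? '');
$path     = $baseDir . $fileName;

if ($fileName !== '' && file_exists($path)) {
    echo file_get_contents($path);
} else {
    header('HTTP/1.1 404 Not Found');
}

Note that this also prevents legitimate requests for files in subdirectories of the webroot, which may or may not be acceptable for your use case.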

Solution by the OP:
$baseDir = "/home/gsmcms/public_html/central/app/webroot/";
$path = realpath($baseDir . $_GET['file']);
// if baseDir isn't at the front 0==strpos, most likely hacking attempt
if(strpos($path, $baseDir) !== 0) {
die('Invalid Path');
} elseif(file_exists($path)) {
echo file_get_contents($path);
} else {
header('HTTP/1.1 404 Not Found');
echo "The requested file could not be found";
}

If you can, use a whitelist such as an array of allowed files and check the input against it: if the file the user asked for isn't present in that list, deny the request.
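A minimal sketch of that idea, reusing the question's 'file' parameter (the whitelist entries are made up for illustration):

<?php
$baseDir = "/home/gsmcms/public_html/central/app/webroot/";

// Hypothetical whitelist: only these exact names may ever be served.
$allowedFiles = array('report.csv', 'manual.pdf', 'changelog.txt');

$file = $_GET['file'] ?? '';

if (in_array($file, $allowedFiles, true)) {
    echo file_get_contents($baseDir . $file);
} else {
    header('HTTP/1.1 404 Not Found');
}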

There is an additional and significant security risk here. This script will inject the contents of a file into the output stream without any server-side processing, which means the source code of any accessible file will be leaked to the internet.
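One way to limit that exposure (sketched here as an assumption, not something taken from the answer above) is to only serve file extensions that are harmless to show verbatim:

<?php
// Hypothetical allowlist of extensions that are safe to echo as-is;
// .php and other source files are never sent back to the client.
$allowedExtensions = array('txt', 'csv', 'pdf', 'png', 'jpg');

$ext = strtolower(pathinfo($_GET['file'] ?? '', PATHINFO_EXTENSION));

if (!in_array($ext, $allowedExtensions, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}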

Even if you are using realpath, you should still reject any input containing ".." before using it. Otherwise an attacker can map your server's entire directory structure by brute force, e.g. "valid_folder/../../test_if_this_folder_name_exists/valid_folder": if the application accepts this path, the attacker knows that the folder exists.
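A sketch of that extra check, rejecting the request before realpath() ever runs (the 404 response is just one possible way to fail):

<?php
$file = $_GET['file'] ?? '';

// Refuse any input containing "..", even if realpath() would later resolve
// it back inside the base directory. This stops attackers from probing
// which directory names exist elsewhere on the server.
if (strpos($file, '..') !== false) {
    header('HTTP/1.1 404 Not Found');
    exit;
}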

Another approach:
$path = "/app/webroot/{$_GET['file']}";
$realTarget = realpath($path);
if( strtolower($path) !== strtolower($realTarget) ) {
// invalid path!
}
// life goes on

I think this is the best answer for PHP7.
This will only allow people to see files they have the absolute path to.
It won't let people fish for valid filenames outside the specified path, because all failure conditions report the same 404.
$base_dir = $temp_path;
$path = "";

if (isset($_GET['filename'])) {
    $path = realpath($base_dir . $_GET['filename']);

    // realpath() returns false if the file doesn't exist
    if (!$path ||
        // don't look outside the base path
        substr($path, 0, strlen($base_dir)) != $base_dir) {
        header('HTTP/1.1 404 Not Found');
        echo "The requested file could not be found";
        die;
    }
}

To strip all "/.", "/..", "\." and "\.." segments and convert everything to forward slashes (which every environment accepts), you can normalise the input before handing it to realpath(). This should provide a fairly safe filter for path input. In your code you should still compare the result against any parent directories you do not want to expose, just in case.
$path = realpath(
    implode('/', array_map(
        function ($value) { return trim($value, '.'); },
        explode('/', str_replace('\\', '/', $path))
    ))
);

Related

Checking permission for a path [duplicate]

I have a base path /whatever/foo/
and $_GET['path'] should be relative to it.
However, how do I accomplish this (reading the directory) without allowing directory traversal?
E.g. a regex like /\.\.|\.\./ will not filter properly.
Well, one option would be to compare the real paths:
$basepath = '/foo/bar/baz/';
$realBase = realpath($basepath);

$userpath = $basepath . $_GET['path'];
$realUserPath = realpath($userpath);

if ($realUserPath === false || strpos($realUserPath, $realBase) !== 0) {
    //Directory Traversal!
} else {
    //Good path!
}
Basically, realpath() will resolve the provided path to an actual, physical path (resolving symlinks, .., ., /, //, etc.). So if the real user path does not start with the real base path, it is trying to do a traversal. Note that the output of realpath will not contain any "virtual directories" such as . or ..
ircmaxell's answer wasn't fully correct. I've seen that solution in several snippets but it has a bug which is related to the output of realpath(). The realpath() function removes the trailing directory separator, so imagine two contiguous directories such as:
/foo/bar/baz/
/foo/bar/baz_baz/
As realpath() would remove the last directory separator, your method would return "good path" if $_GET['path'] was equal to "../baz_baz" as it would be something like
strpos("/foo/bar/baz_baz", "/foo/bar/baz")
Maybe:
$basepath = '/foo/bar/baz/';
$realBase = realpath($basepath);

$userpath = $basepath . $_GET['path'];
$realUserPath = realpath($userpath);

if ($realUserPath === false ||
    ($realUserPath !== $realBase &&
     strpos($realUserPath, $realBase . DIRECTORY_SEPARATOR) !== 0)) {
    //Directory Traversal!
} else {
    //Good path!
}
It is not sufficient to check for patterns like ../ or the like. Take "../" for instance, which URI-encodes to "%2e%2e%2f". If your pattern check happens before a decode, you would miss this traversal attempt. There are other tricks hackers can use to circumvent a pattern checker, especially with encoded strings.
I've had the most success stopping these by canonicalizing any path string to its absolute path using something like realpath(), as ircmaxell suggests. Only then do I check for traversal attacks by matching the result against a base path I've predefined.
You may be tempted to try and use regex to remove all ../s but there are some nice functions built into PHP that will do a much better job:
$page = basename(realpath($_GET['path']));
basename - strips out all directory information from the path, e.g. ../pages/about.php would become about.php
realpath - returns the full path to the file, e.g. about.php would become /home/www/pages/about.php, but only if the file exists.
Combined, they return just the file's name, and only if the file exists.
When looking into the creation of new files or folders, I figured I can use a two-stage approach:
First, check for traversal attempts using a custom implementation of a realpath()-like function that works for arbitrary paths, not just existing files. There's a good starting point here. Extend it with urldecode() and whatever else you think may be worth checking.
This crude method filters out some traversal attempts, but you may still miss some hackish combination of special characters, symlinks, escape sequences, etc. However, since you know for sure the target file does not exist (check using file_exists), no one can overwrite anything. The worst-case scenario is that someone gets your code to create a file or folder somewhere, which may be an acceptable risk in most cases, provided your code does not let them write into that file/folder straight away.
Finally, the path now points to an existing location, so you can do the proper check using the methods suggested above, utilising realpath(). If it turns out at this point that a traversal has happened, you are still more or less safe, as long as you make sure to prevent any attempt to write into the target path. You can also delete the target file/dir right away and treat it as a traversal attempt.
I'm not saying it cannot be hacked, since it may still allow illegitimate changes to the filesystem, but it is still better than only doing custom checks that cannot utilise realpath(). The window for abuse left open by creating a temporary, empty file or folder somewhere is smaller than the one left by letting it become permanent and even writable, as would happen with only a custom check that may miss some edge cases.
Also, correct me if I'm wrong, please!
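A rough sketch of that two-stage idea, under the assumption that the target file is created empty first and removed again if the realpath() check fails (the function and variable names are invented):

<?php
function createFileSafely($baseDir, $relativePath)
{
    $realBase = realpath($baseDir);

    // Stage 1: crude lexical check on the not-yet-existing path.
    $decoded = urldecode($relativePath);
    if (strpos($decoded, '..') !== false || strpos($decoded, "\0") !== false) {
        return false;
    }

    $target = $baseDir . '/' . $relativePath;
    if (file_exists($target)) {
        return false; // never overwrite anything
    }

    // Create the file empty so that realpath() can resolve it.
    if (@touch($target) === false) {
        return false;
    }

    // Stage 2: now that it exists, do the proper realpath() check.
    $realTarget = realpath($target);
    if ($realTarget === false ||
        strpos($realTarget, $realBase . DIRECTORY_SEPARATOR) !== 0) {
        @unlink($realTarget !== false ? $realTarget : $target); // traversal: undo the creation
        return false;
    }

    return $realTarget; // safe to write into
}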
I have written a function to check for traversal:
function isTraversal($basePath, $fileName)
{
    if (strpos(urldecode($fileName), '..') !== false)
        return true;

    $realBase = realpath($basePath);
    $userPath = $basePath.$fileName;
    $realUserPath = realpath($userPath);

    while ($realUserPath === false)
    {
        $userPath = dirname($userPath);
        $realUserPath = realpath($userPath);
    }

    return strpos($realUserPath, $realBase) !== 0;
}
This line alone, if (strpos(urldecode($fileName), '..') !== false), should be enough to prevent traversal; however, there are many different ways hackers can traverse directories, so it's better to also make sure the resolved user path starts with the real base path.
Just checking that the resolved path starts with the real base path is not enough on its own either, because a hacker could traverse back into the current directory and discover the directory structure.
The while loop allows the code to work when $fileName does not exist.
1. Put an empty index.htm in each directory to block directory listings (the -Indexes option).
2. Filter the query string at the start of the script:
// Path Traversal Attack
if (strpos($_SERVER["QUERY_STRING"], "../") !== false) {
    exit("P.T.A. B-(");
}


PHP Include based on REQUEST_URI

Are there any security risks involved in using $_SERVER['REQUEST_URI'] to include a file? Can you pass ../../.. through the request URI somehow?
What I'm thinking is something like this:
<?php
$path = $_SERVER['REQUEST_URI'];
$path = preg_replace('~\\.html?$~', '.php', $path);
include $path;
?>
This should substitute ".htm" or ".html" URIs for ".php" paths and render them. But I am concerned about security here.
$_SERVER['REQUEST_URI'] contains the requested URI path and query string as it appeared in the HTTP request line. So when http://example.com/foo/bar?baz=quux is requested and the server passes the request to the script file /request_handler.php, $_SERVER['REQUEST_URI'] would still be /foo/bar?baz=quux. I’ve used mod_rewrite to map any request to request_handler.php as follows:
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ /request_handler.php
So for correctness, before using $_SERVER['REQUEST_URI'] in a file system path, you would need to get rid of the query string. You can use parse_url to do so:
$_SERVER['REQUEST_URI_PATH'] = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
But as this value comes directly from the HTTP request line without prior path resolution, it may still contain symbolic path segments like . and ..
However, path traversal is not even necessary as the requested URI path is already an absolute path reference and requesting http://example.com/etc/passwd should result in the inclusion of /etc/passwd.
So this is actually a local file inclusion vulnerability.
Now to fix this, requiring a certain root directory using the method that you, chowey, have presented is a good improvement. But you would actually need to prefix it with $basedir:
$path = realpath($basedir . $_SERVER['REQUEST_URI_PATH']);

if ($path && strpos($path, $basedir) === 0) {
    include $path;
}
This solution provides two guarantees:
$path is either a valid, resolved path to an existing file, or false;
the file is only included if $basedir is a prefix of $path.
However, this may still allow access to files which are protected using some other kind of access control like the one provided by Apache’s mod_authz_host module.
This does not actually answer the question...
Note that you can ensure the request uri points to an actual valid filepath in the current working directory. You can use the realpath function to normalize the path.
The following code would do the trick:
<?php
$basedir = getcwd();

$path = $_SERVER['REQUEST_URI'];
$path = preg_replace('~\\.html?$~', '.php', $path);
$path = realpath($path);

if ($path && strpos($path, $basedir) === 0) {
    include $path;
} else {
    return false;
}
?>
Here I used strpos to verify that $path starts with $basedir. Since realpath will have removed any ../../.. funny business, this should safely keep you within the $basedir directory.
Indeed, don't trust $_SERVER['REQUEST_URI'] before checking the base path.
And don't make a filter that removes ../ from the path; attackers can craft a new way to inject if they understand the filter process.
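To illustrate why strip-style filters are fragile (my own example, not taken from the answer): a single-pass removal of "../" can be defeated simply by nesting the sequence inside itself:

<?php
// Hypothetical naive filter that just deletes every "../" it finds.
$input = '....//etc/passwd';
$filtered = str_replace('../', '', $input);

// Removing the "../" in the middle of "....//" leaves a new "../" behind,
// so the traversal sequence survives the filter.
echo $filtered; // prints "../etc/passwd"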
I had the same question and chose to do the following, basing myself on Gumbo's answer:
$_SERVER['REQUEST_URI_PATH'] = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$path = realpath(FOLDER_ROOT . $_SERVER['REQUEST_URI_PATH']);

$directory_white_list_array = array('/safe_folder1', '/safe_folder1/safe_folder2');

if ($path && strpos($path, FOLDER_ROOT) === 0
    && in_array(dirname($path), $directory_white_list_array)
    && 'php' == pathinfo($path, PATHINFO_EXTENSION)) {
    include $path;
} else {
    require_once FOLDER_ROOT . "/miscellaneous_functions/navigation_error.php";
    navigation_error('1');
}
Summary:
Added directory whitelist and .php extension restriction.


How do I make sure a file path is within a given subdirectory?

I want to make sure a file path set via query string does not go outside of the desired subdirectory. Right now, I am checking that:
The path does not start with "/", to prevent the user from giving an absolute path.
The path does not contain "..", to prevent the user from giving a path that is outside of the desired subdirectory.
The path does not contain ":", to prevent the use of a URL (i.e. "http://", "ftp://", etc.). Should I ever run this script on a Windows server (not likely), this will also prevent absolute paths beginning with a drive specifier (i.e. "C:\"). Note: I'm aware that a colon is a valid character in Unix filenames, but I will never be using it in a filename.
The path does not start with "\". Just in case I change my mind about running on a Windows server, this prevents Windows network paths from being specified (i.e. "\\someserver\someshare"). Again, I'm aware that a backslash is a valid Unix filename character, but I also won't be using it in any filenames.
Are these checks sufficient?
Background
I have a PHP script that takes (via query string) the path to a sample source file to be shown to a user. So I might give them a link like "view_sample.php?path=accounting_app/report_view.php" or "view_sample.php?path=ajax_demo/get_info.js".
The script looks basically like this:
$path = $_GET['path'];

if (path_is_valid($path) && is_file("sample/$path"))
{
    header('Content-Type: text/plain');
    readfile("sample/$path");
}
My concern is that a malicious user would see the url and try to do something like "view_sample.php?path=../../database/connection_info.php" and gain access to a file which is not in the "sample" directory.
Are the four checks I defined above (which would be implemented in the path_is_valid() function) sufficient to lock out a malicious user? (Also, I think checks 1, 3, and 4 are basically irrelevant since I am prepending a relative path, but if I didn't do this, would the checks be sufficient?)
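For reference, a minimal path_is_valid() implementing the four checks listed above might look like the following sketch (the function body is my own interpretation, not code from the question):

<?php
// Hypothetical implementation of the question's four checks.
function path_is_valid($path)
{
    return $path !== ''
        && $path[0] !== '/'              // 1. no absolute Unix path
        && strpos($path, '..') === false // 2. no parent-directory traversal
        && strpos($path, ':') === false  // 3. no URL scheme or Windows drive
        && $path[0] !== '\\';            // 4. no Windows/UNC network path
}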
Call
$path = realpath("sample/$path");
Then check that the resulting path starts with the directory you're expecting.
<?php
// Current path information
$path = $_GET['path'];
$vroot = "sample";

// Validate that $path is a subfolder of $vroot
$vroot = realpath($vroot);

if (substr(realpath($path), 0, strlen($vroot)) != $vroot or !is_dir($path)) {
    exit("Invalid path");
} else {
    echo "Ah, everything is alright!";
}
?>
realpath() should not change a path that is already clean, so I use it in the following way:
function checkPath($pathToCheck) {
    global $basepath;
    $fullpath = $basepath.'/'.$pathToCheck;

    if ($fullpath == realpath($fullpath) && is_dir($fullpath)) {
        return $fullpath;
    } else {
        error_die('path not allowed: '.htmlentities($pathToCheck));
    }
}
