defending against .html and .php file name guessing - php

If certain .html files can only be accessed after a password match (implemented in PHP) against a hash stored in a database, a user can still guess likely .html file names and see a supposedly privileged page. Viewing the source of that page, the user can then see the name of a .php file it invokes, which might lead to guessing the likely POST arguments.
What is the best practice for discouraging this kind of guessing of both the .html and .php file names?
The .htaccess file already has "options -indexes" to prevent listing directories.
Edit: ummm, instead of upvoting comments saying that it's a bad implementation, why not upvote one of the suggested answers or write a new one? It's obvious that it's a bad implementation; that's why this question was posted.

If you only ever include these pages inside other pages, deny access to them in .htaccess.
If you want them to be accessible, but only to authorized users, password protect them or provide other authentication.
Preventing people from guessing the name of a page is "security through obscurity", which should never be relied upon. Set your system up with the assumption that everything is visible, and work your security out from that.
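For the include-only case, a .htaccess fragment along these lines (the file-name pattern is just an example, in the same Apache 2.2 directive style used elsewhere on this page) would deny direct requests:

```apache
<FilesMatch "\.inc\.php$">
    Order allow,deny
    Deny from all
</FilesMatch>
```

Files matching the pattern can still be include()d by PHP, but a browser requesting them directly gets a 403.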

The only way to defend against filename guessing while still being able to serve those pages to some users is to rely on user accounts, logins and authentication.
Other than that, you can set .htaccess to deny all IPs, with some exceptions.

To do that you need to use some kind of router.
I highly recommend the Slim framework, but you can easily develop your own.
First, restrict access to all the .html and .php files in .htaccess except for the main index.php (or whatever you want to call it).
Then in index.php you check whether the user is allowed to see the file. If yes, you include the file in the output; if not, you show a 404.
In Slim you can easily use ready-made middlewares for authentication.
http://docs.slimframework.com/ to learn more about routing.
http://docs.slimframework.com/#Middleware-Overview to learn about middleware
https://github.com/tuupola/slim-basic-auth/ for simple authentication and examples.
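A minimal sketch of that front-controller idea in plain PHP, without Slim (the route map, directory layout and session check are illustrative assumptions):

```php
<?php
// index.php - the only script reachable from the web (a front controller).
function resolveRoute(array $routes, string $path, bool $loggedIn): ?string
{
    // Unknown path and unauthorized user both get null, so a guesser
    // can't tell a protected page apart from a missing one.
    if (!$loggedIn || !isset($routes[$path])) {
        return null;
    }
    return $routes[$path];
}

// Pages live outside the web root, so they can't be fetched directly.
$routes = [
    '/profile'  => __DIR__ . '/../pages/profile.php',
    '/settings' => __DIR__ . '/../pages/settings.php',
];

if (PHP_SAPI !== 'cli') {          // dispatch only for real web requests
    session_start();
    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
    $file = resolveRoute($routes, $path, isset($_SESSION['user_id']));
    if ($file === null) {
        http_response_code(404);
        include __DIR__ . '/404.php';
    } else {
        include $file;
    }
}
```

Slim's router and middleware do the same job with less of your own code; this just shows the shape of the check.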

Reading the comments and suggested answers has me thinking that a good solution is to have every file that needs security query the database to determine whether the "authenticated" state still applies at that point in time.
Implement the "per request check" as described here wherever there is a vulnerability.
If STT LCU will convert his comment into a posted answer, I will delete this answer.
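One hedged sketch of such a per-request check (the sessions table and its columns are invented for illustration; adapt them to your schema, with expires_at stored as a Unix timestamp):

```php
<?php
// auth_check.php - require_once this at the top of every protected page.
// Re-validates the session against the database on every request, so a
// revoked account loses access immediately, not when the session expires.
function sessionIsValid(PDO $pdo, ?string $token, int $now): bool
{
    if ($token === null || $token === '') {
        return false;
    }
    $stmt = $pdo->prepare(
        'SELECT expires_at, revoked FROM sessions WHERE token = ?'
    );
    $stmt->execute([$token]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    // Valid only if the row exists, isn't revoked, and hasn't expired.
    return $row !== false
        && (int) $row['revoked'] === 0
        && (int) $row['expires_at'] > $now;
}
```

A protected page would then start with something like: if (!sessionIsValid($pdo, $_COOKIE['token'] ?? null, time())) { http_response_code(403); exit; }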

Related

Set up PDO connection without password

I'm trying to connect to my database, but I changed my database's root password in the interest of security. However, in order to connect to the database and use PDO, I apparently have to pass my password in the php, which obviously is not good for security:
$hsdbc = new PDO('mysql:dbname=hs database;host=127.0.0.1;charset=utf8', 'root','passwordgoeshere');
$hsdbc->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
$hsdbc->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
Am I being stupid, and because it's PHP no one but someone who views the actual file will be able to see the password? Or is there some way to do it without putting the password in the file?
Generally speaking it's not bad practice to have connection strings in files that are not user facing. If you don't want to have your personal password in the php file, then you can create a new mysql user for php.
You can also restrict the user's IP address in MySQL to the server hosting your php scripts. This way if a nefarious person browsing the web somehow was able to see the database password, they would have more difficulty accessing it.
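As a sketch of that restriction (the host address, database and user names are made up), the MySQL user could be created like this:

```sql
-- Hypothetical PHP-only account, usable solely from the web server's IP:
CREATE USER 'phpapp'@'192.0.2.10' IDENTIFIED BY 'use-a-long-random-password';
GRANT SELECT, INSERT, UPDATE ON mydb.* TO 'phpapp'@'192.0.2.10';
```

Even with the password leaked, this account only works for connections coming from that one host.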
People are not able to just go and read your files. They should be safe where you host them; someone can only get into the files if they can get into the place where you host your stuff, which should not be possible without credentials that only you know.
This applies not just to PDO but also to mysql and mysqli.
Going to extend SupSon (SC2 Select fan?)'s answer:
PHP itself is a server-side language.
There are only 3 ways (maybe more, if someone wants to add to them) that the code can be shown to an outside user:
An insecure .htaccess or server configuration that serves the .php file as plain text (you should move servers at that point, because normally this doesn't happen)
The application somehow running in debug mode, where something in your page triggers it and a whole bunch of PHP code gets shown
FTP/SSH access to your .php file (then you have more than a PDO problem on your hands)
So unless one of these cases is happening, coding your username/password into a .php file won't be a breach of security.
I have seen websites that expose PHP code, when the Apache type handler for PHP becomes unconfigured by accident. Then the code in .php files is displayed instead of executed. There's also an Apache type handler to display PHP source deliberately, though this is not usually configured.
To avoid this vulnerability, it's a good practice to put your sensitive PHP code outside your htdocs directory. Instead, put in your htdocs directory a minimal PHP script that loads the rest of the code using include() or require().
An alternative is to put your MySQL credentials in a config file rather than in PHP code at all. For example, the file format used by /etc/my.cnf and $HOME/.my.cnf is readable by the PHP function parse_ini_file(). It's easy to store your MySQL password outside of your code this way.
For example, read user and password from the [mysql] or [client] sections of /etc/my.cnf:
$connect_opts = [];
$ini = parse_ini_file("/etc/my.cnf", true);
if (array_key_exists("mysql", $ini)) {
    $connect_opts = array_merge($connect_opts, $ini["mysql"]);
} elseif (array_key_exists("client", $ini)) {
    $connect_opts = array_merge($connect_opts, $ini["client"]);
}
$pdo = new PDO($dsn, $connect_opts["user"], $connect_opts["password"]);
Yes, it seems insecure at first, but once you get the hang of it and know how to manage your files, you can minimize the risks of having passwords stored in plain text in potentially publicly exposed places. (AFAIK, PDO doesn't even let you open a connection without supplying a password.) The solution is a combination of what everyone has said, and then some. Here's my quick guide for what I do.
Separate SQL users for separate purposes (minimizes damage from SQL injection or hacked accounts):
There should be a PHP-specific user for each table you need to access. That user is granted only enough rights to handle as much of the table as it needs: if it doesn't need to delete, don't grant it DELETE; if it doesn't need to select, don't grant it SELECT. It seems fussy, but very quickly you'll have a copy-paste template to create the users, give them the rights they need, and document it. If there's a joined table, you'll naturally want to grant the user access to that table as well.
-- a single user account for a specific purpose:
CREATE USER 'usermanager'@'localhost' IDENTIFIED BY '5765FDJk545j4kl3';
-- You might not want to give access to all three here:
GRANT SELECT, UPDATE, INSERT ON db.users to 'usermanager'@'localhost';
The purpose of this is so that if you have a bug in your code that lets people SQL inject, they won't be able to cause any harm beyond the scope of what that role can do.
Stupid mistakes can reveal PHP code and files if they are left in-directory, so move them out:
Never mind revealing the source code, even just trying to access php files "out of order" can be destructive.
Move as many files to an out-of-scope directory as possible. Then call them like so:
require_once('../lib/sql_connectors.php');
This should escape your html / webdir and you should hopefully be able to store all sorts of fun stuff outside the scope of what a stupid admin mistake could reveal.
You can even have a PHP file that fetches pictures and videos from outside your web dir; that's how streaming sites protect their resources and also perform PHP-based authentication for file access. To learn how to do that, look up assigning your own ETag headers to make sure browsers cache your PHP-retrieved files, otherwise you'll have a very busy server; here's a short introduction.
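A rough sketch of such a loader with ETag support (the storage path, session key and content type are assumptions, and a real streaming setup would also want range-request handling):

```php
<?php
// serve.php?f=video1.mp4 - streams a file stored outside the web root.
// Sends an ETag so browsers can revalidate instead of re-downloading.
function etagFor(string $path): string
{
    // Weak validator built from size + mtime; a choice, not a standard.
    return '"' . md5(filesize($path) . '-' . filemtime($path)) . '"';
}

if (PHP_SAPI !== 'cli') {
    session_start();
    if (!isset($_SESSION['user_id'])) {
        http_response_code(403);
        exit;
    }
    // basename() strips any ../ traversal attempt from the parameter.
    $file = '/srv/private-media/' . basename($_GET['f'] ?? '');
    if (!is_file($file)) {
        http_response_code(404);
        exit;
    }
    $etag = etagFor($file);
    header('ETag: ' . $etag);
    if (($_SERVER['HTTP_IF_NONE_MATCH'] ?? '') === $etag) {
        http_response_code(304);   // browser's cached copy is still good
        exit;
    }
    header('Content-Type: video/mp4');
    header('Content-Length: ' . filesize($file));
    readfile($file);
}
```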
Block wrongful access to PHP that are left in-directory:
All of your in-directory PHP files can be protected by checking that $_SERVER['REQUEST_URI'] doesn't point at the file itself. If it does, you can have a function called show404() that loads the 404.php page and dies there, or just directly include your 404.php. That way, even if hackers try to brute-force your PHP files they'll never see them: they'll get 404 errors (fools the bots) and see the 404 page (fools the humans).
I avoid using .php in any publicly visible paths, to do that, I make rewrite rules in my .htaccess files that look like this:
RewriteEngine On
RewriteRule ^login$ login.php [L,QSA]
The L makes it stop running other rules.
The QSA preserves the query string (the $_GET parameters).
The first lines of code for every file (consider prepending) could be:
// they should be connecting via a redirect, not directly:
$fileName = basename(__FILE__);
if ($_SERVER['REQUEST_URI'] === '/' . $fileName) {
    error_log('Security Warning: [' . $_SERVER['REMOTE_ADDR'] . '] might be trying to scrape for PHP code. URI: [' . $_SERVER['REQUEST_URI'] . ']');
    include('404.php'); // should point to your 404 ErrorDocument
    exit();
}
// redirect to actual file
include('../hidden/php/' . $fileName);
In this example, assuming you have the redirect in your .htaccess, a login.php with the code above, and a login.php in your hidden directory, the user would experience the following two scenarios: attempt to connect to '/login' and see the hidden '/login.php' page; attempt to connect to the visible '/login.php' directly and get a 404 error.
Those are the 3 big things: lots of small, limited accounts to minimize damage in case of a security failure; keeping all possible files outside the web directory; and making all in-directory PHP files produce an error, letting only non-PHP links access them.

make folder secure in php

I am interested to know what is considered more secure out of
include $_SERVER['DOCUMENT_ROOT'] . '/config/special.php';
and
include '../config/special.php';
I see a lot of answers on Stack Overflow and elsewhere on the web that say to store data outside the document root. I have also seen a lot of tutorials that say to include a file in php do something like
include $_SERVER['DOCUMENT_ROOT'] . '/config/special.php';
Would it be OK to do the above if you change the permissions on the folder to something like drwxr-x-wx, so you get a Forbidden error when trying to access the path in the URL (or is this still considered a risk)? Or would it be better to move my config above my root folder, like below?
include '../config/special.php';
The answer is a question: how secure is the rest of your server?
If you forbid access to the directory from Apache remote calls, you've solved half of the issue regardless of where you store your files. However, this isn't the silver bullet of security. If it was, people would know about it! (I proved this where I work a month ago - one of the lead devs thought that storing stuff in a non-webroot subfolder would make everything secure. Little did he know about the file explorer that he built, and more importantly, how it could take ../ as a parameter and work accordingly... 30s later, database got dumped - on a dev environment).
The key to security is to harden everything. If you allow users to read files, make sure to ban the .. and ... operators, and anything related to it, to prevent them from traversing up the directory chain. Prevent outside users from reading the file (.htaccess with Apache), and that should deter most people.
Depending on how you sanitize the up-one and up-two chains, however, you might end up with more issues. Take the example of my co-worker again: after the dump, he decided to apply these replacements:
.. => .
... => .
(Order is important)
Little did he know that I would put in four dots: the first replacement turns .... into .., and the second does nothing. Bingo, the database got dumped. Again.
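Rather than blacklisting .. sequences, which the story above shows is easy to get wrong, a whitelist check built on realpath() is safer. A sketch, with the base directory as an assumed parameter:

```php
<?php
// Returns the resolved path if $requested stays inside $baseDir, else null.
// realpath() collapses ../ and symlinks BEFORE we compare prefixes, so
// tricks like "....//" either fail to resolve or fail the prefix check.
function safePath(string $baseDir, string $requested): ?string
{
    $base = realpath($baseDir);
    $real = realpath($baseDir . '/' . $requested);
    if ($base === false || $real === false) {
        return null;              // base or target doesn't exist
    }
    // The resolved path must sit at or under the base directory.
    if ($real !== $base && strpos($real, $base . DIRECTORY_SEPARATOR) !== 0) {
        return null;              // escaped the base via ../ or a symlink
    }
    return $real;
}
```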
On a broader, more succinct note, if you are looking for a straightforward answer to your PHP question: neither of the above is more or less secure. Simply including a file is a normal and warranted action in any PHP application, and security would not be affected by either of the above methods.
If you want to read more about security I suggest reading more about how to ensure your server / application is secure.
Perhaps this could be a nice read.
http://www.ibm.com/developerworks/opensource/library/os-php-secure-apps/index.html

have different static url in dynamic page

I have a website where each person has his personal profile. I would like to have static URLs like mywebsite/user1, mywebsite/user2, but actually I would stay on the same page and change the content dynamically. One reason is that when I open the site I request some data from a database, and I don't want to request it each time I change page.
I don't like url like mywebsite?user=1
Is there a solution?
Thank you
[EDIT: better explanation]
I have a dynamic page that shows the user profile of my website. So the URL is something like http://mywebsite.me?user=2
but i would like to have a static link, like
http://mywebsite.me/user2name
Why do I want this? Because it's easy to remember and write, and because I can change the content of the page dynamically without asking the database for data each time (I need some shared info in all the pages; the info is the same for all pages).
Yes there are solutions to your problem!
The first solution is server-dependent. I am a little unsure how this works on an IIS server, but it's quite simple with Apache. Apache can take directives from a file called .htaccess. The .htaccess file needs to be in the same folder as your active script to work. It also needs the directive AllowOverride All and the module mod_rewrite loaded in the main server configuration. If you have all this set up, edit your .htaccess file to contain the following:
RewriteEngine on
RewriteRule ^mywebsite/([^/\.]+)/?$ index.php?user=$1 [L]
This will allow you to access mywebsite/index.php?user=12 with mywebsite/12.
A beginner guide to mod_rewrite.
You could also fake this with only PHP. It will not be as pretty as the previous example, but it is doable. Also, take into consideration that you are working with user input, so the data is to be considered tainted. The user needs to access the script via mywebsite/index.php/user/12.
<?php
// URI like /index.php/user/12
$request = explode('/', $_SERVER['REQUEST_URI']);
// $request[1] holds the name of the .php file, since the URI starts with '/'
$user[$request[2]] = $request[3];
/* Do stuff with $user['user'] */
?>
These are the quickest ways I know to achieve what you want.
First off, please familiarise yourself with the solution I have presented here: http://codeumbra.eu/how-to-make-a-blazing-fast-ajax-call-to-a-zend-framework-application
This does exactly what you propose: eliminates all the unnecessary database queries and executes only the one that's currently needed (in your case: fetch user data). If your application doesn't use Zend Framework, the principle remains the same regardless - you'll just have to open the database connection the way that is required by your application. Or just use PDO or whatever you're comfortable with.
Essentially, the method assumes you make an AJAX call to the site to fetch the data you want. It's easy in jQuery (example provided in the article mentioned above). You can replace the previous user's data with the requested one's using JavaScript as well on success (I hope you're familiar with AJAX; if not, please leave a comment and I will explain in more detail).
[EDIT]
Since you've explained in your edit that what you mean is URI rewriting, I can suggest implementing a simple URI router. The basics of how it works are described here: http://mingos.eu/2012/09/the-basics-of-uri-routing. You can make your router as complex or as simple as your application needs.
The URL does not dictate whether or not you make a database call. Those are two separate issues. You typically set up your server so example.com/username is rewritten internally to example.com/user.php?id=username. You're still running PHP, the URL is just masking it. That's called pretty URLs, realized by URL rewriting.
If you want to avoid calling the database, cache your data. E.g. in the above user.php script, you generate a complete HTML page, then write it into a cache folder somewhere, then next time instead of generating the page again the script just outputs the contents of the already created page. Or you just cache the database data somewhere, but still generate the HTML anew every time.
You could write an actual HTML file to /username, so the web server will serve it directly without even bothering PHP. That's not typically what you want though, since it's hard to update/expire those files and you also typically want some dynamic content on there.
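A small sketch of that caching idea (the cache location and TTL are arbitrary choices, and $render stands in for whatever generates the page):

```php
<?php
// Serve a cached copy of a generated page if it's fresh enough,
// otherwise rebuild it and store the result for the next request.
function cachedPage(string $cacheFile, int $ttlSeconds, callable $render): string
{
    if (is_file($cacheFile) && time() - filemtime($cacheFile) < $ttlSeconds) {
        return file_get_contents($cacheFile);   // cache hit
    }
    $html = $render();                          // cache miss: regenerate
    file_put_contents($cacheFile, $html, LOCK_EX);
    return $html;
}
```

A profile page might then do echo cachedPage('/var/cache/pages/user_42.html', 300, fn () => renderProfile(42)); where renderProfile() is a hypothetical page generator.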
Select all rows from your database.
Then create a file containing the script's contents (index.php?user=...) for each one, setting the file name to the user_id/user_name you got from the SELECT statement.
This will create a page for each user in the present folder.
To avoid having to recreate 'static' pages, you could add a new column named, say, 'indexedyet' and change it to 1 on creating a file; you then select only rows where it is 0. You could perform this via a cron job once a day or so.
This leaves you vulnerable to user data changes, though, as the pages won't automatically update. A tactic to use here is to update the static page on any edit.
Another, probably better (sorry, not had enough coffee yet) idea would be to create a folder on a user's registration. Make an index.php page tailored to them on registration, and then anything like www.mysite.com/myuser will show their tailored version. Again, update the page on user updates.
I would be happy to provide examples depending on your approach.

Is there a way to pass variables except sessions and get variables?

My problem is not so easy to describe ... for me :-) so please be lenient towards me.
I have several ways to view a list, which means there are several possibilities for how to arrive at and create the view which displays my list. This works well with browser tabs opened in parallel, and that is desired.
If I click on an item of my list I come to a detail view of that item.
In that view I want to know from which type of list the link was "called". The first problem is that the referrer will always be the same, and the second: I should not append a GET variable to the URL (and it should not be a submitted form either).
If I store it in the session, I will overwrite the session param when working in a parallel tab as well.
What is the best way to still achieve my goal of knowing which mode the previous list was in?
You need to use something to differentiate one page from another, otherwise your server won't know what you're asking for.
You can POST your request: this will hide the URL parameters, but will hinder your back button functionality.
You can GET your request: this will make your URLs more "ugly" but you should be able to work around that by passing short, concise identifiers like www.example.com/listDetail?id=12
If you can set up mod_rewrite, then you can GET a URL like www.example.com/listDetails/12, and Apache will rewrite the request behind the scenes to look more like www.example.com/listDetails?id=12. The user will never see it; they will just see the original, clean/friendly version.
You said you don't have access to the server configuration -- I assume this is because you are on a shared server? Most shared servers already have mod_rewrite installed. And while the apache vhost is typically the most appropriate place to put rewrite rules, they can also be put in a .htaccess file within any directory you want to control. (Sometimes the server configuration disables this, but usually on a shared host, it is enabled) Look into creating .htaccess files and how to use mod_rewrite
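For example, a .htaccess along these lines (an untested sketch; the script name and id parameter are assumptions) maps the clean URL onto the query-string form:

```apache
RewriteEngine On
# /listDetails/12 -> /listDetails.php?id=12, invisibly to the visitor
RewriteRule ^listDetails/(\d+)$ listDetails.php?id=$1 [L,QSA]
```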

Using PHP/Apache to restrict access to static files (html, css, img, etc)

Lets say you have lots of html, css, js, img and etc files within a directory on your server. Normally, any user in internet-land could access those files by simply typing in the full URL like so: http://example.com/static-files/sub/index.html
Now, what if you only want authorized users to be able to load those files? For this example, lets say your users log in first from a URL like this: http://example.com/login.php
How would you allow the logged in user to view the index.html file (or any of the files under "static-files"), but restrict the file to everyone else?
I have come up with two possible solutions thus far:
Solution 1
Create the following .htaccess file under "static-files":
Options +FollowSymLinks
RewriteEngine on
RewriteRule ^(.*)$ ../authorize.php?file=$1 [NC]
And then in authorize.php...
if (isLoggedInUser()) readfile('static-files/'.$_REQUEST['file']);
else echo 'denied';
This authorize.php file is grossly over simplified, but you get the idea.
Solution 2
Create the following .htaccess file under "static-files":
Order Deny,Allow
Deny from all
Allow from 000.000.000.000
And then my login page could append an IP to that .htaccess file for each user that logs in. Obviously this would also need some kind of cleanup routine to purge old or no-longer-used IPs.
I worry that my first solution could get pretty expensive on the server as the number of users and files they are accessing increases. I think my second solution would be much less expensive, but is also less secure due to IP spoofing and etc. I also worry that writing these IP addresses to the htaccess file could become a bottleneck of the application if there are many simultaneous users.
Which of these solutions sounds better, and why? Alternatively, can you think of a completely different solution that would be better than either of these?
I would consider using a PHP loader to handle authentication and then return the files you need. For example instead of doing <img src='picture.jpg' /> Do something like <img src='load_image.php?image=picture.jpg' />.
Your image loader can verify sessions, check credentials, etc. and then decide whether or not to return the requested file to the browser. This will allow you to store all of your secure files outside of the web accessible root so nobody is going to just WGET them or browse there 'accidentally'.
Just remember to return the right headers in PHP and use something like readfile() to send the file contents to the browser.
I have used this very setup on several large-scale secure websites and it works like a charm.
Edit: The system I am currently building uses this method to load JavaScript, images, and video, but we aren't very worried about securing CSS.
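A hedged sketch of such a load_image.php (the storage directory and session key are assumptions):

```php
<?php
// load_image.php?image=picture.jpg - the loader described above.
// Maps a file extension to a Content-Type header value.
function contentTypeFor(string $filename): string
{
    $types = [
        'jpg'  => 'image/jpeg',
        'jpeg' => 'image/jpeg',
        'png'  => 'image/png',
        'gif'  => 'image/gif',
        'css'  => 'text/css',
        'js'   => 'application/javascript',
    ];
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));
    return $types[$ext] ?? 'application/octet-stream';
}

if (PHP_SAPI !== 'cli') {
    session_start();
    if (!isset($_SESSION['user_id'])) {       // verify the session first
        http_response_code(403);
        exit;
    }
    // basename() blocks directory traversal; files live outside the web root.
    $file = '/srv/secure-files/' . basename($_GET['image'] ?? '');
    if (!is_file($file)) {
        http_response_code(404);
        exit;
    }
    header('Content-Type: ' . contentTypeFor($file));
    header('Content-Length: ' . filesize($file));
    readfile($file);
}
```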
X-Sendfile
There's a module for Apache (and other HTTP servers) which lets you tell the HTTP server to serve the file you specify in a header in your php code:
So your php script should look like:
// 1) Check access rights code
// 2) If OK, tell Apache to serve the file
header("X-Sendfile: $filename");
2 possible problems:
You need access to rewrite rules (.htaccess enabled or direct access to config files)
You need the mod_xsendfile module added to your Apache installation
Here's a good answer in another thread:
https://stackoverflow.com/a/3731639/2088061
I have been thinking a lot about the same issue. I am equally unhappy with the PHP engine running for every small resource that is served out. I asked a question in the same vein a few months ago here, though with a different focus.
But I just had an awfully interesting idea that might work.
Maintain a directory called /sessions somewhere on your web server.
Whenever a user logs in, create an empty text file with the session ID in /sessions. E.g. 123456
In your PHP app, serve out images like this: /sessions/123456/images/test.jpg
In your htaccess file, have two redirect commands.
One that translates /sessions/123456/images/test.jpg into /sessions/123456?filename=images/test.jpg
A second one that catches any calls to //sessions/(.*) and checks whether the specified file exists using the -f flag. If /sessions/123456 doesn't exist, it means the user has logged out or their session has expired. In that case, Apache sends a 403 or redirects to an error page - the resource is no longer available.
That way, we have quasi-session authentication in mod_rewrite doing just one "file exists" check!
I don't have enough routine to build the mod_rewrite statements on the fly, but they should be easy enough to write. (I'm looking hopefully in your direction, @Gumbo :)
Notes and caveats:
Expired session files would have to be deleted quickly using a cron job, unless it's possible to check for a file's mtime in .htaccess (which may well be possible).
The image / resource is available to any client as long as the session exists, so no 100% protection. You could maybe work around this by adding the client IP into the equation (= the file name you create) and do an additional check for %{REMOTE_ADDR}. This is advanced .htaccess mastery but I'm quite sure it's doable.
Resource URLs are not static, and have to be retrieved every time on log-in, so no caching.
I'm very interested in feedback on this, any shortfalls or impossibilities I may have overlooked, and any successful implementations (I don't have the time right now to set up a test myself).
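A rough, untested guess at those two rules (the directory names and the hidden target folder are assumptions; the patterns will likely need tuning):

```apache
RewriteEngine On
# 1) No session file with this ID? Session expired or logged out: forbid.
RewriteCond %{DOCUMENT_ROOT}/sessions/$1 !-f
RewriteRule ^sessions/([0-9]+)/ - [F]
# 2) Session file exists: serve the real resource from a hidden folder.
RewriteRule ^sessions/[0-9]+/(.*)$ protected/$1 [L]
```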
Create a rewrite map that verifies the user's credentials and either redirects them to the appropriate resource or to an "access denied" page.
Maintaining the contents of the .htaccess files looks to be a nightmare. Also, your stated objective is to prevent non-authenticated users, not non-authenticated client IP addresses, from accessing this content, so the approach is not fit for purpose:
Multiple users can appear to come from the same IP address
A single user's session may appear to come from multiple addresses.
"I worry that my first solution could get pretty expensive on the server as the number of users and files they are accessing increases"
If you want to prevent your content from being leeched and don't want to use HTTP authentication, then wrapping all file access in an additional layer of logic is the only sensible choice. Also, you don't know that using PHP for this is a problem: have you tested it? I think you'd be surprised just how much throughput it can deliver, particularly if you use an opcode cache.
I'm guessing your 'simplification' of the wrapper addresses issues like mime type and caching.
C.
I wrote a dynamic web application and deployed it on WebSphere Application Server; here is how I secured my static files.
I first added
<login-config id="LoginConfig_1">
    <auth-method>FORM</auth-method>
    <realm-name>Form-Based Authentication</realm-name>
    <form-login-config>
        <form-login-page>/login.html</form-login-page>
        <form-error-page>/login_error.html</form-error-page>
    </form-login-config>
</login-config>
in web.xml, which will tell your web server to use form-based authentication (the login page code is given below).
Code for the login page:
<form id="form1" name="form1" method="post" action="j_security_check" style="padding: 0px 0px 0px 12px;">
Username:
<label>
<input name="j_username" type="text" class="font2" />
</label>
<br />
<br />
Password:
<span class="font2" >
<label>
<input name="j_password" type="password" class="font2" />
</label>
</span>
<br />
<br />
<label>
<input type="submit" class="isc-login-button" name="Login" value="Login" />
</label>
</form>
To make form-based login work, you have to configure your web server to use a particular user registry, which can be LDAP or a database.
You can declare your secure resources, and whenever a user tries to access those resources the container automatically checks whether the user is authenticated. You can even attach roles to the secure resources. To do this I added the following code to my web.xml:
<security-constraint>
    <display-name>Authenticated</display-name>
    <web-resource-collection>
        <web-resource-name>/*</web-resource-name>
        <url-pattern>/*</url-pattern>
        <http-method>GET</http-method>
        <http-method>PUT</http-method>
        <http-method>HEAD</http-method>
        <http-method>TRACE</http-method>
        <http-method>POST</http-method>
        <http-method>DELETE</http-method>
        <http-method>OPTIONS</http-method>
    </web-resource-collection>
    <auth-constraint>
        <description>Auth Roles</description>
        <role-name>role1</role-name>
        <role-name>role2</role-name>
    </auth-constraint>
</security-constraint>
<security-role>
    <role-name>role1</role-name>
</security-role>
<security-role>
    <role-name>role2</role-name>
</security-role>
So this code will not let a user see any static file (since the pattern is /*) until they are logged in under role1 or role2. This way you can protect your resources.
If you are using Apache, you can configure it as below in either the .htaccess or httpd.conf file. Below is an example that prevents access to *.inc files. It works great for me.
<Files ~ "\.inc$">
Order allow,deny
Deny from all
</Files>
For more detail, please refer to: http://www.ducea.com/2006/07/21/apache-tips-tricks-deny-access-to-certain-file-types/.
Assuming you want to protect all your static files and you have to serve them from inside your webroot, you can protect all HTTP methods except HEAD.
If the user is authorized, you make a HEAD request, pass through the headers, and send the file content as the body.
Sure, this is expensive, but you are protected and you keep the same behaviour.
I might have a suggestion based on an iframe and HTTP_REFERER, but it's not bulletproof, and it will depend on what exactly you want to protect with this access.
But in the case of preventing a full static page from being displayed without authentication, you can do as follows:
1 - use a PHP page to authenticate the user
2 - redirect to another PHP page containing a key in the URL and an iframe linking to your static content in the body:
<iframe src="static/content.html" />
3 - Then in your .htaccess you could check for the key inside the HTTP_REFERER like this:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !AUTH_KEY
RewriteCond %{REQUEST_URI} ^/path/to/protected/page$
RewriteRule . - [F]
4 - Finally, if you want to make it more dynamic and not use the same key every time, you could use a rewrite map as suggested in Ignacio Vazquez-Abrams' answer, or create a file using the user's IP as the filename, check whether the file exists using REMOTE_ADDR, then remove the file after a while.
But keep in mind that iframe + HTTP_REFERER behaviour might vary from one browser session to another, as well as REMOTE_ADDR, so it's limited...
