So I have a fairly noobish question. I have been reading up a lot around the subject but can't quite find the answer I want, so bear with me...
I have a fairly simple website that I have been designing, consisting of the following:
1) HTML and PHP files that I want the user to be able to access directly by typing the URL into the browser.
2) HTML files that are only to be viewed inside an iframe in 1) (don't ask me why I used iframes)
3) PHP files that are called by 1), e.g. when form data is submitted. I want 2) and 3) to be accessible to 1), but not directly accessible to the user by typing in the URL.
4) images and includes, etc.
5) maybe this is a different issue altogether, but I also have a MySQL database.
I understand that I can control access to files by putting them in private/public folders in the website directory. My question is: how should my directory structure look, and where should I put 1), 2), 3), etc.?
Thanks a lot for your help.
Your directory structure does not matter. Any URL that is accessible to some users is accessible to all users. You only have control over the content of that URL.
If you really need to limit access to the content loaded by 1) you have to use PHP to serve the content. That PHP script can check some parameters or login credentials or something that makes sure the URL has been loaded by 1).
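As a rough sketch of what such a gatekeeper script could look like (the session flag name and the file path here are assumptions, not anything from your setup):
<?php
// gatekeeper.php - serves a protected page only to logged-in users.
// The session flag 'authenticated' and the file path are assumptions.
session_start();

if (empty($_SESSION['authenticated'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Access denied');
}

// The protected file lives outside the document root, so this script
// is the only way a browser can ever reach it.
readfile(__DIR__ . '/../private/protected_page.html');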
However, it's hard to give you a clear answer since you don't describe the concrete problem you're having. It makes a big difference, for example, how secure the method needs to be: it's rather simple to check whether a URL is loaded inside a frame using JavaScript, but that check is not hard to circumvent.
Your httpdocs directory is your Apache DocumentRoot (found in /etc/httpd/conf/httpd.conf) or your vhost DocumentRoot (if you've got vhosts defined), so assuming Linux:
1), 2) and 4) should go into /var/www/vhosts/sitename.com/httpdocs/ - these are directly accessible through the browser.
3) should go into /var/www/vhosts/sitename.com/library/. When the user submits data it should hit a handler page (the user must be able to "see" this page), and that page includes the necessary files from this library directory. As long as your server is configured to run PHP for all *.php scripts this is probably unnecessary, as there is little advantage to be gained from hiding PHP scripts. If you don't want them invoked directly but would like to leave them in a publicly accessible area, try:
public script:
<?php
// Flag that marks this request as having come through a public entry point.
define('INVOKED_BY_SCRIPT', true);
...
include "../library/hiddenScript.php";
"hidden" script:
<?php
// Refuse to run unless the public script defined the flag above.
if (!defined('INVOKED_BY_SCRIPT') || true !== INVOKED_BY_SCRIPT) {
    echo "Cannot invoke directly";
    exit;
}
Your MySQL database should be in /var/lib/mysql, or wherever it's been installed by default. Be sure to run the MySQL security script mysql_secure_installation to remove default passwords and test databases.
Everything except the includes should be in your public_html directory.
For 3), it is not possible to have a PHP script that can be referenced by a form but is not accessible to a user typing the address into the address bar. The best you can do is check the POST variables to see whether anything has been posted. You could check the HTTP_REFERER variable, but I would not recommend this since it cannot be relied on.
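A minimal sketch of that kind of check (the field name is made up; note this still does not stop someone POSTing to the script directly):
<?php
// handler.php - bails out unless it actually received a form submission.
if ($_SERVER['REQUEST_METHOD'] !== 'POST' || empty($_POST)) {
    header('HTTP/1.0 405 Method Not Allowed');
    exit('This script only handles form submissions.');
}

// Process the submitted data (the field name 'comment' is hypothetical).
$comment = isset($_POST['comment']) ? $_POST['comment'] : '';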
You can't make HTML files viewable only inside an iframe.
There are NO files called by 1). It's the user's browser that calls your files.
So, just leave your directory structure as is, it's okay.
What you're probably looking for is a way to "hide" your executable files outside of the document root, which you can do with a directory structure something like this:
public_html/          <-- (document root)
    index.php         <-- (publicly accessed index file)
    images/
    htmlstuff/
private_index.php     <-- (application's "real" index file)
application/
tmp/
Then, for public_html/index.php, you'd just have:
<?php
// Hand everything off to the application's real index file, which sits
// outside the document root and so can't be requested directly.
require_once('../private_index.php');
I have an SPA that uses AJAX calls to assemble content from multiple PHP files. I can add the following to the main application's config file to redirect users who are not logged in back to the login page, as long as they went through the portal to look at things.
// Verify login before serving this resource.
if (!isset($_SESSION["loggedIn"]) || $_SESSION["loggedIn"] !== true) {
    // Kick anonymous visitors back to the login page.
    echo 'resource denied <script>window.location.href = "https://' . $_SERVER['SERVER_NAME'] . '/login.php";</script>';
    exit();
}
The problem comes in that there are tons of views, models, controllers, and third-party widgets that can still be accessed directly if someone simply tries scanning the site for common file architectures.
Is there a way to use something like a .htaccess or php.ini file to automatically append this login check to all of the PHP files in a directory so that I don't have to paste it into each and every page?
Barring that, is there a way to set my chmod settings to only allow indirect access to those files, such that PHP scripts running on the server can use them but they cannot be directly visited? Thanks.
[EDIT]
Moving files outside of my public folder did not work because it broke the AJAX.
I also tried auto_prepend_file in a .htaccess file, but this resulted in a 500 error. I am using a VPS that apparently won't let me do an AllowOverride All in my Apache pre_virtualhost_global.conf; otherwise, I think that would have been the right way to do this.
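For anyone attempting the same thing, the auto_prepend_file approach would look roughly like this (the path to the login-check script is hypothetical, and the .htaccess form only works where mod_php is active and AllowOverride permits it):
# In php.ini (applies to every script PHP serves):
auto_prepend_file = "/var/www/vhosts/example.com/private/login_check.php"

# Or per directory in .htaccess, under mod_php with AllowOverride Options (or All):
# php_value auto_prepend_file "/var/www/vhosts/example.com/private/login_check.php"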
Setting the chmod permissions of my resource folders to 0750 appears to allow the AJAX calls to execute without allowing direct access to the files. If anyone knows of any other security caveats to be aware of when doing this, let me know. Thanks.
I have been reading about where to securely save a PHP file that has my MySQL database connection password. I understand from the forums that it should be saved in a folder above the webroot. I have a cloud server from a hosting company, and I have root access.
The path to the public files is as follows:-
/var/www/vhosts/mydomain.co.uk/httpdocs/afile.php
Say I have a PHP file (containing my password) called sqlpassfile.php
Would the following be okay as a place to securely store it, i.e. in a new folder called NEWFOLDER after vhosts?
/var/www/vhosts/NEWFOLDER/sqlpassfile.php
Sorry for a simple question, but I just want to make sure it's secure.
Thanks
All the modern PHP frameworks you will find do indeed store their whole code base outside the web root.
They don't only store information like credentials there; they store all the business logic of the application outside the web root. They then expose only a facade file (most of the time an index.php or app.php) that, with the help of controllers, handles every request and routes you to the right page/content, plus, of course, all the static content the site uses (your design images, your CSS, your JS, ...).
For example:
Zend Framework uses a public folder where you will find an index.php and all the static files.
Symfony uses a web folder where you will find two files, app.php and app_dev.php, and again all of the static files.
So in your case you could do:
/var/www/vhosts/example.com/httpdocs/ is the web root of your server
/var/www/vhosts/example.com/app/ stores all the PHP code you need
/var/www/vhosts/example.com/app/config stores all your configuration files, including your credentials file, which you could call sql_config.php
/var/www/vhosts/example.com/httpdocs/afile.php would then require_once '../app/config/sql_config.php';
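A rough sketch of that layout (constant names and credential values are just placeholders):
<?php
// /var/www/vhosts/example.com/app/config/sql_config.php
// Lives outside httpdocs, so a browser can never request it directly.
define('DB_HOST', 'localhost');
define('DB_USER', 'dbuser');
define('DB_PASS', 'secret');
define('DB_NAME', 'mydatabase');
and the public file simply pulls it in:
<?php
// /var/www/vhosts/example.com/httpdocs/afile.php
require_once __DIR__ . '/../app/config/sql_config.php';
$db = new mysqli(DB_HOST, DB_USER, DB_PASS, DB_NAME);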
Usually, people just save the database connection information in a regular PHP file; for example, WordPress saves the connection info in its wp-config.php. That's fine simply because nobody is able to see your password by visiting that PHP page; nothing is returned.
To make it more secure, you can deny access to PHP files in case mod_php ever stops working. Try this in your .htaccess:
<IfModule !mod_php5.c>
    <Files *.php>
        Order Deny,Allow
        Deny from all
    </Files>
</IfModule>
Please also have a look at this post:
Password in file .php
Whether your method is safe depends on the configuration of the server, something that providers are not often very good at documenting.
Your first line of defence is keeping what is essentially configuration data inside a file named with a .php extension. That way, if it is accessible from a browser, the webserver will execute the file rather than returning the data. You certainly want at least two levels of security on your data (each of which you have tested independently).
Considering the path you have chosen, /var/www/vhosts/NEWFOLDER/sqlpassfile.php, what happens if you request http://NEWFOLDER/sqlpassfile.php from the server? (In most cases nothing, but once in a while....) Generally it's better practice to keep it well clear of the directories your webserver uses.
You should recognize the first tag as an opening PHP tag (if you don't, you should probably learn PHP). What follows is a small check that makes sure the file is being included by Kohana. It stops people from accessing files directly from the URL.
http://kohanaframework.org/3.2/guide/kohana/tutorials/hello-world
Let's assume that your webserver's DocumentRoot is /srv/www and you put your example code under /srv/www/application/classes/controller/hello.php.
"Stop accessing files directly from the URL" means that if a user now navigates to www.example.com/application/classes/controller/hello.php, it will not run the script; instead it will display 'No Direct Script Access', since SYSPATH is not defined.
http://kohanaframework.org/3.2/guide/kohana/flow
This may be a really stupid question... I started worrying last night that there might be some way to view PHP files on a server via a browser or some other means on a client machine.
My worry is, I have an include file that contains the database username and password. If there were a way to put the address of this file into a browser or some other system and see the code itself, then it would be an issue for obvious reasons.
Is this a legitimate concern?
If so how do people go about preventing this?
Not if your server is configured right. I think discussion of how that is done belongs on Server Fault.
To add on to the other answers:
If you use a file extension like .inc there's indeed a higher risk. Can you open the file directly in your browser?
The most important advice is missing:
Only the files that should be accessed by a browser should be in a publicly accessible location. All the other code (and configuration) should be in a completely separate directory.
For example
root
- webroot
- includes
- config
Only 'webroot' is exposed by your webserver (Apache). Webroot can, for example, contain a single index.php, along with all your assets (JavaScript, CSS, images).
Any code index.php needs to load comes from 'includes' and all the configuration from 'config'. There's no way a user could ever directly access anything from those 2 directories, provided this is done correctly.
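A minimal sketch of what that index.php might do (the file names under 'includes' and 'config', and the run_application() function, are made up for illustration):
<?php
// root/webroot/index.php - the only script the webserver exposes.
require_once __DIR__ . '/../config/config.php';       // settings, DB credentials
require_once __DIR__ . '/../includes/bootstrap.php';  // the rest of the application

// bootstrap.php decides which page to render; run_application() is hypothetical.
run_application();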
This depends on the file extension you have given the include file.
If the extension is one that is known and executed by the web server, it will be protected. If you browse to the file, the server will try to execute the code rather than just returning it as plain text.
If the extension is not known by the web server it will serve it as plain data, so anyone (who can guess the file name) can browse to the file and see the source code.
A directory traversal vulnerability can be used to obtain files off of the remote machine. Alternatively, an attacker can use MySQL-based SQL injection to read files using load_file(). You can also test your system with w3af's urlfuzzer, which will look for "backup files" such as index.php.zip. Also make sure that all files have .php extensions; a .inc file can be viewed by the public. I would also disable Apache directory listing.
Normally there should be no way to view the PHP files remotely... it would be absolutely pointless. This completely depends on what web server you are using and how it's set up, though.
Having looked around I can see that it is possible to protect a directory via a .htaccess file by adding these lines:
Order allow,deny
Deny from all
This apparently protects the directory so that only local, non-web access is possible.
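For reference, on Apache 2.4 and later the equivalent directive (from mod_authz_core) is:
Require all denied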
This allows me to keep my includes in a subdirectory of the main site directory, which is good for organisation, and it can be used on projects where I do not have access to folders outside the web root.
Does anyone else use this method?
Just for good measure I've set the directory permissions to execute only.
And the include extension is PHP as suggested by others.
I would like to ensure that any scripts that are trying to "include" my database connection file are located under my own domain. I don't want a hacker to include the database connection file into their malicious script and gain access to my database that way. My connection file's name is pretty easy to guess: it's called "connect.php". So, without renaming it and taking the security-through-obscurity route, how can I protect it by making sure all connection requests are made by scripts residing under my own domain name? How can this be checked using PHP?
Generally speaking, if someone tries to include a file on your domain, they will see the results of the execution of that file. What do you see when you load the connect.php script in your web browser? That's what they'll see as well if they try to include it as a remote file.
That said, it's generally a good idea to keep important files outside of your public web space. So, if your website is /var/www/yoursite/, keep your connect.php in /some/dev/dir/yoursite and include it from your pages using require_once '/some/dev/dir/yoursite/connect.php';
thetaiko's answer addresses the fundamental issues here - but if anyone else has access to run code on the server (i.e. it's a shared server) then access to the file will depend on how the server is configured.
There are lots of ways that access might be constrained - e.g. suPHP, open_basedir, multiple chrooted servers. The only way to find out what's going on for sure is to cast yourself in the role of the hacker and see if you can access files outside your designated area.
C.
What do you mean by including your connection file? If a script does include "connect.php", then it can see the source code of the file, so whatever security measures you add to that file will be pointless, as it will look something like:
if ($notFromHostname)
{
    echo "DONT LOOK AT THIS";
    die();
}
define('DB_PASS', "myPassword");
...
And the "hacker" will clearly be able to see your password. You are probably better off using something like iptables to deny hosts that are not from a specific domain.
Are you on a shared server and don't want other users of the same server instance to be able to get at your files? That'd be up to your server provider, then, to provide some sort of chroot or virtual system to keep your things in. For Apache, mod_suid can accomplish this nicely, and each vhost gets its own userid and permissions set.
If you want external users to not be able to get at your files, then unless you've badly munged your code or the server's badly misconfigured, all they'll get when they visit http://yourserver.com/connect.php is a blank page.
No other user than yourself should have access to your PHP files in any way, as Felix mentioned. However, this is how you'd check in PHP:
if ($_SERVER['SERVER_NAME'] != "example.com")
    die("I've been kidnapped!");