.htaccess: prevent php scripts from accessing parent/sibling directories

I'm not particularly experienced with .htaccess (outside of simple mod_rewrite and basic deny/access), and am unsure of how to approach the following issue:
I have a directory structure as follows:
/parentDirectory
    /childDirectoryOne
    /childDirectoryTwo
I have a domain that points to /parentDirectory (we'll call it parent.com), and separate subdomains for each of the child directories (we'll call them one.parent.com and two.parent.com respectively).
These are all located on a shared host. I need to be able to grant FTP access to the subdirectories, but the problem is that right now, someone could upload a PHP file to childDirectoryOne that scans the parent directory, thereby discovering its sibling directory, and can then move into the sibling directory and get sensitive information from files (like a dbConfig file).
What I have been attempting to do (with no success so far) is develop a set of .htaccess files that would prevent the scripts in the child directories from accessing the parent or sibling directories. I'm not even sure if this is possible. Unfortunately, my shared host has no support for setting up a chroot jail, so this is my last option for finding a solution (next to purchasing hosting for each and every FTP user so they can't access others' information).

It's considered bad practice to grant read, write and execute permissions on a folder to people you don't absolutely trust.
The ability to upload an arbitrary script and execute it on the server is a very big deal (them accessing another folder is the least of your worries). People can completely destroy your server and all sites on it, access your db, overwrite other pages in any site, and the list goes on.
I would recommend disabling PHP entirely for uploaded files. You can put this in your .htaccess:
php_flag engine off
That being said, if you really want to do it this way, you can use open_basedir. Note that php_admin_value cannot be set in .htaccess, so the block below belongs in the main server or vhost configuration:
<Directory /parentDirectory/childDirectoryOne>
    # the trailing slash matters: without it the value is treated as a mere prefix
    php_admin_value open_basedir "/parentDirectory/childDirectoryOne/"
</Directory>

NOTE!
You also need to restrict command execution; otherwise, with shell_exec(), exec(), and the like, you will be hacked. (Older advice says to use safe_mode, but it was deprecated and removed as of PHP 5.4; disable_functions is the modern equivalent.) BUT!! even that's not enough. Read here fully - https://puvox.software/blog/restrict-php-access-upper-directory/
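As a rough sketch of that hardening (disable_functions is a real directive, but the exact list below is an assumption you should tailor; it can only be set in php.ini or a php-fpm pool file, not in .htaccess):
; php.ini (or php_admin_value[disable_functions] in an fpm pool file)
disable_functions = exec,shell_exec,system,passthru,popen,proc_open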

Related

Prevent Apache/PHP from running code that affects another vHost

THE SITUATION
I have multiple folders in my /var/www/ directory.
Users are created that have control over a specific directory... /var/www/app1 belongs to app1:app1 (www-data is a member of the app1 group).
This works fine for what I want.
THE PROBLEM
If the app1 user uploads a PHP script that changes the file/folder permissions for something in app2's directory structure, the Apache process (as there's only one installed on the server) will be more than happy to run it, as it has the necessary permissions to access both the /var/www/app1 and /var/www/app2 folders and files.
EDIT:
To the best of my knowledge, something like this in /var/www/app1/includes/hack.php:
<?php
// note the leading 0: chmod() expects an octal mode, not decimal 777
chmod("/var/www/app2", 0777);
?>
The Apache process (owned by www-data) will run this, as it has permissions to change both /var/www/app1 and /var/www/app2 directories. The user app1 will then be able to cd /var/www/app2, rm -rf /var/www/app2, etc., which is obviously not good.
THE QUESTION
How can I avoid this cross-contamination of the Apache process? Can I instruct Apache to only run PHP scripts that affect the files/folders that reside within the relevant vHost root directory and below?
While open_basedir would help, there are several ways of bypassing that constraint. You could break a lot of functionality in PHP to close off all the backdoors, but a better solution is to stop executing the PHP as a user who has access to all the files. To do that, you need to use php-fpm with a separate process pool/uid/gid for each vhost.
You should still keep the uid executing the PHP separate from the uid owning the files, with a common group allowing default read-only access to the files.
You also need to have separate storage directories for session data.
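As a sketch, a per-site pool file might look like this (the pool name, uids, and paths are hypothetical, and the file's location varies by distribution):
; /etc/php/fpm/pool.d/app1.conf - one pool per vhost
[app1]
; execute as fpm-app1; the files themselves stay owned by user app1
user = fpm-app1
group = app1
; each pool listens on its own socket for the webserver
listen = /run/php/fpm-app1.sock
listen.owner = www-data
listen.group = www-data
pm = ondemand
pm.max_children = 5
; per-site confinement and a private session store
php_admin_value[open_basedir] = /var/www/app1/
php_admin_value[session.save_path] = /var/www/app1/sessions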
A more elaborate mechanism would be to use something like Apache Traffic Server in front of a container per owner, with each site running on its own instance of Apache - much better isolation, but technically demanding and somewhat more resource-intensive.
Bear in mind, if you are using MariaDB or similar, that the DBMS can also read and write arbitrary files (SELECT ... INTO OUTFILE / LOAD DATA INFILE).
UPDATE
Rather than the effort of maintaining separate containers, better isolation could be achieved with less effort by setting the home directory of the php-fpm uid appX to the base directory of the vhost (which should contain, not be, the document_root - see below) and using AppArmor to constrain access to the common files (e.g. .so libs) and @{HOME}. Hence each /var/www/appX might contain:
.htaccess
.user.ini
data/ (writeable by fpm-appX)
html/ (the document root)
include/
sessions/ (writeable by fpm-appX)
You should add an open_basedir directive to each site's vhost file. The open_basedir directive limits the directories that a site can access.
You can read more about open_basedir in the PHP manual.
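A minimal sketch of such a vhost entry, assuming mod_php (the names and paths are placeholders; the trailing slashes matter, because values without them are treated as mere prefixes):
<VirtualHost *:80>
    ServerName one.parent.com
    DocumentRoot /parentDirectory/childDirectoryOne
    # confine PHP to this site's tree plus a shared tmp directory
    php_admin_value open_basedir "/parentDirectory/childDirectoryOne/:/tmp/"
</VirtualHost>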

php access files outside of apache

I have a project where Red5 is recording videos. I need PHP to be able to access the videos and move them so they can be accessed by HTML.
How can I do this?
I found this post: Accessing files outside the document root with Apache
But it involves updating some file that was never specified. And I'm not sure it is a viable solution in this case anyway.
PHP by default can already access files outside the web root, unless restricted with an open_basedir directive (or safe_mode, but let's hope you're not in that cage).
It's normally good practice to add an open_basedir restriction within a VirtualHost configuration. You can specify multiple directories, separated by : on Linux and ; on Windows.
php_admin_value open_basedir /var/www/s/stage:/usr/share/php:/your/dir
To access those files, use either an absolute path or a path relative to the position of the PHP file called. (So you'll have to ../ to reach levels above.)
Also be sure that the directories you want to write to are owned by the webserver user and have write permission.
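For instance, a minimal sketch of such a move (the Red5 and web paths are assumptions; both must be allowed by open_basedir, if one is set):
<?php
// hypothetical locations: where Red5 writes recordings, and a
// directory under the document root that Apache serves
$source = '/opt/red5/webapps/oflaDemo/streams/recording1.flv';
$target = '/var/www/example.com/public/videos/recording1.flv';

// rename() moves the file; across filesystems copy() + unlink()
// may be needed instead
if (!rename($source, $target)) {
    die('Move failed: check permissions and open_basedir.');
}
chmod($target, 0644); // make sure the webserver can read it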

Filesystem permissions making a cURL-based caching script break

I'm writing this cURL script in PHP. Its purpose is to take a product or category code given to it (which type of code it is is ambiguous at that point; sure, it's possible for a category and a product to have the same code, but that's what business rules are for and what this question is NOT about), and then attempt to load either a product or category page on our shopping cart with it. Whichever page returns a 200 response then gets its output cached into an HTML file in the DocumentRoot.
Problem is, the DocumentRoot isn't owned by Apache, and I don't feel comfortable giving global write permissions to the DocumentRoot, so while the script works for the most part, the page doesn't get cached.
I do not have root or su access to the server and cannot get either. I tried writing the file to the /tmp/ directory and then moving it, but the permissions won't let me. Is there a way around this without opening up a security hole? If not, would this be possible with a Perl CGI script or would I face the same problem?
If Apache doesn't have the rights to do something, then there's nothing you can do to bypass it short of putting in an suid program to force a permission set, using suPHP to do the same, or just granting the required permissions.
Another option is to grant Apache write permissions on a SUBdirectory of the DocumentRoot, and then use some mod_rewrite magic to make requests for those cached files get transparently rewritten to use the subdir instead. That way you've got a writeable directory, but don't have the issues of making the parent document root writeable.
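A sketch of that rewrite, placed in the DocumentRoot's .htaccess (the cache/ subdirectory name is an assumption):
# serve a previously cached copy transparently, if one exists
RewriteEngine On
# never rewrite requests that already point into the cache
RewriteCond %{REQUEST_URI} !^/cache/
# $1 is the capture from the RewriteRule pattern below
RewriteCond %{DOCUMENT_ROOT}/cache/$1.html -f
RewriteRule ^(.+)$ /cache/$1.html [L]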

folder to save files that are retrieved with require in document tree

I'm building a website based on PHP and I want to ask where to put files that are retrieved with a require statement, so that they cannot be accessed by users with their browser.
(for example a php file that connects to my database)
EDIT: Actually, I think the better way is to put them outside the public root, because the Apache tutorial says .htaccess files have a slowdown impact. It can be done by adding a ../
for example require("../myFile.php"); (at least this works on my server)
That depends on the web server configuration. Usually (or at least in all cases I witnessed), you have an account root directory which cannot be accessed by users with their browser, with inside it a folder containing all public material (often called htdocs, httpdocs, public_html or something of the kind). Often, you can place your PHP include files in that account root, and then require them using require("../include_file.php");
However, it depends on the configuration whether PHP can include files outside your public folder. If not, a .htaccess file is your best option.
If you place those files outside the document root of your webserver, users cannot access them with a browser.
If you use apache you can also place these files in a directory to which you do not allow access with a .htaccess file.
And as a last remark, if your files do not generate output, there is no way users can check the contents of the files.
If you mean source code, then it is not visible to users. If you want to hide folder contents, use the .htaccess directive Options -Indexes. If you can see PHP source in the browser, your server configuration is wrong and it is not parsing PHP files.
You normally place them into a directory that is not accessible over the webserver (outside the document or web root). Sometimes called a "private" directory.
You then include/require the file from that path as PHP has still access to the files.
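For example, with the dbConfig file from the first question placed in such a private directory (a sketch; the layout is an assumption):
<?php
// public/index.php: pull the DB config from the non-served "private"
// directory one level above the document root
require __DIR__ . '/../private/dbConfig.php';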
See also:
placing php script outside website root
disable access to included files - For a method if you're not able to place the files in a private directory.
Just make them secure with .htaccess!
There are very clear tutorials on protecting files with a password. If you don't need direct browser access to the files, or only your scripts need access, just block them completely by changing the code between
<Files xy>
change this bit here
</Files>
to
Order allow,deny
Deny from all
Then you won't need your .htpasswd file anymore either!
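Put together, a sketch for one sensitive file (the filename is an example from the earlier question; on Apache 2.4+ the equivalent is Require all denied):
# .htaccess: deny all direct web access to this file
<Files "dbConfig.php">
    Order allow,deny
    Deny from all
</Files>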
You need to put these files outside of public-facing folders on your web server. Most (all?) web hosts should have the capability to change the document root of the website.
For example, let's say that all of your files are served from the following directory on your host: /home/username/www/example.com/
This means that anything that resides inside that directory is visible to the internet. If you went to http://example.com/myfile.png it would serve the file at /home/username/www/example.com/myfile.png.
What you want to do is create a new directory called, for example, public which will serve your files, and point the document root there. After you've done that, the request for http://example.com/myfile.png will be served from /home/username/www/example.com/public/myfile.png (note the public directory here). Now, anything else that resides within the example.com directory won't be visible on your website. You can create a new directory called, for example, private where your sensitive include files will be stored.
So say you have two files: index.php, which serves your website, and sensitive.php which contains passwords and things of that nature. You would set those up like this:
/home/username/www/example.com/public/index.php
/home/username/www/example.com/private/sensitive.php
The index.php file is visible to the internet, but sensitive.php is not. To include sensitive.php, you just include the full file path:
require_once("/home/username/www/example/com/private/sensitive.php");
You can also set your application root (the root of your website's files, though not the root of the publicly accessible files) as a define, possibly in a config file somewhere, and use that, e.g.:
require_once(APP_ROOT . "sensitive.php");
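For instance, a minimal sketch of that pattern, using the example paths from above:
<?php
// config.php: define the private root once...
define('APP_ROOT', '/home/username/www/example.com/private/');

// ...then any script that has included config.php can simply do:
require_once(APP_ROOT . "sensitive.php");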
If you can't change the document root, then what some frameworks do is use a define to note that the file shouldn't be executed directly. You create a define in any file you want as an entry point to your application, usually just index.php, like so:
if (!defined('SENSITIVE')) {
    define('SENSITIVE', 'SENSITIVE');
}
Then, in any sensitive file, you check that it's been set, and exit if it hasn't, since that means the file is being executed directly, and not by your application:
if (!defined('SENSITIVE')) {
    die("This file cannot be accessed directly.");
}
Also, make sure that your include files, when publicly accessible (and really, even if not), have a proper extension, such as .php, so that the web server knows to execute them as PHP files, rather than serving them as plain text. Some people use .inc to denote include files, but if the server doesn't recognize them as being handled by PHP, your code will be publicly visible to anyone who cares to look. That's not good! To prevent this, always name your files with a .php extension. If you want to use the .inc style to show your include files, consider using .inc.php instead.

Protect logfiles for php applications

I log sensitive information in a log file, let's call it "mylogfile.log". This file should under no circumstances be accessible from the outside/web.
I already protect it using a .htaccess file, but I would like some extra safeguard, like using a file extension that is protected by the system. Is there any such thing?
The reason for the extra security is that this webapp is distributed to clients that could change or remove the .htaccess file. Also, .htaccess override needs to be enabled in Apache.
You should put it outside of the document root.
If /var/www/your-site.com/public maps to the URI your-site.com (public/index.html --> your-site.com/index.html, etc.), then log files will not be readable from the web if you place them in /var/www/your-site.com/logs
When distributing an app like this, given your limited and controlled space, I would always make sure not to use your "base folder" as the document root of the webserver, just to get some privacy around it.
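As a sketch, using the your-site.com layout from the answer above (the message format is illustrative):
<?php
// type 3 tells error_log() to append the message to the file given as
// the third argument; the file lives outside the document root
$logFile = '/var/www/your-site.com/logs/mylogfile.log';
error_log(date('c') . " sensitive event\n", 3, $logFile);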
