Protect log files for PHP applications - php

I log sensitive information in a log file; let's call it "mylogfile.log". This file should under no circumstances be accessible from the outside/web.
I already protect it by using a .htaccess file, but I would like some extra safeguard, like using a file extension that is protected by the system. Is there any such thing?
The reason for the extra security is that this web app is distributed to clients, who could change or remove the .htaccess file. Also, .htaccess overrides need to be enabled in Apache for that protection to work at all.

You should put it outside of the document root.
If /var/www/your-site.com/public is mapped to the URI your-site.com (public/index.html --> your-site.com/index.html, etc.), then log files placed in /var/www/your-site.com/logs will not be readable over the web.
When distributing an app like this, I would always make sure, given your limited and controlled space, that the app's "base folder" is not itself the document root of the web server, just to get some privacy around it.
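As a minimal sketch (assuming the layout above, where /var/www/your-site.com/public is the document root and the log directory sits beside it):

<?php
// Hypothetical layout: the document root is /var/www/your-site.com/public,
// so nothing under /var/www/your-site.com/logs is reachable by URL.
$logFile = '/var/www/your-site.com/logs/mylogfile.log';

// Append a timestamped entry; LOCK_EX guards against interleaved writes.
file_put_contents($logFile, date('c') . " sensitive entry\n", FILE_APPEND | LOCK_EX);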

Related

Protect file in web root but give access from php

I have a situation where I want to protect a file from public access, but enable read and write from php. The file contains sensitive information like passwords.
The problem is that:
I cannot put the file outside the web root (a server security restriction blocks PHP access there);
I would like to avoid a MySQL database;
I would also like to avoid .htaccess files.
So if I make a folder, say private, in the web root, and do
chmod 700 private
and then, with the file to protect being private/data, do
chmod 700 private/data
will this be a safe setup? That is, can I read and write the file from PHP while it stays inaccessible to the public?
PHP runs as the same user as the web server, so if PHP can read it, so can your web server (and vice versa).
If you don't want to use .htaccess, there is another trick: save the file as a .php file. Even if someone accesses the file from the web, they can't see the source; they will just get a blank page, or maybe an error, depending on what exactly is in the file.
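A minimal sketch of that trick (the file name and keys here are made up): put the data inside a .php file that produces no output when requested directly:

<?php
// secrets.php (made-up name) - requesting this file over HTTP executes it
// as PHP and yields an empty page; the values below are never sent out.
return array(
    'db_user' => 'app',
    'db_pass' => 'change-me',
);

Your own scripts then load it with $config = require 'secrets.php'; while a direct browser request just gets a blank page.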
If you're running suPHP or FastCGI PHP, you can use a setup similar to what you've described to limit access to files. Otherwise, PHP will run as the same user as the web server, and any file PHP can access is also accessible via URL.
If you want to keep the restrictions you stipulated (which are rather unusual), and assuming you do not have access to Apache config directives, consider adding the PHP user to a dedicated group and giving only that group rights to the file, so that Apache itself cannot read it (as long as it is not running as root/wheel).
Or make it a valid .php file (so PHP is invoked whenever the file is requested) that returns nothing or redirects when accessed directly. Or just encrypt it.

.htaccess: prevent php scripts from accessing parent/sibling directories

I'm not particularly experienced with .htaccess (outside of simple mod_rewrite and basic allow/deny), and am unsure of how to approach the following issue:
I have a directory structure as follows:
/parentDirectory
    /childDirectoryOne
    /childDirectoryTwo
I have a domain that points to /parentDirectory (we'll call it parent.com), and separate subdomains for each of the child directories (we'll call them one.parent.com and two.parent.com respectively).
These are all located on a shared host. I need to be able to grant FTP access to the subdirectories, but the problem is that right now someone could upload a PHP file to childDirectoryOne that scans the parent directory, thereby discovering its sibling directory, and could then move into that sibling directory and read sensitive information from files (like a dbConfig file).
What I have been attempting to do (with no success so far) is develop a set of .htaccess files that would prevent scripts in the child directories from accessing the parent or sibling directories. I'm not even sure this is possible. Unfortunately, my shared host has no support for setting up a chroot jail, so this is my last option for finding a solution (next to purchasing hosting for each and every FTP user so they can't access others' information).
It's considered bad practice to grant read, write, and execute permissions on a folder to people you don't absolutely trust.
The ability to upload an arbitrary script and execute it on the server is a very big deal (scripts reaching another folder is the least of your worries). People can completely destroy your server and all sites on it, access your DB, overwrite pages on any site, and the list goes on.
I would recommend disabling PHP entirely for uploaded files. You can put this in your .htaccess:
php_flag engine off
That being said, if you really want to do it this way, you can use open_basedir (note that php_admin_value only works in the server or virtual host configuration, not in .htaccess):
<Directory /parentDirectory/childDirectoryOne>
php_admin_value open_basedir "/parentDirectory/childDirectoryOne"
</Directory>
NOTE!
You also need to lock down functions like shell_exec() and exec() (historically via safe_mode, which was removed in PHP 5.4; nowadays via disable_functions), otherwise open_basedir can be bypassed. BUT!! even that's not enough. Read here fully: https://puvox.software/blog/restrict-php-access-upper-directory/
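To sanity-check that the restriction is active, a script inside the confined directory can try to step outside it (a sketch; paths follow the hypothetical layout above):

<?php
// Runs under /parentDirectory/childDirectoryOne with
// open_basedir set to that same directory.
var_dump(ini_get('open_basedir'));

// This path lies outside the allowed tree, so PHP emits an open_basedir
// warning and file_get_contents() returns false.
$data = @file_get_contents('/parentDirectory/childDirectoryTwo/dbConfig.php');
var_dump($data); // bool(false)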

Where can I store logs in PHP app so that they cannot be accessed via HTTP?

Sorry if this is a trivial question.
I am kind of new to PHP and I'm creating a project from scratch. I need to store my application logs (generated using log4php) as files, and I don't want them to be public.
They are currently stored in a subfolder under my PHP application folder (/myAppFolder/logs), so they are served by Apache.
Where shall I store them, or what shall I do, to keep them from being served as content by Apache?
You can either keep them in a directory above the web root or, if you're on a shared host or can't place files above the root for whatever reason, put them in a directory that denies all HTTP access.
So you could have a folder called "secret_files" with a .htaccess file sitting inside:
.htaccess:
deny from all
This prevents HTTP access to the files and subfolders in that folder. (On Apache 2.4+, the equivalent directive is Require all denied.)
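PHP itself is unaffected by that .htaccess, since it goes through the filesystem rather than HTTP. A small sketch (folder name from above; the log file name is made up):

<?php
// secret_files/ rejects all HTTP requests via its .htaccess, but the PHP
// process reads and writes it through the filesystem, not through HTTP.
$log = __DIR__ . '/secret_files/app.log';
file_put_contents($log, date('c') . " started\n", FILE_APPEND);
echo file_get_contents($log); // works here; the same path over HTTP returns 403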
Somewhere not under the public root!?
This is more of a server config question, as it depends on your server, but in Apache you could use the custom log directives to set the location. So if you have
/www/myapp
create
/www/log
and put them there instead. You need control over the Apache config to do this, so look up your web host's docs to find out how.
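If the logs in question are PHP's own errors rather than Apache's access logs, you can also point PHP at that directory from within the application, without touching Apache's log directives (a sketch, assuming /www/log exists and is writable by the web server user):

<?php
// Route PHP's own error log to /www/log instead of under /www/myapp.
ini_set('log_errors', '1');
ini_set('error_log', '/www/log/php-errors.log');

trigger_error('test entry', E_USER_NOTICE); // appended to /www/log/php-errors.log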

folder to save files that are retrieved with require in document tree

I'm building a website based on PHP and I want to ask where to put files that are retrieved with a require statement, so that they cannot be accessed by users with their browser.
(for example a php file that connects to my database)
EDIT: Actually, I think the better way is to put them outside the public root, because the Apache tutorial says .htaccess has a slowdown impact. It can be done by adding ../ to the path,
for example require("../myFile.php"); (at least this works on my server).
Best regards to all
That depends on the web server configuration. Usually (or at least in all cases I have witnessed), you have an account root which cannot be accessed by users with their browser, and inside it a folder containing all public material (often called htdocs, httpdocs, public_html, or something of the kind). Often, you can place your PHP include files in that account root and then require them using require("../include_file.php");
However, it depends on the configuration whether PHP can include files outside your public folder. If not, a .htaccess file is your best option.
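One caveat with a relative path like ../include_file.php: PHP resolves it against the current working directory, which is not always the directory of the including script. Anchoring it on __DIR__ sidesteps that (a small sketch):

<?php
// __DIR__ is the directory of *this* file, so the path resolves the same
// way no matter which script started the request.
require __DIR__ . '/../include_file.php';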
If you place those files outside the document root of your web server, users cannot access them with a browser.
If you use Apache, you can also place these files in a directory to which you deny access with a .htaccess file.
And as a last remark: as long as your files generate no output, there is no way users can see their contents, since the server executes them rather than serving the source.
If you mean the source code: it is not visible to users. If you want to hide a folder's contents, use the .htaccess directive Options -Indexes to suppress the file listing. If you can see PHP source from the browser, your server configuration is wrong and it is not parsing PHP files.
You normally place them in a directory that is not accessible over the web server (outside the document or web root), sometimes called a "private" directory.
You then include/require the file from that path, since PHP still has access to the files.
See also:
placing php script outside website root
disable access to included files - for a method to use if you're not able to place the files in a private directory.
Just make them secure with .htaccess!
Here's a very clear tutorial for protecting files with a password. If you don't need direct access to the files per browser, or only your scripts need access, just block them completely by changing the code between
<Files xy>
change this bit here
</Files>
to
Order allow,deny
Deny from all
Then you won't need your .htpasswd file anymore either!
You need to put these files outside of public-facing folders on your web server. Most (all?) web hosts should have the capability to change the document root of the website.
For example, let's say that all of your files are served from the following directory on your host: /home/username/www/example.com/
This means that anything that resides inside that directory is visible to the internet. If you went to http://example.com/myfile.png it would serve the file at /home/username/www/example.com/myfile.png.
What you want to do is create a new directory called, for example, public which will serve your files, and point the document root there. After you've done that, the request for http://example.com/myfile.png will be served from /home/username/www/example.com/public/myfile.png (note the public directory here). Now, anything else that resides within the example.com directory won't be visible on your website. You can create a new directory called, for example, private where your sensitive include files will be stored.
So say you have two files: index.php, which serves your website, and sensitive.php which contains passwords and things of that nature. You would set those up like this:
/home/username/www/example.com/public/index.php
/home/username/www/example.com/private/sensitive.php
The index.php file is visible to the internet, but sensitive.php is not. To include sensitive.php, you just include the full file path:
require_once("/home/username/www/example/com/private/sensitive.php");
You can also set your application root (the root of your websites files, though not the root of the publicly accessible files) as a define, possibly in a config file somewhere, and use that, e.g.:
require_once(APP_ROOT . "sensitive.php");
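For instance, a config file could establish the define once, and every entry point loads it first (a sketch; config.php is a made-up name and the paths follow the example above):

config.php:
<?php
// Private directory that holds the sensitive includes.
define('APP_ROOT', '/home/username/www/example.com/private/');

index.php:
<?php
require_once '/home/username/www/example.com/private/config.php';
require_once APP_ROOT . 'sensitive.php'; // resolves inside the private directory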
If you can't change the document root, then what some frameworks do is use a define to note that the file shouldn't be executed directly. You create a define in any file you want as an entry point to your application, usually just index.php, like so:
if (!defined('SENSITIVE')) {
define('SENSITIVE', 'SENSITIVE');
}
Then, in any sensitive file, you check that it's been set, and exit if it hasn't, since that means the file is being executed directly, and not by your application:
if (!defined('SENSITIVE')) {
die("This file cannot be accessed directly.");
}
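Putting the two halves together (a sketch; the includes folder and the example secret are illustrative):

index.php:
<?php
// The intended entry point defines the constant before loading anything else.
if (!defined('SENSITIVE')) {
    define('SENSITIVE', 'SENSITIVE');
}
require_once __DIR__ . '/includes/sensitive.php';

includes/sensitive.php:
<?php
// Bails out unless an entry point set the define first.
if (!defined('SENSITIVE')) {
    die("This file cannot be accessed directly.");
}
$dbPassword = 'change-me'; // hypothetical secret, safe from direct requests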
Also, make sure that your include files, when publicly accessible (and really, even if not), have a proper extension, such as .php, so that the web server knows to execute them as PHP files, rather than serving them as plain text. Some people use .inc to denote include files, but if the server doesn't recognize them as being handled by PHP, your code will be publicly visible to anyone who cares to look. That's not good! To prevent this, always name your files with a .php extension. If you want to use the .inc style to show your include files, consider using .inc.php instead.

PHP include file extensions?

For required/included files in PHP, is it better to use .inc extensions vs .inc.php vs .php extensions?
Sometimes people use the .inc extension and then do some server configuration to keep .inc files from being accessed via a web browser. This might be good, if done absolutely correctly by a knowledgeable sysadmin, but there's a better way: Any file that's not supposed to be accessed by web users should be kept outside your document root. Once these files are off the web, so to speak, you can use whatever extension you want. .php is definitely a sensible choice for syntax highlighting, general sanity, and so on.
Apache can sometimes (due to bugs or severe crashes) serve .php files as plain text (this happened to me a few times on shared hosting). I think you can use any extension you want as long as you don't store your files in a public folder.
Let's say your site is in /home/user/public_html/
create another folder /home/user/lib_php/
have the files:
(1) .../lib_php/one.class.php with
class one {
//...
}
(2) .../lib_php/two.function.php with
function two() {
//...
}
and you have the main index.php in /public_html
<?php
include_once('../lib_php/one.class.php');
include_once('../lib_php/two.function.php');
$x = 'a';
$b = two($x);
$c = new one();
//etc..
or
<?php
require_once('/home/user/lib_php/the.file.php');
This way you are taking every precaution that the files are not reachable directly, but they can still be used by your scripts.
My personal preference is that anything in the document root is a .php file, to indicate it's directly executable by the web server, and anything that's a library is a .inc file stored in a parallel directory, to indicate it's NOT directly executable.
My standard configuration is
/home/sites/example.com/html/ - anything here is 'safe' to expose if PHP fails and serves up raw code
/home/sites/example.com/inc/ - libraries, config files with passwords (e.g. the database connection class with DB credentials), etc.. Anything that shouldn't be exposed as there's no reason for it.
While you can certainly configure Apache to deny access to .inc files and keep them inside the webroot, you're then depending on Apache to keep you safe. If PHP can fail within Apache and expose your code, then the .inc blocks can ALSO fail and expose your code's innards as well.
Of course, if Apache's coughing blood all over the floor, there's no reason that the directory traversal protection can't fail as well and let someone do http://example.com/../inc/seekritpasswords.txt.
At some point you just have to accept that if something's stored anywhere on the web server, there's a possibility that a failure may allow access to the raw data and expose everything. How much time and effort you want to expend on protecting against that is up to you.
