Where to store sensitive information in a Drupal Module? - php

In a module I'm creating I have some sensitive information I need to store securely: A remote database host, username, and password.
It seems that the only storage available is the Drupal database, which worries me: if Drupal is compromised, so is this other database. The settings.php file in sites/default was my second option, but I'm having trouble writing to it; various chmod commands over FTP and SSH (777, 666) won't make the file writable. I'm also not sure whether variables I set there are available anywhere else.
Are there any other ways to store this information securely?

You're on the right track with settings.php. You can use the $conf array in settings.php to set variables that you can then read in modules with variable_get().
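In sketch form (the mymodule_* variable names here are made up for illustration), settings.php would carry something like:

```php
<?php
// sites/default/settings.php -- the $conf array overrides stored variables.
// The 'mymodule_*' keys are hypothetical; use whatever your module expects.
$conf['mymodule_remote_host'] = 'db.example.com';
$conf['mymodule_remote_user'] = 'app_user';
$conf['mymodule_remote_pass'] = 'secret';
```

The module then reads them back with `variable_get('mymodule_remote_host', 'localhost')`, the second argument being a fallback default.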

Hmmm... this seems like something you shouldn't do in general. Write an API that sits in front of the remote database and access that instead.
If, however, you insist on direct database access: hard-code the host, username, and password in a file, put the file outside your document root, and include it from there. For example, if your document root (i.e. where Drupal's index.php file is) is /www/htdocs, put a file containing the info at something like /www/secure and include it where you need it. Then, even if PHP stops working for some reason, the file isn't in a location readable by the outside world, but PHP can still include it within the site as necessary.
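A minimal sketch of that layout, using the /www/htdocs and /www/secure paths from the example (the file name and array keys are illustrative):

```php
<?php
// /www/secure/db-credentials.php lives OUTSIDE the /www/htdocs docroot
// and contains nothing but:
//
//     <?php
//     return array('host' => 'db.example.com',
//                  'user' => 'app_user',
//                  'pass' => 'secret');
//
// Code inside the docroot pulls it in with a plain require:
$creds = require '/www/secure/db-credentials.php';
$link  = mysqli_connect($creds['host'], $creds['user'], $creds['pass']);
```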
Sure, somebody might see that you are including the file, but they wouldn't be able to see the file itself unless they hacked your server (rather than just Drupal), and in that situation you're pretty much screwed anyway.

Using a config file is ideal for this type of information. However, doing a chmod 777 or 666 is a really bad idea: both of those settings make the file globally readable and writable, so if you are on a shared host, it's possible for another user on the system to access your file. On install, try using PHP's chmod() function to do a chmod 500 on the file (500 should work in most cases; the important part is that the group and world digits are zero).
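For example, in a module's install step (the path here is hypothetical):

```php
<?php
// Tighten permissions on the credentials file at install time.
// 0500 = owner read+execute, nothing for group or world.
// The leading 0 matters: chmod() takes an octal mode, not a decimal one.
$config_file = '/www/secure/db-credentials.php';
if (!chmod($config_file, 0500)) {
  // watchdog() is Drupal's logging call; any logger would do here.
  watchdog('mymodule', 'Could not chmod %file', array('%file' => $config_file));
}
```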

Related

Looking for a secure method to hold private files for a given user

I'm looking for a secure way to allow users to upload files (e.g. PDF) allowing for future access (by that user) while denying access to anyone else.
The user is authenticated using a standard account-creation/login process and their credentials are held in session (using Linux/Apache/MySql/Php).
Questions:
Where should the files be held?
1) I could create a directory for each user (upon account creation) and make the directory name a salted hash. Would that be a secure way to do it?
OR
2) Should I put the uploaded files in a location on the server outside the webroot and move the files to a temp location for display to the user? (then destroy that file and that temp location after the user is done with it).
(assuming choice #1 above) I would plan to create an .htaccess file for each directory with the following:
order deny,allow
deny from all
Would that be sufficient security for the given directory or is more needed?
Platform:
* Shared server using LAMP stack (PHP 7.0+)
Update:
I found a good discussion regarding this issue:
Arguments for and against putting files outside of webroot
So far, the only argument I've found against using .htaccess to protect webroot files is:
Imagine your server defaults for a virtual host are: no PHP, no
.htaccess, allow from all (hardly unusual in a production
environment). If your configuration is somehow reset during a routine
operation – like, say, a panel update – everything will revert to its
default state, and you're exposed.
You didn't directly state it, but since you mention .htaccess, I assume you want to do the file transfers over HTTP, using PHP and Apache?
Where should the files be held?
Have a folder dedicated to user folders, make sure that folder is outside of the web root, and give each individual user their own dedicated folder inside it.
1) I could create a directory for each user (upon account creation) and make the directory name a salted hash. Would that be a secure way to do it?
You said "The user is authenticated using a standard account-creation/login process and their credentials are held in session." I assume each user has their own dedicated ID, usually a unique SQL PRIMARY KEY? If so, that ID is already guaranteed to be unique, so there is no need to make the directory name a salted hash. Just use the user ID as the directory name; that will make everything easier (implementation, debugging, maintenance, disk-usage logistics).
And have a PHP script that does the file indexing and file transfer. When someone requests a file for download, apply strict validation: check that the requested file really is located inside a folder the requester has permission to access. Otherwise you'll get attackers requesting other people's files, or requesting /users-folder/user_1387/../../../../etc/passwd and the like (you can use realpath() + strpos() for such validation).
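A sketch of that validation with realpath() + strpos(); the /srv/users-folder base path and the function name are made up:

```php
<?php
// Resolve a requested file name inside a user's folder, rejecting
// any ../ traversal. Returns the absolute path, or false on failure.
function resolve_user_file($user_id, $requested_name) {
  $base = realpath('/srv/users-folder/' . (int) $user_id);
  $path = realpath($base . '/' . $requested_name);
  // realpath() collapses ../ sequences and returns false for files
  // that don't exist, so an escape attempt resolves outside $base.
  if ($base === false || $path === false) {
    return false;
  }
  if (strpos($path, $base . '/') !== 0) {
    return false; // resolved path is not inside the user's folder
  }
  return $path;
}
```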
And because Apache is MUCH better at file transfer than PHP, once a file has been requested for download AND validated, use X-Sendfile or similar to do the actual transfer; Apache will move the file much more efficiently than PHP. (If for whatever reason you can't use X-Sendfile, check PHP's passthru() function, but it gets more complex if you intend to support streaming/Content-Range and the like, which is difficult and inefficient to do in PHP but easy for Apache.)
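The X-Sendfile handoff itself is small. This sketch assumes Apache's mod_xsendfile is installed and that $path has already been validated:

```php
<?php
// Let Apache stream the (already validated) file instead of PHP.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('X-Sendfile: ' . $path);
exit; // no body from PHP; mod_xsendfile replaces it with the file
```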

Protect file in web root but give access from php

I have a situation where I want to protect a file from public access, but enable read and write from php. The file contains sensitive information like passwords.
The problem is that
I cannot put the file outside the web root (server security restriction on access from php)
I would like to avoid mysql database.
Also, I would try to avoid .htaccess files.
So if I make a folder, say private, in the web root, and do
chmod 700 private
Then, if the file to protect is private/data, I do
chmod 700 private/data
Will this be a safe setup? Can I now read and write the file from PHP while it stays inaccessible to the public?
Is this a safe setup?
PHP runs as the same user as the webserver so if PHP can read it, so can your webserver (and vice versa).
If you don't want to use .htaccess, there is another trick: save the file as a .php file. Even if someone accesses the file from the web, they can't see the source; they might just get a white page or an error, depending on what exactly is in the file.
If you're running suPHP or fastCGI php, you can use a setup similar to what you've described to limit access to files. Otherwise, PHP will use the same user as the web server, and any file PHP can access is also accessible via url.
If you want to keep the restrictions stipulated (which are rather strange), and (I guess) you don't have access to Apache config directives, consider adding the PHP user to some group and giving only that group rights to the file, so that Apache itself cannot read it (as long as it's not running as root/wheel).
Or make it a valid .php file (so PHP is the only thing invoked when the file is requested) that returns nothing or redirects when hit directly. Or just encrypt it.
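The "valid .php file that returns nothing" idea can look like this; the key names are illustrative:

```php
<?php
// private/config.php -- requesting this over HTTP executes it and
// produces no output, so the secrets never reach the browser.
return array(
  'db_host' => 'localhost',
  'db_user' => 'app_user',
  'db_pass' => 'secret',
);
```

Application code then loads it with `$config = require __DIR__ . '/private/config.php';`.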

What is the best way of reading files outside document root?

I was reading some posts about how to include files outside the PHP root (Apache root). I guess just reading a file is an easier task and may be done with the same solution. But I do not intend to put PHP or any other script files outside my document root (currently /Library/WebServer/Documents/); I wish to keep only one root with the usual configuration.
But any file outside the root is not "visible"; it's as if my whole HD consisted of just the root. PHP does not return a permissions error; it says the file doesn't exist (or "is not a directory"). That is good security practice, but it makes my scripts blind! I have a small intranet, and one task I wish to do is read Safari's favorites file (Bookmarks.plist); I also wish to make a photo viewer, etc.
So I just want to read those files. Is there some hack for this?
EDIT: I was using file_get_contents. Following suggestions, I tried include, which stops with a permissions issue (probably an ownership issue). I did a test with a simple file on a Volume (an external HD) and it included just fine. However, I'm thinking about how to deal with the data; I was expecting to read the XML in order to work on it...
EDIT 2: file_get_contents is working with a file on an external HD, so the problem seems to be about file permissions/ownership. The Info window shows the same users for both files: me, staff, and everyone, each with at least read permission. Maybe there is some "hidden" user... any hacker around?
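A small diagnostic sketch may narrow this down; it uses only stock PHP functions, and the path is illustrative (point it at your own Bookmarks.plist):

```php
<?php
// Run this as the web-server user to see why the file is "invisible".
$path = '/Users/me/Library/Safari/Bookmarks.plist'; // adjust to your account
var_dump(file_exists($path));      // false can also mean "not allowed to stat"
var_dump(is_readable($path));
var_dump(ini_get('open_basedir')); // non-empty = PHP is confined to those dirs
if (file_exists($path)) {
  echo substr(sprintf('%o', fileperms($path)), -4), "\n"; // e.g. 0644
}
```

A non-empty open_basedir is a common reason files outside the docroot report "not exists" rather than a permissions error.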

Can I place PHP config files securely in a publicly accessible folder?

GoDaddy does not give FTP root access to my account, meaning I can only access the public_html folder and not the includes folder.
Is there any way I can include the config files in that public folder but somehow make it so only the server can access them in a secure way? How does Wordpress do it?
You could use a .htaccess file to restrict website access.
Take a look at this article.
Just make sure they have a .php extension
(and actually contain PHP code, of course).
WordPress keeps the config file in the main folder. Just make sure it has a .php extension and you don't echo anything from it (I know you won't).
People really can't get at the details inside your PHP file unless you echo something, or the file's chmod is set wrong so that people may be able to actually download it.
As xdazz said, you can also restrict access to your config files, but I think that's just for EXTRA protection; you are still safe without it.
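A wp-config-style sketch of that pattern (the constant names are made up, not WordPress's real ones):

```php
<?php
// config.php inside public_html: direct HTTP requests execute it
// and output nothing, so nothing leaks as long as PHP is being parsed.
define('APP_DB_HOST', 'localhost');
define('APP_DB_USER', 'app_user');
define('APP_DB_PASS', 'secret');
// Optional extra guard: refuse to serve the file when hit directly.
if (realpath(__FILE__) === realpath($_SERVER['SCRIPT_FILENAME'])) {
  http_response_code(404);
  exit;
}
```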

Hiding PHP Files Outside WWW for Security

I've got a "globabVars.php" doc in my own little framework that contains database connection vars etc... I'm thinking would be neat to store outside of the web facing directories to keep it a little more secure. But, then I was thinking, is it really THAT much more secure? I mean, if someone were able to look at my .php files as a whole (without the server processing them) they would be INSIDE my server looking at all my files anyway...
Thoughts?
Moving a config file outside of the web root can prevent the file from being leaked if you accidentally misconfigure Apache. For instance, if you remove Apache's mod_php, then all .php files will be served as plain text. I have seen config files moved outside of the web root on production systems for this reason, and it did stop the file from being leaked! (An admin broke the config during an update, d'oh!) Although this doesn't happen very often.
If an attacker can control the path passed to one of these functions: file_get_contents(), fopen(), readfile(), or fgets(), then he can read any file on your system. You also have to worry about SQL injection; for instance, under MySQL this query can be used to read files: select load_file("/etc/passwd").
To mitigate this issue, remove FILE privileges from the MySQL user account that PHP uses. Also do a chmod -R 500 /path/to/web/root; the last two zeros keep any other account from accessing the files. You should follow it up with a chown -R www-data /path/to/web/root, where www-data is the user account PHP executes as (you can figure this out by doing a <?php system('whoami'); ?>).
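The permission scheme can be rehearsed in a scratch directory first; the chown step is commented out because it needs root (paths and user are illustrative):

```shell
# Demonstrate chmod -R 500: owner read+execute, group/world shut out.
WEBROOT=$(mktemp -d)
touch "$WEBROOT/config.php"
chmod -R 500 "$WEBROOT"
stat -c %a "$WEBROOT/config.php"   # prints: 500
# On the real system, follow up as root with:
#   chown -R www-data /path/to/web/root
chmod -R 700 "$WEBROOT" && rm -rf "$WEBROOT"   # clean up the demo
```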
It means no one can access it via a URL by default.
You can hide it with .htaccess if it is in your docroot, but storing it above the docroot is just that bit safer.
It can still be read via PHP if your application is vulnerable to directory traversal attacks.
Yeah, you are right. There is a very small difference.
