I know that my credentials could be stored in var/ directory but it doesn't feel right to me. How can I add an extra layer of security in order to keep the credentials safe?
Credentials should be stored in a file. This could be a PHP file or a .env file, for example.
Credentials can be stored in plain text, provided the file itself is protected.
Your first line of defense is to store these files in a folder that is not accessible through your web server.
For example:
/home/you/www/public -> public files go here, e.g your index.php
/home/you/www -> all other files, including vendor libs etc, and your config files
(The public folder should be defined as your serving folder in your virtual host configuration)
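As a minimal sketch of that layout, public/index.php can load a config file that sits one level above the public folder. The file is created in a temp directory here only to keep the example self-contained; on a real server it would live at something like /home/you/www/config.php, and the key names are just examples:

```php
<?php
// Sketch: a config.php kept OUTSIDE the public folder, so the web
// server can never serve it directly. Temp dir keeps this runnable;
// in practice the file already exists at e.g. /home/you/www/config.php.
$configDir = sys_get_temp_dir();
file_put_contents(
    $configDir . '/config.php',
    "<?php return ['db_host' => 'localhost', 'db_user' => 'app', 'db_pass' => 'secret'];"
);

// In public/index.php you would write: $config = require __DIR__ . '/../config.php';
$config = require $configDir . '/config.php';
echo $config['db_user'], "\n"; // app
```

Because config.php is plain PHP returning an array, even a misconfigured server that serves it would execute it rather than print its source.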
Your second line of defense is at the system level. For example, maybe you have a dedicated user with a strong password for your specific site. Basically, in this second line of defense you specify who can access your files, who has SSH access, etc.
Your third line of defense is configuring your database correctly. For example, you might only need to allow local connections.
Now, your fourth line of defense is making sure your credentials are never printed in error messages or logs, and especially never in errors shown to web users. Differentiate between development and production status in your scripts.
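One way to sketch that development/production split is an APP_ENV environment variable (the variable name is just a common convention, not a requirement):

```php
<?php
// Sketch: suppress error display in production, but always log.
$isProduction = getenv('APP_ENV') === 'production';

error_reporting(E_ALL);
ini_set('display_errors', $isProduction ? '0' : '1'); // never echo errors to users in prod
ini_set('log_errors', '1');                           // but always record them server-side
```

With display_errors off, a failed database connection can no longer dump its arguments (including the password) into the user's browser.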
And there's more (monitoring incorrect logins, fail2ban, etc.), but these are the basics. Most of them are handled for you when you use a PHP framework like Laravel. Laravel differentiates between public and non-public files, uses .env files, lets you define a production status, and limits error output to users.
I'm working on a project using PHP (Laravel Framework). This project allows logged-in users to store files and folders that only they can access.
I've set up user authentication such that a unique user token is required to access the user's data.
When a user stores/creates a folder, the folder gets stored on the server under a path prefixed with a random 32-character string, such as /WijiIHSIUhoij99U8EJIOJI09DJOIJob/username/folder. That path is stored in a MySQL database for the user to fetch through an API, as long as they have the correct access token for their account. I did this because it seemed more secure; correct me if I'm wrong.
This is half the job (please correct me if what I've done so far is wrong or bad practice).
What I need now is to make sure nobody can access the root folder / and get a list of user folders.
But I fear if I restrict access to the root folder, users will no longer be able to access their folders.
I don't know where to even start looking to find a solution, so anything, even if it's just to get me started will help.
The random string is redundant, as your token already does the randomization for you. If users know that their stuff is stored at /RANDOMSTRING/folder, someone will build a bot to try them all anyhow, but the idea is close to what you want, I think.
Instead, try creating a script that catches all 404s, and checks if they match RANDOM_STRING/user/folder. If so, then you can have the random string in the database match to a universal folder, something like /user_files/username/their_folder. This way, you aren't exposing your folder structure, and you can add a middleware to count how many times they've unsuccessfully tried a URL, then ban their client if it's more than 3, for instance.
In this case, the script delivers the files (or the folder) as a download, or as a screen without the actual folder structure visible. The files can be stored anywhere, and the database correlates the URL entered with an actual file or folder somewhere. (really, we're trying to mitigate someone getting access through a brute force here).
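A minimal sketch of that lookup step, with the database replaced by an in-memory map (the URL shape, the `/user_files` layout, and the function name are all assumptions for illustration):

```php
<?php
// Sketch of the 404-catching script's core logic: parse the requested
// URL, check the random string against the database (here, $map), and
// translate it into the real, never-exposed storage path.
function resolvePath(string $url, array $map): ?string {
    // Expect URLs of the form /RANDOM_32_CHAR_STRING/username/folder
    if (!preg_match('#^/([A-Za-z0-9]{32})/([^/]+)/(.+)$#', $url, $m)) {
        return null; // not one of ours: a genuine 404
    }
    [, $random, $user, $folder] = $m;
    // DB lookup stand-in: does this random string belong to this user?
    if (($map[$random] ?? null) !== $user) {
        return null; // treat as a failed attempt for the ban counter
    }
    return "/user_files/$user/$folder"; // universal storage layout
}
```

A middleware would then increment a failed-attempt counter whenever this returns null, and serve the file (not a redirect) when it succeeds, so the real path never appears in any response.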
From there, I would look into using a different service to store the files. This helps with two things that you can read up on. First, you can choose something like AWS cold storage which will cost you less, and second you can take the storage off the computational server, such that if there is a security breach, you know that they either only had access to the user files, or the code files. This will help you in the long run (although if they gain access to your code files, they will likely be able to locate your storage location and password... this is really to protect your code files).
Beyond that, start looking into read and write permissions and user groups. PHP, the OS, the users, and the admins should all have their own permissions on the storage server to make sure they can't do certain things with the files. Moreover, you don't want someone uploading a malicious file and having it execute.
This is just a primer, but it'll lead you to a LOT of information, I'm sure.
You have to make a distinction between the public and private files.
Public files
Public files can be accessed by, you guessed it, everyone. They are images, logos, documentation... these kinds of files. If you have the URL of one of these files, you can download it.
Private files
Things are a little more complicated for private files. These files should not, in any case, be downloadable without authentication. Said otherwise, only the user that owns these files (and often the admins too) should be able to download them. For instance, these files could be an invoice for an order, or paid media...
For the directory structure, you are free to do as you like. The only important thing is to prevent these files from being publicly accessible, so you MUST NOT put them anywhere in the public folder (or the linked storage/app/public).
By default, Laravel expects these files to be in storage/app. For instance, it could be storage/app/users/1451 for the user with id 1451. It's not important to randomize these paths since these files are not accessible directly (they are NOT in the public path).
So you may ask: how can a user download them? The answer is to create an authenticated route and check for user authorization before sending the file back in the response.
You can do something like:
Route::get('invoices/{invoice}/download', [InvoiceController::class, 'download']);
Then, you apply a policy to this route, where you check that the authenticated user is able to download the file (like you would do for any other authenticated routes).
If they can, perfect: just return the file in the response: return response()->download($path_to_the_file);. If they can't, they will get a 403 Forbidden.
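The route and policy wiring above are Laravel-specific, but stripped of the framework the check itself is just "owner or 403". A framework-free sketch, where canDownload() stands in for the policy and the array stands in for an Invoice model:

```php
<?php
// Sketch: the authorize-then-download logic without the framework.
// canDownload() plays the role of a Laravel policy method.
function canDownload(int $userId, array $invoice): bool {
    return $invoice['owner_id'] === $userId;
}

function downloadResponse(int $userId, array $invoice): array {
    if (!canDownload($userId, $invoice)) {
        return ['status' => 403, 'body' => 'Forbidden'];
    }
    // In Laravel this line would be: return response()->download($invoice['path']);
    return ['status' => 200, 'body' => $invoice['path']];
}
```

The key point is that the file path never appears in a URL; the client only ever sees the route, and the server decides per request whether to stream the file.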
I'm confused about something and need some explanations.
In my practice I normally see that 90% of PHP developers store all sensitive information (database connections, FTP, SMTP setup, etc.) inside some variable, array, object, or constant.
I wonder now: is it better to use an .ini file outside the root and store it there? Or is it better to hide the .ini file somewhere and deny access via .htaccess?
Generally, I want to save that sensitive data in the most secure way.
There is no perfectly safe choice, but some are better than others.
Don't save sensitive information in your project's source code -- you don't want your passwords and API keys on github.
Saving sensitive information in a database is fine, but then you still need somewhere to store the database credentials, and you're right back where you started.
You can save sensitive information in environment variables. These would usually be set up in your web server's configuration file(s).
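A sketch of reading such a variable (the putenv() call here only simulates what the web server would set via, e.g., SetEnv in an Apache vhost or fastcgi_param in nginx; DB_PASSWORD is an assumed name):

```php
<?php
// Simulate the web server's configuration setting the variable.
putenv('DB_PASSWORD=s3cret');

// Application code: read the credential, fail loudly if it's missing.
$dbPass = getenv('DB_PASSWORD');
if ($dbPass === false) {
    throw new RuntimeException('DB_PASSWORD is not set');
}
echo strlen($dbPass), "\n"; // report only the length, never the value
```

Failing fast on a missing variable beats silently connecting with an empty password, and keeping the value out of all output ties in with the error-handling advice above.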
Saving sensitive information in an ini file is fine, provided the following:
The file has the minimal permissions required.
The file is completely outside the web server's document root and thus can't ever be served directly. Don't put the file in your main directory and then use .htaccess to deny access to it.
The file is not committed to source control. If you're using git, edit your .gitignore so that the file is ignored.
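Given those three conditions, reading the ini file is one call. A sketch using a temp file so the example runs anywhere; in practice the file would sit outside the document root at something like /home/you/config/app.ini with tight permissions (e.g. chmod 600):

```php
<?php
// Sketch: write then parse an ini file kept outside the webroot.
$iniPath = sys_get_temp_dir() . '/app.ini';
file_put_contents($iniPath, "[database]\nhost = localhost\nuser = app\npass = secret\n");

$settings = parse_ini_file($iniPath, true); // true = preserve [section] structure
echo $settings['database']['user'], "\n";   // app
```

parse_ini_file() returns false on failure, so production code should check the return value before using it.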
These should also go without saying:
The user account running the web server process should never have write permission to the files it's serving.
Other non-privileged users on the machine running the web server process should not have read access to the files it's serving.
For me, I would suggest storing it in a dot file such as .env (like what Laravel does), in environment variables, or in an INI file (as you said above), as long as it is hidden from the world, so that even if someone hacks your server they won't be able to see or access it easily.
I am using SQL in my app, so I have created a PHP site to manage it.
The problem is that the password required by mysqli_connect is the password to the main server (the whole file manager, MySQL, etc.).
What is the best way to secure my password (sending it from the app, or writing it down explicitly in the PHP source file)?
Other suggestions?
There are many ways, but in general: put it in a file that is outside of the webroot, or a file that is otherwise unreadable from the outside, i.e. a PHP file that gets executed will not expose its source.
If you're using Apache you can store it in a file named .htsomething because Apache, by default, blocks access to any file starting with .ht*
You can store it in a file named secret.txt and block access to it by adding an .htaccess RewriteRule.
For my projects I store settings in a JSON file that is outside of the webroot. One major advantage of this approach is that other applications, like a deploy or monitor tool, can read and easily generate this settings file. And it's also clean, you can't do any programming in JSON.
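A sketch of that JSON approach (temp path only to keep the example runnable; the real file would live outside the webroot, e.g. /home/you/settings.json):

```php
<?php
// Sketch: a JSON settings file that both the PHP app and external
// tools (deploy scripts, monitors) can read and generate.
$jsonPath = sys_get_temp_dir() . '/settings.json';
file_put_contents($jsonPath, json_encode(['db' => ['user' => 'app', 'pass' => 'secret']]));

$settings = json_decode(file_get_contents($jsonPath), true); // true = assoc arrays
echo $settings['db']['user'], "\n"; // app
```

The "no programming in JSON" point is exactly why this is clean: the file can only ever be data, so a tool generating it can't accidentally introduce executable code.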
First up: Make a new account for your application with only the privileges required for your app!
Second: Put the credentials in a file somewhere on your system, locked down as well as possible. There are things you can do to obfuscate the password further, but in the end you will end up with a plaintext something which can be used to work your way back to the database credentials. So to be clear:
Use a database account with the minimum privileges needed to run the app.
Manage security on the config file holding the password as well as possible (only the user running the service and the admin should have access), and keep it outside the webroot.
Just to echo the other answer: the advice by @FritsVanCampen is also dead on.
I have a config.php file for my website that I store outside of the publicly accessible web directory and it contains my password information for a gmail account I use to send mail for the site and my database connection credentials. I don't like the idea of saving the password as a plaintext variable and was looking for some way to have this data more securely saved. Beyond blocking read access to the directory from users other than me, what can I do to secure this information?
You will end up saving it in plain text most of the time. Say, for example, you want to encrypt it; then the key to decrypt it will have to be saved in plain text, and so on. So you're better off making sure your server is secure.
It depends on what you are trying to protect against.
Keeping your file outside the htdocs (webroot) directory is generally a good idea, so that the file can't be requested explicitly from outside (i.e. by pointing the browser at it).
If you want to protect the $var from within your code (i.e. from third-party malicious code) you can always unset the variables after they are consumed, although I don't know if that makes much difference.
If you want to protect the file from someone that might "hack" into your server, there isn't much you can do. You can always set the file permissions so that only www-data (your apache user) can read it but if someone gains root access to your machine, you are pretty much screwed.
Anyway, if your server is safe (no remote root access, or only through SSH with public/private keys; you don't access your server from public PCs, etc.), you don't use third-party code without inspecting it first, and you store the password file outside the webroot directory, I think you're as safe as you can be.
We're making an app using PHP and using some third party services that require a secret API key.
We have a PHP file that contains all those keys definitions that we then import (using require_once) when needed.
Is this approach safe? Should we store the keys in a different place?
Thank you.
Something similar was asked today for a shell script. The answer is valid here as well: Make sure you store the file outside the web root, or (if that's not possible) protect it using a .htaccess file.
I also like to unset() any variables containing sensitive data after use, so not even a full variable dump (e.g. in a debug message) later in that script could reveal it.
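A tiny illustration of that unset() habit (the HMAC call is just a stand-in for whatever actually consumes the key):

```php
<?php
// Sketch: use the secret once, then drop it from the symbol table so a
// later full variable dump (e.g. in a debug handler) cannot reveal it.
$apiKey = 'secret-key';                                // loaded from a protected file in practice
$signature = hash_hmac('sha256', 'payload', $apiKey);  // the one place the key is used
unset($apiKey);

var_dump(isset($apiKey)); // bool(false)
```

This doesn't defend against an attacker with full code execution, but it does shrink the window in which a careless debug dump or error report can leak the value.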
It should be relatively safe as long as the file is not accessible from the web. A lot of sites will place sensitive files outside of the webroot on the server, and simply include them when needed into their app.
I always set the permissions of certificates and other files containing sensitive data so that only my development team and the Apache service can access the file. This is important if you are using a server shared by a large organization, like a university, where lots of people might have permissions on the file by default. Often I've seen read permission given to everyone so that the web server can access the file (since it is neither the owner nor in the group for the file, the only thing left is to give read to "other").
Instead, I ensure there is a group containing only my development team, and set the read/write permissions for the file to that group. I then use an ACL to add a read permission for the Apache service. You have to use an ACL, since the owner and group are normally set to a developer and the development team group, leaving no option for granting Apache access other than an ACL.
Security by obfuscation:
Create your own namespace with functions to encode and decode the key.
Add an auto_prepend_file in php.ini pointing to /usr/share/nginx/outsidehtml/keystorage/83738489384828838227.php.
Run nginx or Apache in a chroot, with SELinux.
Remove the listing permission on the keystorage folder: setfacl -m user:nginx:x /usr/share/nginx/outsidehtml/keystorage/
In php.ini, set disable_classes = ReflectionFunction and disable_functions = opcache_get_status,phpinfo,show_source,ini_get.
For further hardening you can store the key in php.ini as a value, e.g. 123key = secret64.
To test that the folder can no longer be listed:
print_r(glob('/usr/share/nginx/outsidehtml/keystorage/*.*'));