Prevent public access to .env file - PHP

I just found that Laravel 5 may output sensitive data, which can lead to further exploitation of many hosts:
https://www.google.com/search?q=intext%3ADB_PASSWORD+ext%3Aenv&gws_rd=ssl
I want to know how to secure my .env file. Can I use the code below in a .htaccess file to protect my .env file from being viewed in a browser?
# Protect .env
<Files .env>
Order Allow,Deny
Deny from all
</Files>
Will the above code in .htaccess work and protect my .env file?

This isn't a vulnerability, and isn't even remotely an issue provided someone installs Laravel correctly - the webroot is the public folder, not the repository/project root.
The config files and the .env file in Laravel are not contained in the webroot, so you only need to ensure your webroot is path/to/project/public.
The Google query you provided is literally just a bunch of people who didn't read the documentation before installing Laravel.
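For example, a minimal Apache virtual host sketch (the domain and paths here are hypothetical):
# Point the webroot at Laravel's public folder, NOT the project root,
# so .env and config/ are never reachable over HTTP.
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/project/public
    <Directory /var/www/project/public>
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>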

If you execute the config:cache command during your deployment process, you should be sure that you are only calling the env function from within your configuration files. Once the configuration has been cached, the .env file will not be loaded and all calls to the env function will return null.
So on the live server, you can delete the .env file after you execute the config:cache command.
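A sketch of the idea (assuming you really do intend to remove the file, as the quoted docs allow):
php artisan config:cache   # compiles config/*.php into a single cached file; env() now returns null
rm .env                    # the cached config no longer reads from .env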

Hello, you can create a .htaccess file in the same place and add the code below.
# Disable index view
Options -Indexes
# Hide a specific file
<Files .env>
Order allow,deny
Deny from all
</Files>
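Note that Order/Allow/Deny are Apache 2.2-era directives; if you are on Apache 2.4 (with mod_authz_core), the equivalent would be:
# Apache 2.4 syntax
<Files .env>
    Require all denied
</Files>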

IMHO the best way to protect a config file from browsing is to put it outside of the public dir. Protecting it via .htaccess can be deceptive: if something fails, your file becomes publicly available.

I'd like to point out that this solution only helps shield the actual .env file. When debug mode is enabled and the Whoops handler (and possibly other error handlers) is in use, the environment variables will also be shown to the visitor when an error occurs (this can even be a 404).
To sum up what others have said in this thread: an .env file is a security issue if:
You've installed Laravel inside the publicly available directory; this can be public, www or public_html, for instance. Make sure the public folder contained in the Laravel installation is the only folder exposed through the webserver. Alternatively you can protect the .env file using .htaccess, but that's not a real solution.
You've enabled debug mode, and the error handler shows a debug page with all environment variables. Disable debug mode, or configure it so it is only enabled for specific users or IPs. This prevents leaking environment variables on debug pages.
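For the second point, a production .env would typically contain:
APP_ENV=production
APP_DEBUG=false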

This worked for me. Just go into the root folder (using an SSH terminal):
cd ./your_project
Then run:
chmod 700 .env
If you try to access the .env file from its public URL, you will get:
https://your_project.com/.env
Forbidden
You don't have permission to access this resource.

Related

How to secure .env file in Laravel 5.4?

I am working with Laravel 5.4 and I have a problem with the .env and composer.json files. Anyone can access them from any browser and see my database credentials, so please help me protect these files.
You can add the following code to your .htaccess file (make sure your .htaccess file is in the root folder, not in public) to deny access to the .env file:
<FilesMatch "^\.env">
Order allow,deny
Deny from all
</FilesMatch>
Alternatively, simply add the code below to your .htaccess file to block access to the .env and composer.json files:
<Files .env>
Order allow,deny
Deny from all
</Files>
<Files composer.json>
Order allow,deny
Deny from all
</Files>
And add the line below to disable directory browsing:
Options All -Indexes
Remember that once your server is configured to use the public folder as the document root, no one can view the files one level above that folder, which means your .env file is already protected, as well as your entire application. That is the reason the public folder is there: security. The only directories you can see in your browser when the document root is set to the public folder are the folders inside it, like the styles and scripts.
You can run a test like this:
Enter your project directory in the terminal and run:
php -t public -S 127.0.0.1:80
The -t option sets the document root, the directory the PHP built-in web server will serve from. See below:
-t <docroot> Specify document root <docroot> for built-in web server.
Now try to access the .env file, and you will see that you get a 404: the resource was not found.
Of course this is just an example; you will need to configure your real server to do the same.
Nobody can view these files via the browser because the root of your website is located at /public, and the composer.json and .env files are outside of this scope.
The only way to view these files is to actually connect to the web server and go to the corresponding folder.
Make sure .env is in your .gitignore and that you create it locally on your server.
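For reference, a fresh Laravel install's .gitignore already includes entries along these lines:
/vendor
.env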

How can I make .htaccess files in a subfolder not execute?

I am creating a cloud storage project, and I want users to be able to upload any file. In particular, I want people to be able to upload .htaccess files, but I don't want Apache using these files, as this is a security concern. How can I prevent Apache from using the user-uploaded files, while still using my own .htaccess file in a parent folder?
This question is helpful reading. Directives near the www-root are applied first; subfolders are processed later and may overwrite previous settings.
There are some things you can do:
Don't use .htaccess files at all, not even in other directories. If you have a dedicated server, you can edit the server config file, which is much more efficient. It allows you to set AllowOverride None, which prevents Apache from using .htaccess files at all; instead, you can accomplish the same by putting your rules in the server config file (see the sketch after this list). You'll need to restart Apache every time you make a change, and an error in the server config file will prevent Apache from starting until it is fixed.
Store your files under random names without an extension, make it impossible to access any file directly, and instead rely on a database to map a filename to a file. This allows you to store files securely, while not dumping everything into your database.
You cannot put anything in your .htaccess file that would cause .htaccess files in subdirectories to be ignored, because AllowOverride only works in directory context, not in .htaccess context.
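A minimal sketch of that server-config approach, assuming the uploads live under the hypothetical path /var/www/example/uploads:
# In the server config (not in .htaccess); restart Apache after changes.
<Directory "/var/www/example/uploads">
    AllowOverride None    # .htaccess files in here are ignored entirely
    Require all granted
</Directory>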

Website backup.zip File on Hosting Server is safe from hacker or Robot?

Many people put a website backup.zip on their hosting server, with the zip file placed in the same directory where index.php exists.
So anyone who uses this link can download the backup: http://www.example.com/backup.zip
If I don't share my backup filename/link, is it safe from hackers or robots?
Is there any function that reveals all file and directory names?
How can I secure a backup.zip file on a hosting server?
I post this question here because I think developers know best about hacking, robot attacks, and getting directories/files from another server.
There are many ways to protect your files from the eyes of the internet.
The simplest one is to have an index.html, index.htm, or index.php file in the directory that contains your backup.zip, but the file can still be accessed if someone guesses its name and calls it directly, like this: www.example.com/backup.zip
To avoid this issue, most webservers provide a way to protect your files. If we assume you are on Apache2, you must create a rule in a .htaccess file (located in the same directory as your file) to prevent people from accessing your backup.zip.
e.g:
<Files backup.zip>
Order allow,deny
Deny from all
</Files>
If you are not on Apache2, you can find the answer by checking the documentation of your HTTP server.
Your file is not safe as it is, since /backup.zip is the most obvious path a hacker will guess.
So to protect the zip file from unauthorised access, move it to a separate folder and create a .htaccess file there with the following content:
Deny from all
# Turn off all options we don't need.
Options None
Options +FollowSymLinks
To make this work, Apache needs AllowOverride enabled for that folder so the .htaccess file is applied (this is usually configured by default); mod_rewrite is not actually required for these directives.
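Alternatively, you can avoid the problem entirely by moving the backup out of the document root; a sketch, assuming shell access and a hypothetical docroot at /var/www/example/public_html:
# The file can then only be reached over SSH/SFTP, never over HTTP.
mkdir -p /var/www/example/backups
mv /var/www/example/public_html/backup.zip /var/www/example/backups/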

securing outward-facing website db configs

I'm adding some database usage to a public-facing site, and I wanted input on the most secure way to store MySQL connection information. I've come up with a few options:
First, I could store the config in another directory and just set the PHP include path to look in that dir.
Second, I know there are some file types that Apache won't serve to browsers; I could use one of those.
Third, I could store encrypted files on the server and decrypt them with PHP.
Any help would be much appreciated.
Storing the config outside of Apache's document root is a must.
You can also configure Apache to refuse to serve the files via .htaccess.
In the config folder, add a .htaccess with the following:
order allow,deny
deny from all
If you don't want to use .htaccess, as #johua k mentions, instead add
<Directory /home/www/public/config>
order allow,deny
deny from all
</Directory>
to your Apache config.
This will prevent any files in that folder from being served to anyone, which is fine: since PHP doesn't care about .htaccess, you can still just
include('config/db.php')
If you configure your PHP scripts properly, they should never appear in plain text,
so a file like
define('mysql_password', 'pass')
would never display that text.
If you are worried about a shared hosting environment and another user having access to read this file, then you should evaluate the security of the Linux installation and the host. Other users should not have any browsing access to your files, and from the web, files served as PHP should never return their source.
You can explicitly tell Apache never to serve these files, so they would only be include()- or require()-able.
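To illustrate the "outside the document root" approach, a minimal PHP sketch (all paths and names here are hypothetical):
<?php
// /var/www/example/config/db.php -- one level ABOVE the document root,
// so Apache can never serve it directly.
return [
    'host'     => 'localhost',
    'database' => 'mydb',
    'username' => 'dbuser',
    'password' => 'secret',
];

<?php
// /var/www/example/public_html/index.php -- inside the document root;
// PHP reaches the config via the filesystem, not via a URL.
$db  = require __DIR__ . '/../config/db.php';
$pdo = new PDO(
    "mysql:host={$db['host']};dbname={$db['database']};charset=utf8mb4",
    $db['username'],
    $db['password']
);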

Custom Log File with Correct Permissions

I have a processing file for my website's payments. It works just fine, but I would like to log all requests to this page so that if anything throws an error, the raw data is saved and I can process the transaction manually. The processing file uses fopen to write to a log file in another directory.
What I have right now is a separate folder in my root directory with permissions 755, a log file inside with permissions 777, and the processing file (in PHP, if that matters) that writes to the log file set to 777.
This works right now, but the log file is publicly available. I know I can be doing this better and that the permissions aren't correct. How can I do this better?
Put the log file outside the document root. The PHP script that writes to it will still be able to get to it (via the full path), but Apache won't be able to serve it.
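A minimal sketch of that, assuming the payment script lives in public_html and the logs folder sits one level up (names are hypothetical):
<?php
// public_html/process_payment.php
// The log lives OUTSIDE the document root, so Apache cannot serve it,
// but PHP can still reach it via the filesystem path.
$logFile = __DIR__ . '/../logs/payments.log';
$entry   = date('c') . ' ' . json_encode($_POST) . PHP_EOL;
file_put_contents($logFile, $entry, FILE_APPEND | LOCK_EX);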
I came across this whilst searching for the answer myself. I don't believe there is a simple "permissions fix" to do what you want, and perhaps the safest way is to put the log files outside of the public_html directory.
However, this can be a nuisance sometimes, especially if you want to e.g. catch PayPal IPN dump text in a log file but not have it publicly accessible.
In such cases, you can use .htaccess directives to allow writes from the script but deny reads from public access.
For example, this works for me (Apache .htaccess in the root public_html folder):
<FilesMatch "mycustom\.log">
Order allow,deny
Deny from all
</FilesMatch>
And if you have multiple logs you want to protect, use it like this, with pipe-separated filenames:
<FilesMatch "mycustom\.log|ipn_errors\.log">
Order allow,deny
Deny from all
</FilesMatch>
It is worth noting that the above directives are deprecated as of Apache 2.4, and you may wish to consider using more current directives instead: https://httpd.apache.org/docs/2.4/howto/access.html
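For example, a sketch of the Apache 2.4 equivalent of the block above:
<FilesMatch "mycustom\.log|ipn_errors\.log">
    Require all denied
</FilesMatch>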
Hope that helps you!
