I am working with Laravel 5.4 and I have a problem with the .env and composer.json files. Anyone can access them from any browser and see my database credentials, so please help me protect these files.
You can add the following code to your .htaccess file (make sure the .htaccess file is in the root folder, not in public) to deny access to the .env file:
<FilesMatch "^\.env">
Order allow,deny
Deny from all
</FilesMatch>
Simply add the code below to your .htaccess file to restrict access to the .env and composer.json files:
<Files .env>
Order allow,deny
Deny from all
</Files>
<Files composer.json>
Order allow,deny
Deny from all
</Files>
And the line below disables directory browsing:
Options All -Indexes
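Note that Order / Deny from is the old Apache 2.2 authorization syntax. On Apache 2.4 with mod_authz_core, the equivalent deny block would look roughly like this (a sketch, not taken from the answers above):

```apacheconf
# Apache 2.4 equivalent of the Order/Deny block
<FilesMatch "^\.env">
    Require all denied
</FilesMatch>
```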
Remember that once your server is configured to use the public folder as the document root, no one can view the files that live one level above that folder, which means that your .env file is already protected, as well as your entire application. That is the reason the public folder is there: security. The only directories that you can see in your browser if you set the document root to the public folder are the folders that are there, like the styles and scripts.
You can make a test like this:
Enter in your project directory with the terminal and hit this:
php -t public -S 127.0.0.1:80
The -t flag specifies the document root, i.e. the directory the PHP built-in web server will treat as the document root. See below:
-t <docroot> Specify document root <docroot> for built-in web server.
Now try to access the .env file, and you will see that you get a 404 because the resource was not found.
Of course this is just an example; you will need to configure your server to do the same.
Nobody can view these files via the browser because the root of your website is located at /public and the composer.json and .env files are outside of this scope.
The only way to view these files is actually connecting to the web server and going to the corresponding folder.
Make sure it is in your .gitignore and that you create it locally on your server.
I'm trying to implement iOS Universal Links, and I need to serve an apple-app-site-association file at the root of my WordPress site.
How can I serve my apple-app-site-association file with Content-type: "application/pkcs7-mime" in WordPress?
I tried to upload it directly, but of course that didn't work, because I need to change the file's Content-type to "application/pkcs7-mime".
Since the apple-app-site-association file is not a WordPress file, you have to configure the content type at the server level. This is different depending on the environment (Apache vs. nginx, for example), and it can be hard if your host doesn't allow access to low-level configuration.
Apache configuration
Modify the /etc/apache2/sites-available/default-ssl (or equivalent) file to include the snippet:
<Directory /path/to/root/directory/>
...
<Files apple-app-site-association>
Header set Content-type "application/pkcs7-mime"
</Files>
</Directory>
nginx configuration
Modify the /etc/nginx/sites-available/ssl.example.com (or equivalent) file to include the location /apple-app-site-association snippet:
server {
...
location /apple-app-site-association {
default_type application/pkcs7-mime;
}
}
Source: https://gist.github.com/anhar/6d50c023f442fb2437e1#modifying-the-content-type
In theory I believe it is possible to do the Apache configuration via a .htaccess file, but I've never tried.
You may prefer to look into a free hosted deep link service like Branch (full disclosure: I'm in the Branch team) or Firebase Dynamic Links to handle all of this for you.
In case anyone is in the same situation I was where my website is hosted on Bitnami WordPress (e.g. through AWS), your root directory path is /opt/bitnami/apps/wordpress/htdocs. Once you've copied your association file there, the place to make the configuration change for the content type header described in Alex's answer is /opt/bitnami/apps/wordpress/conf/httpd-app.conf. Finally, you'll need to restart Apache for the configuration change to kick in, using the command sudo apachectl -k graceful. You can verify that your setup is correct using this validator tool.
See my post here for more details.
The easiest way to have the apple-app-site-association file delivered with content type application/json or application/pkcs7-mime in Apache is to add an .htaccess file in the same directory with the following contents:
<Files apple-app-site-association>
ForceType application/json
</Files>
or
<Files apple-app-site-association>
ForceType application/pkcs7-mime
</Files>
Then you don't have to add it to your server configuration.
Credit goes to http://redquark.com/wp/?p=209
For AWS / Lightsail
You can simply connect via the SSH extension in VS Code
(just configure your SSH config with the .pem file; it should look like this):
Host bitnami-wordpress
HostName 111.1.1.1 (your external ec2 ip address)
User bitnami
IdentityFile /Users/USERNAME/your-ec2-pem-file.pem
Now just open the /opt/bitnami/wordpress folder,
drag and drop the apple-app-site-association file into this directory,
then open .htaccess and add this at the bottom:
<Files apple-app-site-association>
Header set Content-type "application/pkcs7-mime"
</Files>
TROUBLESHOOTING
If you get a permissions problem saving the file, you can save it under a different name, e.g. 2.htaccess.
Open a new terminal,
then remove the old file and move the new one into place (you'll likely need to change ownership to the appropriate user; for Lightsail/Bitnami that's sudo chown -R bitnami:daemon .htaccess):
rm .htaccess
mv 2.htaccess .htaccess
(WARNING ONLY FOR AWS) sudo chown -R bitnami:daemon .htaccess
sudo apachectl -k graceful
I was able to upload the file after adding the following to wp-config.php file:
define('ALLOW_UNFILTERED_UPLOADS', true);
Once the file was uploaded, I removed that line again for security reasons, and I can still update the contents of the apple-app-site-association file through FTP.
I am using Laravel for a web app. I uploaded everything to production and found out that some of the files can be accessed directly by URL, for example http://example.com/composer.json.
How do I prevent that direct access?
You're using the wrong web server configuration. Point your web server to the public directory and restart it.
For Apache you can use these directives:
DocumentRoot "/path_to_laravel_project/public"
<Directory "/path_to_laravel_project/public">
For nginx, you should change this line:
root /path_to_laravel_project/public;
After doing that, Laravel's internal files will no longer be accessible from the browser.
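For context, a full Apache virtual host along these lines might look roughly like the sketch below (Apache 2.4 syntax; the ServerName and path are placeholders, not values from the answer):

```apacheconf
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot "/path_to_laravel_project/public"

    <Directory "/path_to_laravel_project/public">
        # Let Laravel's public/.htaccess handle the rewrites
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
```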
That is incorrect. composer.json sits outside of the public directory and therefore should not be accessible. This means that your VirtualHost configuration is incorrect.
Please make sure that your path to your directory ends with /public.
Point your web server to a public directory and restart it.
For Apache you can use these directives:
DocumentRoot "/path_to_laravel_project/public"
<Directory "/path_to_laravel_project/public">
You can also deny access to the files in .htaccess:
<Files "composer.json">
Order Allow,Deny
Deny from all
</Files>
For multiple files, you can repeat the Files block above multiple times in the .htaccess file.
Point the web server to the public directory in the project's root folder:
project root folder/public
But if you don't have the public folder and you are already pointing to the root folder, you can deny access by adding the following code to the .htaccess file:
<Files ".env">
Order Allow,Deny
Deny from all
Allow from 127.0.0.1
</Files>
In the code above, we first deny everyone and then allow only the server itself (localhost / 127.0.0.1), and hence the file is protected from outside users.
Set your document root to the public directory, so other files will not be accessible directly. Look for it in your Apache/nginx/other web server configuration files.
It depends on the web server you're running. With Apache it would be .htaccess files, whereas with nginx it would be handled in the server configuration file.
With Apache, you can create a .htaccess file in the root directory of the Laravel project to rewrite all requests to the public/ directory:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteRule ^(.*)$ public/$1 [L]
</IfModule>
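A slightly more defensive variant of the same idea (a sketch, untested here) adds a condition so requests that already target public/ are not rewritten again:

```apacheconf
<IfModule mod_rewrite.c>
    RewriteEngine on
    # Skip URIs already under public/ to avoid rewrite loops
    RewriteCond %{REQUEST_URI} !^/public/
    RewriteRule ^(.*)$ public/$1 [L]
</IfModule>
```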
Simply create an index.php file in the config directory and write whatever message you like in it to inform the visitor,
e.g. "Access forbidden by server".
I am using Laravel 5.1.
I recently uploaded my project to shared hosting, but when I browse http://siteAddress.com/local/.env my .env file is visible.
Is there any way to hide this file, or to redirect people if they try to browse the site in folder view?
Finally, I hid .env and disabled the index view of the folder named local by creating a .htaccess file in the local folder.
Here is the code of that .htaccess:
# Disable index view
Options -Indexes
# Hide a specific file
<Files .env>
Order allow,deny
Deny from all
</Files>
Please create a .htaccess file in the same directory as your .env file and add the code shown below:
# STRONG HTACCESS PROTECTION
<Files ~ "^.*\.([Ee][Nn][Vv])">
order allow,deny
deny from all
satisfy all
</Files>
Then try to hit the .env file from the URL; it will no longer be available or show its contents.
If you also want to remove it from GitHub,
create a .gitignore file in the same directory
and add the line:
.env
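As a shell sketch (run from the repository root; the git rm step is only needed if .env was already committed):

```shell
# Append .env to .gitignore (creates the file if missing)
echo ".env" >> .gitignore

# If .env was committed earlier, untrack it without deleting the local copy:
# git rm --cached .env

# Confirm the entry is present
grep -qx ".env" .gitignore && echo ".env is ignored"
```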
You can add the code below to your .htaccess file to disable directory listing and restrict access to the .env file:
# Disable Directory listing
Options -Indexes
# Block files which need to be hidden; specify the extensions to match
<Files ~ "\.(env|json|config\.js|md|gitignore|gitattributes|lock)$">
Order allow,deny
Deny from all
</Files>
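As a quick sanity check of which filenames that extension pattern catches, here is a small shell sketch using the same regex (the filenames are made-up examples):

```shell
# Test a few filenames against the extension pattern from the .htaccess rule
for f in .env composer.json README.md .gitignore app.css; do
  if echo "$f" | grep -Eq '\.(env|json|config\.js|md|gitignore|gitattributes|lock)$'; then
    echo "blocked: $f"
  else
    echo "allowed: $f"
  fi
done
# Only app.css falls through; the rest match and would be denied
```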
The .env file resides outside the public folder, so it should not be visible to the outside world if the server is configured to use the public folder as the document root.
From the best answer:
Remember that once your server is configured to use the public folder as the document root, no one can view the files that live one level above that folder, which means that your .env file is already protected, as well as your entire application. That is the reason the public folder is there: security. The only directories that you can see in your browser if you set the document root to the public folder are the folders that are there, like the styles and scripts.
https://laracasts.com/discuss/channels/general-discussion/how-do-you-protect-env-file-from-public
Check the folder structure on your hosting and make sure the public folder is the document root.
I just found that Laravel 5 may expose sensitive data, which can lead to further exploitation of many hosts:
https://www.google.com/search?q=intext%3ADB_PASSWORD+ext%3Aenv&gws_rd=ssl
I want to know how to secure my .env file. Can I use the code below in a .htaccess file to protect my .env file from being viewed in a browser?
# Protect .env
<Files .env>
Order Allow,Deny
Deny from all
</Files>
Will my above code in .htaccess work and protect my .env file?
This isn't a vulnerability, and isn't even remotely an issue provided someone installs Laravel correctly - the webroot is the public folder, not the repository/project root.
The config files and .env file in laravel are not contained in the webroot, therefore you only need to ensure your webroot is path/to/project/public.
The google query you provided is literally just a bunch of people who didn't read the documentation before installing Laravel.
If you execute the config:cache command during your deployment process, you should be sure that you are only calling the env function from within your configuration files. Once the configuration has been cached, the .env file will not be loaded and all calls to the env function will return null.
So on the live server, you can delete the .env file after you execute the config:cache command.
Hello, you can create a .htaccess file in the same place and write the code below:
# Disable index view
Options -Indexes
# Hide a specific file
<Files .env>
Order allow,deny
Deny from all
</Files>
IMHO the best way to protect a config file from browsing is to put it outside of the public dir. Protecting it via .htaccess can be deceptive: if something fails, your file becomes publicly available.
I'd like to point out that your solution only helps shield the actual .env file. When debug mode is enabled while using the Whoops handler (and possibly other error handlers as well), the environment variables will also be shown to the visitor when an error occurs (which can even be a 404).
To sum up what others have said in this thread, a .env file is a security issue if:
You've installed Laravel inside the publicly available directory; this can be public, www or public_html, for instance. Make sure the public folder contained in the Laravel installation is the only folder exposed through the web server. Alternatively you can protect the .env file using .htaccess, but that's a workaround rather than an actual solution.
You've enabled debug mode and the error handler shows a debug page with all environment variables. Disable debug mode, or configure it so it is only enabled for specific users or IPs. This prevents environment variables from leaking on debug pages.
This worked for me. Just go into the root folder (using an SSH terminal):
cd ./youre_project
Then run:
chmod 700 .env
If you try to access the .env file from the public URL (https://youre_project.com/.env) you will get:
Forbidden
You don't have permission to access this resource.
I'm loading my files (pdf, doc, flv, etc.) into a buffer and serving them to my users with a script. I need my script to be able to access the files but not allow direct access to them. What's the best way to achieve this? Should I be doing something with permissions, or locking down the directory with .htaccess?
The safest way is to put the files you want kept to yourself outside of the web root directory, like Damien suggested. This works because the web server follows local file system privileges, not its own privileges.
However, there are a lot of hosting companies that only give you access to the web root. To still prevent HTTP requests to the files, put them into a directory by themselves with a .htaccess file that blocks all communication. For example,
Order deny,allow
Deny from all
Your web server, and therefore your server side language, will still be able to read them because the directory's local permissions allow the web server to read and execute the files.
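As a small shell sketch of that setup (the directory name private_files is an arbitrary example):

```shell
# Create a directory for protected files and drop a deny-all
# .htaccess (Apache 2.2 syntax) inside it
mkdir -p private_files
printf 'Order deny,allow\nDeny from all\n' > private_files/.htaccess
cat private_files/.htaccess
```

Your script can still read from private_files/ through the filesystem; only HTTP requests to it are refused.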
This is how I prevented direct URL access to my .ini files. Paste the following code into the .htaccess file in the root (no need to create an extra folder):
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
My settings.ini file is in the root, and without this code it is accessible at www.mydomain.com/settings.ini.
Add this in httpd.conf to block browser and wget access to include files, especially ones like db.inc or config.inc. Note you cannot chain file types in the directive; instead, create multiple Files directives.
<Files ~ "\.inc$">
Order allow,deny
Deny from all
</Files>
To test your config before restarting Apache:
service httpd configtest
then (graceful restart)
service httpd graceful
Are the files on the same server as the PHP script? If so, just keep the files out of the web root and make sure your PHP script has read permissions for wherever they're stored.
If you have access to your httpd.conf file (on Ubuntu it is in the /etc/apache2 directory), you should add the same lines that you would add to the .htaccess file in the specific directory. For example:
ServerName YOURSERVERNAMEHERE
<Directory /var/www/>
AllowOverride None
order deny,allow
Options -Indexes FollowSymLinks
</Directory>
Do this for every directory whose contents you want to control, and you will have one file in one spot to manage all access. In the example above, I did it for the root directory, /var/www.
This option may not be available with outsourced hosting, especially shared hosting. But it is a better option than adding many .htaccess files.
To prevent .ini files from web access put the following into apache2.conf
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
How about a module-detection .htaccess snippet (like the one used in CodeIgniter)? I tried it and it worked well in CodeIgniter apps. Any ideas on using it in other apps?
<IfModule authz_core_module>
Require all denied
</IfModule>
<IfModule !authz_core_module>
Deny from all
</IfModule>
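The same detection trick can be wrapped around a specific file; for example, to protect .env regardless of Apache version (a sketch combining snippets from this thread, untested):

```apacheconf
<Files .env>
    <IfModule authz_core_module>
        # Apache 2.4
        Require all denied
    </IfModule>
    <IfModule !authz_core_module>
        # Apache 2.2
        Order allow,deny
        Deny from all
    </IfModule>
</Files>
```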