Preventing a file from being downloaded or accessed on a server - PHP

I want to be able to prevent people from accessing a file on a server, such as a document, if they were to link to it directly via the URL. This is for security purposes, so that documents on the site can't just be stumbled upon and downloaded...
What is the best approach for this?
I've tried using .htaccess to deny access to .doc and .txt files, for example, but you can still download the files; it only prevents you from accessing the directory... which isn't what I want.
<Files ~ "\.(doc|txt)$">
order allow,deny
deny from all
</Files>

Put it in a directory outside the public space and serve it via a custom PHP page that requires a login, or whatever check you prefer:
echo file_get_contents('/var/www/example.com/file.txt');
That should work, I guess.
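A slightly fuller sketch of that idea (the session key, the file parameter, and the /var/www/private directory are all assumptions; adapt them to your own login mechanism):
<?php
// download.php - gatekeeper sketch; assumes login state is tracked in
// $_SESSION['logged_in'] and files live outside the docroot in /var/www/private
session_start();

if (empty($_SESSION['logged_in'])) {
    http_response_code(403);
    exit('Access denied');
}

// basename() strips any "../" so requests can't escape the private directory
$file = basename(isset($_GET['file']) ? $_GET['file'] : '');
$path = '/var/www/private/' . $file;

if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);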

Try putting this in your .htaccess:
<FilesMatch "\.(doc|txt)$">
Order deny,allow
Deny from all
</FilesMatch>
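Note that on Apache 2.4 and later the Order/Deny directives are deprecated in favour of mod_authz_core; assuming the same extensions, the equivalent block would be:
<FilesMatch "\.(doc|txt)$">
Require all denied
</FilesMatch>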

The best thing to do is to not put it in the web server's document root in the first place.

In your .htaccess you can redirect all requests for files in that folder to a special PHP page that only allows logged-in users to download the file and denies unauthorized access.
It's also a good idea to put the target file itself in a folder above public_html.
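A sketch of that rewrite (the protected/ folder and the download.php gatekeeper script are hypothetical names):
RewriteEngine On
# Route every request for a file under protected/ through the PHP gatekeeper
RewriteRule ^protected/(.*)$ download.php?file=$1 [L,QSA]
download.php can then verify the login session before streaming the file.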


Laravel Deny Direct Access to Files Via URL, but allow download after logged in?

I am using Laravel 5.3, Apache, and PHP 5.6. I need to deny direct access to files via URL, but I need to let users download those files after logging in, via a URL or a general PHP request. I'm not using Auth; I have my own login procedure. First, is it possible to do this?
I have kept my files in
project/public/uploads/{clientid}/sample.txt
I know it's bad practice to store files in the public folder instead of the storage folder, but the project was initially developed that way, and we need to prevent direct access to the files. Is there any way to do it? I have tried editing my .htaccess inside the public folder:
<IfModule mod_rewrite.c>
RewriteEngine on
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
<Files /../index.php>
Order Allow,Deny
Allow from all
</Files>
<FilesMatch ".*\.(css|js)$">
Order Allow,Deny
Allow from all
</FilesMatch>
</IfModule>
Anyhow, this only allows CSS and JS files inside the public folder; it denies all other files kept in the public folder, both via URL and via a general PHP POST download.
Laravel has a better solution: save your files to storage, and you can serve a download with
return response()->download($file);
where $file is the absolute path of the file in storage. You can manage who can download and who can't, and it's secure.
So after you do that, you create one route like this:
Route::get('download/{file}', function ($file) {
    return response()->download(storage_path($file));
});
You can do anything with the route, like adding middleware to authenticate the user, etc.
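For example, a minimal sketch with an auth check attached (the 'check.login' middleware name is hypothetical and would wrap your own login procedure):
Route::get('download/{file}', function ($file) {
    // Only serve files from storage/app; basename() strips any "../" traversal
    return response()->download(storage_path('app/' . basename($file)));
})->middleware('check.login');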

How to deny users access to a file, but not to the plugin which uses the file, using .htaccess?

I have a plugin which is using an xml file located in the plugin folder.
example.com/wp-content/plugins/myplugin/myxml.xml
I want to deny access to the file for users, but not for the plugin. If I type the URL, I can read the file. I used the following in the .htaccess inside my plugin's folder:
<Files ~ "\.xml$">
Order Allow,Deny
Deny from All
</Files>
I get the 403 error, but then the plugin cannot read the file either.
I used Options -Indexes as well.
How can I fix this?
<Files ~ "\.xml$">
# Deny everyone, then allow only requests coming from the local machine
Order Deny,Allow
Deny from all
Allow from localhost
</Files>
This will only work if you place it in the main .htaccess. The file is then not accessible from outside, but it is still accessible from WordPress.
The recommended solution for this issue is to set proper file permissions and user groups, so that the application can access the file but public users can't.
For more information, see the Linux documentation on file permissions.
There are a couple of ways to go about this:
Load the file from the filesystem and not over the network, if possible (a sketch follows at the end of this answer).
Use access control as #Jamie_D has suggested.
His code might not work if example.com doesn't resolve to localhost (check your /etc/hosts). If the file has to be accessed over the public internet, use your public IP.
For reference, here is the documentation for mod_access.
Access can be controlled based on the client hostname, IP address, or
other characteristics of the client request, as captured in
environment variables.
And you could also use authentication for that file.
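A minimal sketch of the first option, assuming the plugin currently fetches the XML over HTTP: read it with a local path instead, so the .htaccess rules never apply to the plugin's own access.
<?php
// Build a filesystem path using WordPress's plugin-directory constant
$path = WP_PLUGIN_DIR . '/myplugin/myxml.xml';

$xml = simplexml_load_file($path); // parses from disk, no HTTP request involved
if ($xml === false) {
    error_log('myplugin: could not parse ' . $path);
}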

Prevent direct URL access to PHP and HTML pages via .htaccess

I know direct URL access can be prevented via .htaccess rules but I don't know much about meta-chars or escaping.
These are the files in my xampp/htdocs folder, accessed using localhost:
index.php
createScheme.php
several other .php files
I want direct access to be enabled only for index.php and createScheme.php; all other pages should be blocked against direct access.
Please help!
This .htaccess file will only allow users to open index.php and createScheme.php. Attempts to access any other files will result in a 403 error.
Order deny,allow
Deny from all
<FilesMatch "^(index|createScheme)\.php$">
Allow from all
</FilesMatch>
If you also want to use authentication for some of the files, you may simply add the content from your current file at the end of my example.

How to block direct requests to php files in specified directory?

I have a master folder where I hold all of my resources such as js, css, images, etc. called /resources/.
In /resources/ I have PHP files that I include on pages of my website; however, they're currently directly accessible if their URL is entered into the browser.
Is there a way for me to use .htaccess in that directory to prevent direct requests to any files of certain extensions such as PHP?
Hotlinking prevention isn't really what I'm looking for. I just need a way to kill any sort of direct request to these PHP files (specifically those in the /resources/ directory) made by anything other than the website itself.
Any help is appreciated. Thank you very much.
Add a .htaccess file in your /resources/ folder containing the following:
<Files ~ "\.php$">
Order allow,deny
Deny from all
</Files>
It should prevent access to all .php files.
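A common complementary guard, in case the .htaccess is ever lost or ignored, is a constant check inside the PHP files themselves (the IN_APP constant name here is arbitrary):
<?php
// Top of every file in /resources/: refuse to run unless the entry
// script defined the guard constant first
if (!defined('IN_APP')) {
    http_response_code(403);
    exit('Direct access forbidden');
}
Pages that legitimately include these files just call define('IN_APP', true) before the include.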

Protect PHP from Invalid Referrer via .HTACCESS?

I'm in a situation where I have file includes, but I don't want other people going into the "includes" directory and viewing the pages individually via the browser.
I'm quite familiar with how to approach it from inside the PHP files themselves, but this time I want to use .htaccess to prevent it.
So how do I configure .htaccess to prevent users NOT coming from a certain referrer from viewing the PHP files inside the "includes" folder?
.htaccess will work, but just to suggest an alternative - why not move your include directory outside the webroot? Your scripts can still access it, and there's no need to configure apache to deny access.
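For instance, a sketch assuming the includes live in /home/site/includes next to a /home/site/public_html webroot (paths are illustrative):
<?php
// From a script inside public_html: reach one level above the webroot,
// where Apache cannot serve files directly
require dirname(__DIR__) . '/includes/header.php';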
Put a .htaccess file in the directory you would like to not be viewed and put this in there:
order allow,deny
deny from all
This is the simple block all approach. More info on how to block by referer can be found here.
Hope this helps.
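For the referer-based variant the question asks about, a sketch assuming your site is example.com (keep in mind the Referer header is easily spoofed, so treat this as obfuscation rather than real security):
# Flag requests whose Referer points back at this site
SetEnvIf Referer "^https?://(www\.)?example\.com/" local_referer
<FilesMatch "\.php$">
Order deny,allow
Deny from all
Allow from env=local_referer
</FilesMatch>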
A lot of web hosting solutions explicitly limit you to working within the public_html (or equivalent) hierarchy. So I use a simple convention: if I want a file or directory to be private -- that is, not accessible through a URI -- then I prefix its name with either a "_" or a ".". For example, my PHP includes directory is called "_includes".
I use this pattern in my .htaccess files to enforce this:
# Flag any URI with a path segment that starts with "_" or "."
SetEnvIf Request_URI "(^_|/_|^\.|/\.)" forbidden
<Files *>
Order allow,deny
Allow from all
Deny from env=forbidden
</Files>
You can use this approach, but modify the regexp to whatever suits your personal convention. One advantage is that it works with this template in your DOCROOT .htaccess file. You don't need to have .htaccess files in the restricted subdirectories.
:-)
