Prevent URL access to files on website - php

I am creating a website with my own CMS. My problem is that I can access certain files via a URL in the browser.
I have tried to block it via .htaccess, but when I do, it also stops my functions from working, because they are blocked as well.
Does anyone know a solution for my problem?

Are your functions in a server-side script or a client-side script?
If they're server-side, you can block HTTP access to the files by putting them in a directory that doesn't need to be accessed through HTTP and then putting a "deny from all" directive in that directory's .htaccess file.
If they're client side, then you can't block access to them and still have the scripts work. The browser is executing the script, and it needs to access those files. You can do hacky things like refusing to serve the file unless a certain referrer URL is present, but I advise against doing that because it can cause problems with usability, caching, and search engines.
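For the server-side case, the .htaccess in that private directory can be as small as this (Apache 2.2 syntax; on Apache 2.4+ the equivalent is "Require all denied"):

```apache
# Block all HTTP access to this directory;
# server-side includes via the filesystem still work.
Order deny,allow
Deny from all
```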

Add this line at the end of your .htaccess to disable directory listings:
Options All -Indexes
Or use a FilesMatch block to deny direct access to particular file types:
<FilesMatch "\.(css|js|png)$">
Order Allow,Deny
Deny from all
</FilesMatch>

Related

Block all files from being accessed directly via htaccess

A security audit highlighted an issue where some files could be accessed directly, e.g. www.domain.com/readme.txt.
No biggie, as nothing on the server contains anything sensitive, but this isn't going to cut it on the audit and we need to patch it.
Is there a way to block ALL files from being accessed directly unless requested through the website, or except for specified file extensions? Hoping it can be achieved via .htaccess?
order allow,deny
<Files ~ "\.(php|html|js|css)$">
allow from all
</Files>
(PHP, HTML, JS and CSS files remain accessible; everything else is denied.)
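On Apache 2.4+, where Order/Allow are deprecated, the same allow-list could be sketched with the Require directive (a sketch, not tested against your setup):

```apache
# Deny everything by default...
Require all denied
# ...then re-allow only the whitelisted extensions
<Files ~ "\.(php|html|js|css)$">
Require all granted
</Files>
```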

Allow access to file ONLY FROM specific PHP file/ dir on same server

I've been pulling my hair out trying to find a solution for denying access to a file for all AJAX/PHP/in-browser requests EXCEPT when they come from one (other) directory/file on the same server.
I have a .htaccess in the directory of the JSON file. The file is fetched by a CMS plugin as follows (pretty standard):
Javascript makes an AJAX call to a PHP file on the server, called handler.php.
handler.php then retrieves the contents of the file and returns it to JS.
I can't use rewrite rules, as mod_rewrite is not a prerequisite for the CMS. I also can't hard-code a domain name, as it is dynamic. Ideally I would do something like Allow from SELF.
I have tried the following .htaccess configs without luck:
Order Deny, Allow
Deny from all
Allow From <ownDomainName/subpath>
Or:
Order Deny, Allow
Deny from all
Allow From <ownDomainName/subpath/fileName.ext>
I tried with setEnv directive too, and a mixture of others involving using <Files>, <Directory> and <Location>. If I just Deny from All the PHP cannot access it. Is there any solution for restricting access to a JSON file?
Also, does it have anything to do with the fact that I am testing on localhost?
There are two ways the file can be accessed:
From the web, where .htaccess rules or server (Apache) config are applied.
By an app or script running on the server machine, where filesystem permissions are applied.
So you can deny all web requests in the .htaccess file, and a PHP script can still access the file via the filesystem (subject to filesystem permissions).
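So one approach here is a deny-all .htaccess next to the JSON file, while handler.php keeps reading it with an ordinary filesystem call such as file_get_contents() (a sketch, assuming Apache 2.2 syntax):

```apache
# .htaccess in the JSON file's directory: block every HTTP request.
# handler.php is unaffected because it reads the file through the
# filesystem, not through the web server.
Order deny,allow
Deny from all
```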

Problems Blocking Access to Single File with .htaccess

So, all I'm attempting to do with this .htaccess file is prevent anybody that isn't the server from being able to view the file e-mails.txt. The server needs access to it for a PHP script (using fopen). Everything I've read says this should work, but it is blocking access to every file in the directory and, from what I can tell, its subdirectories.
<Files e-mails.txt>
Order deny, allow
Deny from all
</Files>
Also, before, when the .htaccess was similar, it wasn't blocking the entire directory, but it was preventing the .php script from functioning properly, which is what caused me to delete it. That fixed the .php script but left e-mails.txt visible to everyone. Then, when I re-created it with the above code, the entire site/directory started spitting out a 500 error.
Maybe you can write a rewrite rule that sends requests for this file to a 404 or 500 error page. This method makes the file impossible to access over HTTP.
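For what it's worth, the 500 error is most likely caused by the space in "Order deny, allow": Apache's Order directive takes a single comma-separated argument, and the stray space makes the whole .htaccess fail to parse. With the space removed, the block should behave as intended:

```apache
<Files "e-mails.txt">
Order deny,allow
Deny from all
</Files>
```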

How can I block direct access to my JavaScript files?

I use Minify to minify and cache all my script requests. I only want my users to be able to access the minified versions of the JavaScript files.
Minify lives at www.example.com/min and my scripts are at www.example.com/scripts. How can I block direct access to doc_root/scripts, which is where my unminified JavaScript files live? I'd rather not move them out of the document root, but it's an option.
Please note that I'm using Zend Framework, so the actual root of my application is shifted to www.example.com/public. An htaccess file handles the rewrite.
Can't you just use an .htaccess file inside doc_root/scripts to prevent all HTTP access to the .js files there?
It won't stop minify, since that provides indirect access.
So in doc_root/scripts/.htaccess, something along the lines of
<Files ~ "\.js$">
order allow,deny
deny from all
</Files>
Note that the location of the .htaccess file matters in this case.
You effectively can't block end-user facing code. Even if you served it with PHP or another server-side language and blocked direct requests, it's of course still possible to read it directly with a number of tools.
You should code with this in mind and be mindful with javascript comments, business knowledge, etc.
UPDATE:
However, if you're talking about code that doesn't ever need to be accessed by an end-user, you could as you mentioned move it out of the server root, or you can block the files in your directory (or an entire directory). It's easy with Apache's .htaccess.
order deny,allow
deny from all
You could also redirect the source files to the minified versions with mod_rewrite in your .htaccess file.
RewriteEngine On
RewriteRule ^scripts/(.*)$ /min/$1 [L,NC]
Depends on the server you're using. Assuming it's Apache, note that <Directory> sections are not allowed in .htaccess files, so instead put a .htaccess inside the scripts directory itself containing:
Order allow,deny
Deny from all
Or something to that effect.
The only way is to check referers, and not everyone sends them, or sends a real one. In other words, you can't block direct access to anyone who really wants something. It's impossible to determine with 100% accuracy if a request is a direct one or is being done via a <script src=....> type request.
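A referer check like that can be sketched with mod_rewrite (example.com is a placeholder for your own domain, and again: referers can be empty or spoofed, so treat this as a speed bump, not security):

```apache
RewriteEngine On
# Forbid direct .js requests whose Referer is not one of our own pages
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.js$ - [F]
```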
For your JavaScript to actually run, the user's browser must ultimately be able to read it.
As such, there's no real way to "block" access to your scripts folder (well, to be precise, you can, but that would break your website, since the browser would no longer be able to fetch the files in order to run them).
One solution could be obfuscation, which makes the JavaScript code harder to read and understand, but ultimately the user will still see the code, and with a bit of persistent reverse engineering it can be de-obfuscated.
Another thing I've seen someone do is create an "empty" js.html page, insert all their JavaScript into script tags on that page (embedded, not external), and from the main page make an AJAX request to js.html and embed it at the bottom of the page. A roundabout way, but the user will not see the JS when viewing the source unless they use developer tools such as Firebug.
Note that the last option might also cause some delay depending on the amount of code you are loading. The key here is not blocking access to your scripts, but just making them harder to obtain, read, and copy.
Edit: oops, misread as well. I think the best solution in this case would be to go with an htaccess file in your scripts folder denying all access
This answer is a little newer than the question (only by several years, but that's nothing).
You cannot simply deny access to your JavaScript files, because then they won't be accessible from <script> tags either.
But I found a workaround:
RewriteEngine On
RewriteRule ^.*\.js$ /invalid.html [R=301,L]
Place it in the .htaccess file in the web root (under htdocs or public_html).
This will automatically redirect anyone who requests a .js URL directly, so they don't see the source.

How to protect PHP files from being downloaded with .htaccess when PHP5 has crashed

Last night I made some admin changes to my web server. I use PHP. The PHP processor failed after the update, and if someone went to my homepage, the PHP page would simply download, showing the proprietary code and passwords to anyone visiting. So I was wondering: is there a way, using .htaccess, to prevent any form of download for PHP files while still allowing normal viewing of the pages?
A good pattern to follow during development is to use a minimal initialization file, which invokes the actual application which resides outside the webroot. That way only a minimal stub with no critical information is exposed in a case like this.
Simplified example:
/
/app
critical_code.php
/webroot
.htaccess <- rewrites all requests to index.php
index.php <- invokes ../app/critical_code.php (or other files as requested)
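The .htaccess mentioned above is typically the standard front-controller rewrite, along these lines (a common sketch, not specific to any framework):

```apache
RewriteEngine On
# Serve real files (css, images, ...) directly...
RewriteCond %{REQUEST_FILENAME} !-f
# ...and route everything else to the minimal stub
RewriteRule ^ index.php [L]
```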
The trouble here is that either .htaccess is serving your files to the user or it's not. You can't tell it to deny access to the .php files, because then access will be denied during normal use, as well. There is no fallback behavior for the PHP processor simply not running correctly.
Maybe it's worth temporarily moving the web root to point to an "under maintenance" site when doing big things like that, to minimize risk as much as possible.
Assuming you're using Apache, your .htaccess file would look something like this.
<FilesMatch "\.php$">
Order allow,deny
Deny from all
Satisfy All
</FilesMatch>
<IfModule php5_module>
<FilesMatch "\.php$">
Allow from all
Satisfy All
</FilesMatch>
</IfModule>
The first rule denies access to all .php files. By default, the user will see a 403 (Forbidden) error.
If the PHP5 module loads successfully, the second rule will take effect, which grants access again.
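On Apache 2.4+, where Order/Satisfy are deprecated, an equivalent guard could be sketched with Require (the module identifier would likewise change with the PHP version, e.g. php7_module):

```apache
<FilesMatch "\.php$">
Require all denied
</FilesMatch>
<IfModule php5_module>
<FilesMatch "\.php$">
Require all granted
</FilesMatch>
</IfModule>
```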
