How can I block direct access to my JavaScript files? - php

I use Minify to minify and cache all my script requests. I only want my users to be able to access the minified versions of the JavaScript files.
Minify lives at www.example.com/min and my scripts are at www.example.com/scripts. How can I block direct access to doc_root/scripts, which is where my unminified JavaScript files live? I'd rather not move them out of the document root, but that is an option.
Please note that I'm using Zend Framework, so the actual root of my application is shifted to www.example.com/public. An htaccess file handles the rewrite.

Can't you just use an .htaccess file inside doc_root/scripts to prevent all HTTP access to the .js files?
It won't stop minify, since that provides indirect access.
So in doc_root/scripts/.htaccess, something along the lines of
<Files ~ "\.js$">
order allow,deny
deny from all
</Files>
Note that the location of the .htaccess file matters in this case.

You effectively can't block end-user facing code. Even if you served it with PHP or another server-side language and blocked direct requests, it's of course still possible to read it directly with a number of tools.
You should code with this in mind and be careful about what ends up in JavaScript comments, business logic, and so on.
UPDATE:
However, if you're talking about code that doesn't ever need to be accessed by an end user, you could, as you mentioned, move it out of the server root, or you can block individual files (or an entire directory). It's easy with Apache's .htaccess.
order deny,allow
deny from all
You could also redirect the source files to the minified versions with mod_rewrite in your .htaccess file.
RewriteEngine On
RewriteRule ^scripts/(.*)$ /min/$1 [L,NC]

Depends on the server you're using. Assuming it's Apache: a <Directory> section isn't allowed inside an .htaccess file, so put something like this in the main server configuration (or a <VirtualHost> block):
<Directory ~ "/scripts">
Order allow,deny
Deny from all
</Directory>
If all you can edit is .htaccess, drop one containing Deny from all into the scripts directory itself. Or something to that effect.

The only way is to check referers, and not everyone sends them, or sends a real one. In other words, you can't block direct access to anyone who really wants something. It's impossible to determine with 100% accuracy if a request is a direct one or is being done via a <script src=....> type request.
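If you still want that speed bump, a minimal referer check with mod_rewrite could look like the sketch below. It assumes the scripts live under /scripts and the site is example.com; since the Referer header is easily spoofed or simply absent, treat this as an inconvenience, not security.
RewriteEngine On
# refuse raw script requests whose Referer is not one of your own pages
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule ^scripts/.*\.js$ - [F]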

For your JavaScript to actually run, the user's browser must ultimately be able to read it.
As such there's no real way to "block" access to your scripts folder (well, to be precise you can, but that would break your website since the browser could no longer fetch the files in order to run them).
One solution could be obfuscation, which makes the JavaScript code harder to read and understand, but ultimately the user will still see the code, and with a bit of persistent reverse engineering it can be de-obfuscated.
Another thing I've seen someone do is create an "empty" js.html page, put all their JavaScript into embedded (not external) script tags on that page, and from the main page make an AJAX request for js.html and embed it at the bottom of the page. A roundabout way of doing it, but the user won't see the JS when viewing the source unless they use developer tools such as Firebug.
Note that the last option might also cause some delay depending on the amount of code you are loading. The key here is not blocking access to your scripts, but just making them harder to obtain, read, and copy.
Edit: oops, misread the question as well. I think the best solution in this case would be to go with an .htaccess file in your scripts folder denying all access.

This answer is a little bit newer than the question (only by several years, that's nothing).
You cannot simply deny all access to your JavaScript files, because then they won't be accessible from the <script> tag either.
But I found a workaround:
RewriteEngine On
RewriteRule ^.*\.js$ /invalid.html [R=301,L]
Place it in the .htaccess file in your web root (under htdocs or public_html).
This will automatically redirect anyone who requests a .js file directly, so they never see the source.
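If the goal is only to hide the unminified sources rather than every .js file on the site, the rule can be scoped to the scripts folder instead (a sketch based on the directory names in the original question):
RewriteEngine On
# only raw files under /scripts are redirected; the minified output served from /min is untouched
RewriteRule ^scripts/.*\.js$ /invalid.html [R=301,L]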

Related

htaccess securing files from being accessed by users

So, I'm sure this question gets asked quite a lot around here, but after a good 20 mins searching, I haven't been able to find any questions that produce the correct result.
I have files, such as /index.php and /dashboard.php which use resource files such as /framework/assets/stylesheets/index.css etc. What I want to do, is block off the access to any files in the /framework/ directory, but still allow usage of them from index.php and the other respective files. How would I configure my .htaccess file to allow me to do this?
I understand that this may not be possible, but is there any way for the directory /framework/, which includes some PHP files, to be hidden from users but still accessible to other PHP scripts using include 'file.php'?
Any help would be very appreciated. Thanks.
Since you want the CSS file to be available only when a specific PHP script runs, you could use an include statement in PHP and embed the stylesheet's contents directly into your HTML.
Something like this:
//...
<head>
<style>
<?php
include("/framework/assets/stylesheets/index.css");
?>
</style>
//...
</head>
//...
PHP includes happen server-side through the filesystem, so .htaccess restrictions on HTTP access don't affect them; you send the client only the parts you choose to embed.
You shouldn't block access to CSS and JavaScript files; if you do, your site's design will break. For include files, try the rules below. Place an .htaccess file containing them in the includes directory you want to forbid access to. These rules allow only POST requests to files in that directory, and one file can still include another while direct GET access to the include file is refused.
<LimitExcept POST>
order deny,allow
deny from all
</LimitExcept>
As Magnus Eriksson stated in the comments to the original question, the effect I wanted for CSS files, JS files, and other resource files cannot be achieved, as those files are fetched directly by the browser and therefore arrive via ordinary HTTP requests.
The PHP files however are able to be protected, by placing a .htaccess file in the /framework/ directory with the content of:
deny from all
To grant access to the resources in /framework/assets/, all that needed to be done was to add another .htaccess file there with allow from all written inside (see the sketch below).
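A minimal sketch of the two files, assuming the layout described in the question:
# /framework/.htaccess -- block direct HTTP requests; server-side include still works
deny from all

# /framework/assets/.htaccess -- re-open the assets subtree so the browser can fetch CSS and JS
allow from all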
Simple answer to a simple question.
Documentation for further use:
https://httpd.apache.org/docs/2.2/howto/access.html

Prevent URL access to files on website

I am creating a website with my own CMS. My problem is that I can access certain files via a URL in the browser.
I have tried to block it via .htaccess, but when I do that it also stops my functions from working, because they are blocked as well.
Does anyone know a solution for my problem?
Are your functions in a server-side script or a client-side script?
If they're server side, you can block HTTP access to the files by putting them in a directory that doesn't need to be accessed through HTTP and then putting a deny from all directive in that directory's htaccess file.
If they're client side, then you can't block access to them and still have the scripts work. The browser is executing the script, and it needs to access those files. You can do hacky things like refusing to serve the file unless a certain referrer URL is present, but I advise against doing that because it can cause problems with usability, caching, and search engines.
Add this line at the end of your .htaccess file to disable directory listings:
Options All -Indexes
or, to keep static assets explicitly reachable while access is denied elsewhere, use
<FilesMatch "\.(css|js|png)$">
Order Allow,Deny
Allow from all
</FilesMatch>

Using .htaccess to block directory breaks css and images

Apologies if this has been asked already. I tried to do an extensive search but I don't know much about htaccess so I don't know which questions have been relevant.
Right now I'm setting up a pretty expansive system with php that requires several pages and functions. To keep things simple and manageable, I have one single file "economy.php" that then requires files from the "/economy/" directory.
I read on another question that the best way to ensure the files within are only accessed from the economy.php file is to use an .htaccess file in /economy with deny from all. This worked, except now the images and stylesheet within the /economy directory don't work.
The solution I can think of is to create a directory /economy/pages/ and throw the php files and htaccess file in there. But that's sloppy, and I'm assuming there's an easier way to handle it.
What's my best course of action?
I think you should look at the Apache Files directive.
So you should have something like this:
<Files ~ "\.(php|htaccess|php5)$">
deny from all
</Files>
Hope this helps ...
Mimiz
The "deny from all" is blocking the user's browser from accessing the CSS and image files. Best bet is to break up the directory structure. Create top level directories for images, css, js, source, etc. Then update the paths in the code accordingly. You can then use an Apache config (or .htaccess which is slower) to deny outside access to the source dir.

How to protect a subfolder which contains PHP pages

I have made a subdomain for my website which I will use to store all the important scripts, both PHP and JavaScript. I want to protect everything there so that it cannot be accessed from a web browser.
I have tried .htpasswd, but when a page is called to do its work, a password is required every time.
You could say the folder is protected, but that stops the scripts from working, because every access requires a password.
Are there better alternatives?
Put the PHP files outside of the web-root, and have the server access/include/require them via the file-path. My own private scripts reside in the 'private' folder, which is in the same directory as the /var/www/ directory, and are accessed via: include '../private/script.php'
This won't work for JavaScript, though (except possibly server-side JavaScript), as it needs to be accessed by the user/client before it can be used. If it can't be accessed it can't be used, which makes blocking it somewhat pointless. To ensure security for JS, don't put anything private into the JavaScript (it's the only way), and then sanitise any input taken from that JavaScript.
You can always use .htaccess's deny from all.
See this article on .htaccess to learn more
I like to use an inclusive IP address range for restricted access. The question is unclear, so I'm not sure if that's what you mean, but this is an example:
RewriteEngine on
RewriteCond %{REMOTE_ADDR} !^XXX\.XXX\.XXX\.XXX$
RewriteRule ^(.*) / [R=302,L]
Add that to a .htaccess file in the folder you'd like to protect, replace XXX.XXX.XXX.XXX with your IP address, and anyone but you will be redirected.
You'd probably want a password as well for very restricted areas.
Edit:
In place of a long comment.
Client-side scripts shouldn't have any greater access when making 'AJAX' requests than any standard request made to a publicly accessible file. It's not easy to help without more info on 'why' you want to do this. Storing your PHP stuff outside of the document root is probably the way to go, but that stuff would then only be accessible from the server side (e.g. PHP).
You could make an XMLHttpRequest to an accessible page, which could in turn access files stored in a non-public location. e.g., either with an absolute path /var/private/, adapted to suit, or by traversing the directory structure with e.g. ../private, meaning one directory higher where your root may be /var/www.

How to protect PHP files with .htaccess from being downloaded when PHP5 has crashed

Last night I made some admin changes to my web server. I use PHP. The PHP processor failed after the update, and if someone went to my homepage, the PHP page would simply download, showing the proprietary code and passwords to anyone visiting. So I was wondering if there is a way to prevent any form of download for PHP files using .htaccess -- but still allow normal viewing of the pages.
A good pattern to follow during development is to use a minimal initialization file, which invokes the actual application which resides outside the webroot. That way only a minimal stub with no critical information is exposed in a case like this.
Simplified example:
/
/app
critical_code.php
/webroot
.htaccess <- rewrites all requests to index.php
index.php <- invokes ../app/critical_code.php (or other files as requested)
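A minimal sketch of that webroot .htaccess, assuming mod_rewrite is enabled:
RewriteEngine On
# route everything that is not an existing file or directory to the front-controller stub
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]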
The trouble here is that either Apache is serving your files to the user or it's not. You can't tell it to deny access to the .php files, because then access would be denied during normal use as well. There is no fallback behavior for the PHP processor simply not running correctly.
Maybe it's worth temporarily moving the web root to point to an "under maintenance" site when doing big things like that, to minimize risk as much as possible.
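Instead of literally moving the web root, a temporary rewrite dropped into the webroot .htaccess gives a similar effect (a sketch; maintenance.html is an assumed static placeholder page):
RewriteEngine On
# send every request except the maintenance page itself to the static notice
RewriteCond %{REQUEST_URI} !^/maintenance\.html$
RewriteRule ^ /maintenance.html [R=302,L]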
Assuming you're using Apache, your .htaccess file would look something like this.
<FilesMatch ".*\.php">
Order allow,deny
Deny from all
Satisfy All
</FilesMatch>
<IfModule php5_module>
<FilesMatch ".*\.php">
Allow from all
Satisfy All
</FilesMatch>
</IfModule>
The first rule denies access to all .php files. By default, the user will see a 403 (Forbidden) error.
If the PHP5 module successfully loads, the second rule will take effect, which grants access.
