I have made a subdomain for my website which I will use to store all the important scripts, both PHP and JavaScript. I want to protect everything there so that it cannot be accessed from a web browser.
I have tried .htpasswd, but when a page is called to do its job, a password is required every time.
So yes, the folder can be protected, but that makes the scripts not work, because accessing them requires a password.
Are there better alternatives?
Put the PHP files outside of the web root and have the server access/include/require them via the file path. My own private scripts reside in a 'private' folder that sits alongside the /var/www/ directory, and are accessed via: include '../private/script.php'
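As a minimal sketch of that layout (the folder and file names here are just illustrative):
<?php
// public entry point, e.g. /var/www/index.php -- reachable over HTTP
// pull in code that lives outside the web root, e.g. /var/private/script.php
require __DIR__ . '/../private/script.php';
// whatever script.php defines (functions, classes, config) is usable here,
// but script.php itself can never be requested directly in a browser
?>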
This won't work for JavaScript, though (except possibly for server-side JavaScript), as it needs to be accessed by the user/client before it can be used. If it can't be accessed it can't be used, which makes blocking it somewhat pointless. To keep JavaScript secure, don't put anything private into it; it's the only way. Then sanitise any input taken from that JavaScript.
You can always use .htaccess's deny from all.
See this article on .htaccess to learn more
I like to restrict access to an allowed IP address (or range). The question is unclear, so I'm not sure if that's what you mean, but this is an example:
RewriteEngine on
RewriteCond %{REMOTE_ADDR} !^XXX\.XXX\.XXX\.XXX
RewriteRule ^(.*) / [R=302,L]
Add that to a .htaccess file in the folder you'd like to protect, replace XXX.XXX.XXX.XXX with your IP address, and anyone but you will be redirected.
You'd probably want a password as well for very restricted areas.
Edit:
In place of a long comment.
Client-side scripts shouldn't have any greater access when making 'AJAX' requests than any standard request made to a publicly accessible file. It's not easy to help without more info on why you want to do this. Storing your PHP stuff outside of the document root is probably the way to go, but that stuff would then only be accessible from the server side (e.g. PHP).
You could make an XMLHttpRequest to an accessible page, which in turn accesses files stored in a non-public location, e.g. via an absolute path such as /var/private/ (adapted to suit), or by traversing the directory structure with something like ../private, meaning one directory above a document root of, say, /var/www.
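As a rough sketch (the endpoint name, paths and data file are made up for illustration), the publicly accessible page might look like this:
<?php
// public file: /var/www/ajax.php -- the URL the XMLHttpRequest targets
// it reads data that lives outside the document root and is not reachable directly
$data = file_get_contents('/var/private/data.json');
header('Content-Type: application/json');
echo $data;
?>
The client only ever sees the output of ajax.php; the private path itself is never exposed to the browser.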
I use a .htaccess file to redirect all requests to the same index.php. From there, I decide what to do based on the URL of the incoming request.
For now, I map the request URL to a directory structure relative to a source folder, so that example.com/user/create maps to source/user/create/index.php. The file located there can handle the request as needed and generate HTML output.
But that doesn't work for the static assets the browser may request. So my idea is to detect whether a request URL ends with a file extension and map it to a directory structure relative to an assets folder, so that example.com/css/default.css would map to assets/css/default.css. But I don't know how to respond with a static file instead of HTML output from PHP.
Normally, this might be done with an additional .htaccess rule, but I'd like to be able to change the assets folder dynamically in code.
How can I send a static file stored on the server's hard drive to the browser client in PHP?
Well, if you really want to, you can just use readfile(). But it would be better to let the web server handle static files if you can; it's more efficient than firing up PHP just to throw a file out the door.
Also, with the PHP approach you'll probably have to set the Content-Type header appropriately per file, which can be annoying. Your web server will already be doing that for you when serving files directly.
Plus there are other things that you may be getting for "free" at the moment that you'll have to consider -- using ob_gzhandler if you want to compress your pages, as Apache may already be doing for static files, say.
So, while it's possible, and I can see reasons why it's sometimes desirable, I'd probably try to make this kind of processing the exception rather than the rule for files that aren't generally dynamic...
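If you do go the PHP route, here is a rough sketch of what the readfile() branch might look like (the assets path, the type map and the URL handling are illustrative, not a complete implementation):
<?php
// resolve the requested URL path against the assets folder
$assetsDir = realpath(__DIR__ . '/assets');
$path      = realpath($assetsDir . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));
// refuse anything that doesn't exist or escapes the assets directory
if ($path === false || strpos($path, $assetsDir) !== 0 || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
// set the Content-Type per extension -- the part the web server normally does for you
$types = array('css' => 'text/css', 'js' => 'application/javascript', 'png' => 'image/png');
$ext   = strtolower(pathinfo($path, PATHINFO_EXTENSION));
header('Content-Type: ' . (isset($types[$ext]) ? $types[$ext] : 'application/octet-stream'));
header('Content-Length: ' . filesize($path));
readfile($path);
?>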
In the .htaccess:
# Exceptions
RewriteCond %{REQUEST_URI} ^/assets.*$
RewriteRule ^(.*)$ - [L,NC]
# Your redirect to index
...
Then put a / at the beginning of the file path:
/assets/path/to/file
I am restyling my own PHP MVC framework.
I have every request handled by /index.php by default, which triggers the MVC process of routing, executing the request and returning a proper view. Each request is routed according to a single 'q' GET parameter, Drupal style, like
/index.php?q=anApplication/aController/theAction/arg1/.../moreArguments
This works pretty well, and makes the clean-URL thing easy via mod_rewrite. OK.
I have a directory tree like this:
/public
|----themeName
|        |----page.tpl
|        |----content.tpl
|        |----etc.
/private
|----sourceDirectory
|----moreSources
What I don't want is for files stored in the private and public directories to be served directly as an HTTP request: I don't want something like
mySrv/public/themeName/page.tpl
to show a dead template, or any resource that is not an image, bypassing my core handler, index.php.
I think I could achieve something with a rewrite configuration like this:
RewriteEngine on
RewriteBase /
RewriteRule ^(.*)$ index.php?q=$1 [QSA,L]
but then the framework would only work on mod_rewrite-enabled hosts, because this rewrites every resource, existing or not.
What I am asking is: is there another way to make EVERY request, for existing and non-existing resources alike, be served by a resource of my choice (such as index.php)?
Thank you
Just store all your templates etc outside of the public_html folder.
This will allow PHP to have access, but the outside world cannot get to the file.
The easiest and most portable way would be to pull everything except your index.php out of the document root. PHP can still include files and do everything else.
I have not tried this, but if you put an index.php outside the old document tree
/app
    new-index.php
/public
/private
    ...
index.php
and then add at the beginning of new-index.php
<?php
chdir('..');
require 'index.php';
?>
and finally reconfigure Apache so that the DocumentRoot actually becomes /app, then everything should work as before -- except that any URLs but '/' stop making sense for Apache, and all can be made to land on a suitable ErrorDocument 404.
Note: "everything should work", except HTTP redirections. You can read a non-interpreted PHP file from the FS, but you can no longer get its interpreted content from, say, Apache on localhost. Also, you ought to verify that any existing code does not make use of the DOCUMENT_ROOT global variable; if necessary you may overwrite it.
Through research, I discovered two common techniques to prevent clients from accessing libraries directly with a browser:
Use .htaccess to keep them out
Define a constant and pass it to included files; the included files then check whether the constant exists.
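The second technique usually ends up looking roughly like this (the constant name is arbitrary):
<?php
// index.php -- the only intended entry point
define('IN_APP', true);
require 'includes/library.php';
?>
<?php
// includes/library.php -- bail out unless reached through index.php
if (!defined('IN_APP')) {
    exit('No direct access allowed.');
}
// ... actual library code ...
?>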
However, just keeping those files out of the document root seems sensible. Is there anything wrong with this approach?
The best thing to do is keep it outside of your docroot. There is no reason to put includes in a directly HTTP-accessible place.
Some shared web hosts are poorly configured and don't have this option, but most do, and you definitely have this choice on your own server or VPS.
To complete Brad's answer, here is how you could organize your folders:
/path/to/project/
public_html/
index.php
includes/
includes.php
Your webserver's root folder would be public_html.
If you can't modify this structure, the only acceptable way is to use a .htaccess (or equivalent) to prevent the includes from being accessed publicly.
If you use an Apache web server, you can deny access to all .inc.php files. You only have to add the following to your Apache vhost config:
<FilesMatch "\.inc\.php$">
    Order allow,deny
    Deny from all
</FilesMatch>
You can still include these files in your PHP code.
If the library files just define classes / functions / whatever, and your server isn't configured in a way that makes it possible to view the source, then nothing would be achieved by requesting the scripts via the web server anyway. That being said, if you can you might as well store them outside of the document root.
I want to make it impossible for users to access certain scripts running on my server.
The best way I can explain this is with a brief description of how the root level of the website works.
ROOT contains 4 scripts:
index.php
home.php
error.php
results.php
In the index file, I include and direct the user to these three files in certain instances.
Now, this causes a bit of a problem, as the index file includes the necessary controllers and any additional scripts before the new script is included and the user's point of view changes to the new "webpage". I have done this as it provides a very quick experience for the user, in that the page load times are very low.
These files have been given the robots noindex meta tag and removed from the sitemap. I want to go one step further and make it impossible for users to access these scripts directly, i.e. by typing http://www.mysite.com/results.php
This is because if they do, they are greeted with an ugly, non-functional page that has not had the layout variables or main CSS stylesheet defined.
Here is a brief outline of index.php:
<?php
ob_start();
$activeHome = 'active';
require_once $_SERVER['DOCUMENT_ROOT'] . '/../../includes/config.php';
require_once($docRoot . '/includes/layout.php');
...
include 'home.php';
?>
This script includes any code that configures the three other scripts, as in:
if (isset($_GET['register']))
{
header('Location: http://www.mysite.com/register/');
exit();
}
And here is the same for home/results/error.php
<?php
....
echo $head1 . $pageDetails . $head2 . $header . $pageContent . $footer;
exit;
?>
In these scripts all of the variables except for pageDetails and pageContent are defined in the layout script, along with the main CSS file.
So as you can see from the setup of this website, I do not have very many options left, unless I restrict these pages with a PHP function, which I presume would be fairly complex and more than likely over my head... I assume it would involve heavy use of global or session variables, which is not something I am all too keen on. An easier way, I thought, would be to do this via the .htaccess file. But I am not all that knowledgeable when it comes to .htaccess commands, and I am not sure this is even possible going that route.
Would anyone with a bit more knowledge on something like this care to offer their opinion, input, or advice?
You have two choices.
1. Move the files outside of the web root - This logically makes more sense. If the files aren't meant to be accessed by the web server they should be outside of the web root. This is the preferred method as long as you (and the web server's user and group) have access to the directory that contains your PHP scripts. It is also preferable because you don't need Apache-specific (or even mod_rewrite-specific) .htaccess rules, so this will be portable to most other server flavors. A sample directory structure might look like this:
/
/www
/index.php
/includes
/home.php
/error.php
/results.php
Just make sure that your web server's user has access to both www and includes, and that your web server configuration allows you to work outside of your web root (most Apache configurations only allow you to work within it).
2. Add .htaccess rules to prevent those files from being accessed - This is less preferable. If you want to be stubborn, or you have a legitimate reason for keeping the files in your web root, this would be the way to do it on Apache:
RewriteEngine On
RewriteRule ^(home\.php|error\.php|results\.php)$ - [F,L]
To prevent users from accessing /register/register.php:
RewriteEngine On
RewriteRule ^register/register\.php$ - [F,L]
If a file should be accessed only from another script, and not directly via HTTP, then don't keep it under the HTTP server's root directory.
I use Minify to minify and cache all my script requests. I only want my users to be able to access the minified versions of the JavaScript files.
Minify lives at www.example.com/min and my scripts are at www.example.com/scripts. How can I block direct access to doc_root/scripts, which is where my unminified JavaScript files live? I'd rather not move them out of the document root, but it's an option.
Please note that I'm using Zend Framework, so the actual root of my application is shifted to www.example.com/public. An htaccess file handles the rewrite.
Can't you just use an .htaccess file inside doc_root/scripts to prevent all HTTP access to the .js files there?
It won't stop minify, since that provides indirect access.
So in doc_root/scripts/.htaccess, something along the lines of
<Files ~ "\.js$">
order allow,deny
deny from all
</Files>
Note that the location of the .htaccess file matters in this case.
You effectively can't block end-user facing code. Even if you served it with PHP or another server-side language and blocked direct requests, it's of course still possible to read it directly with a number of tools.
You should code with this in mind and be mindful of what goes into JavaScript comments, business knowledge, etc.
UPDATE:
However, if you're talking about code that doesn't ever need to be accessed by an end user, you could, as you mentioned, move it out of the server root, or you can block individual files (or an entire directory). It's easy with Apache's .htaccess.
Order deny,allow
Deny from all
You could also redirect the source files to the minified versions with mod_rewrite in your .htaccess file.
RewriteEngine On
RewriteRule ^scripts/(.*)$ /min/$1 [L,NC]
Depends on the server you're using. Assuming it's Apache, you can add something like this to your server configuration (a <Directory> block isn't allowed inside an .htaccess file):
<Directory ~ "/scripts">
    Order allow,deny
    Deny from all
</Directory>
Or something to that effect..
The only way is to check referers, and not everyone sends them, or sends a real one. In other words, you can't block direct access to anyone who really wants something. It's impossible to determine with 100% accuracy if a request is a direct one or is being done via a <script src=....> type request.
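If you want to try it anyway, a referer check is only a few lines, sketched here with a placeholder domain and file path, and bearing in mind it is trivially bypassed since the header is optional and can be faked:
<?php
// only hand out the script when the Referer claims the request came from our own pages
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (strpos($referer, 'http://www.example.com/') !== 0) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
header('Content-Type: application/javascript');
readfile(__DIR__ . '/scripts/example.js');
?>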
For your JavaScript to actually run, the user's browser must ultimately be able to read it.
As such there's no real way to "block" access to your scripts folder (well, to be precise, you can, but that would break your website, since the browser could no longer fetch the files in order to run them).
One solution could be obfuscation, which makes the JavaScript code harder to read and understand, but ultimately the user will still see the code, and with a bit of persevering reverse engineering it can be de-obfuscated.
Another thing I've seen someone do is create an "empty" js.html page, put all their JavaScript into embedded (not external) script tags on that page, and from the main page make an AJAX request to js.html and embed it at the bottom of the page. It's a roundabout way, but the user will not see the JS when viewing the source, unless they use developer tools such as Firebug.
Note that this last option might also cause some delay depending on the amount of code you are loading. The key here is not blocking access to your scripts, but just making them harder to obtain, read, and copy.
Edit: oops, misread as well. I think the best solution in this case would be to go with an .htaccess file in your scripts folder denying all access.
This answer is a little newer than the question (only by several years, but that's nothing).
You cannot simply deny access to the JavaScript files, because then they won't be accessible from <script> tags either.
But I found a workaround:
RewriteEngine On
RewriteRule ^.*\.js$ /invalid.html [R=301,L]
Place it in the .htaccess file in your web root folder (under htdocs or public_html).
This will automatically redirect everyone away from the .js files, so they don't see them.