Last night I made some admin changes to my webserver. I use PHP. The PHP processor failed after the update, and if someone went to my homepage, the PHP page would simply download and show the proprietary code and password to anyone visiting. So I was wondering: is there a way to prevent any form of download for PHP files using .htaccess, while still allowing normal viewing of the pages?
A good pattern to follow during development is to use a minimal initialization file that invokes the actual application, which resides outside the webroot. That way only a minimal stub with no critical information is exposed in a case like this.
Simplified example:
/
  app/
    critical_code.php
  webroot/
    .htaccess    <- rewrites all requests to index.php
    index.php    <- invokes ../app/critical_code.php (or other files as requested)
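A minimal sketch of what those two files could contain (the rewrite rules here are the usual front-controller pattern and are only illustrative):

.htaccess:

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]

index.php:

<?php
// Thin stub: nothing sensitive lives here; it only hands off
// to the real application outside the webroot.
require __DIR__ . '/../app/critical_code.php';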
The trouble here is that either .htaccess is serving your files to the user or it's not. You can't tell it to deny access to the .php files, because then access will be denied during normal use, as well. There is no fallback behavior for the PHP processor simply not running correctly.
Maybe it's worth temporarily moving the web root to point to an "under maintenance" site when doing big things like that, to minimize risk as much as possible.
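For example, a rough sketch of an Apache virtual host during maintenance (the paths are illustrative):

<VirtualHost *:80>
    ServerName example.com
    # Normal operation:
    # DocumentRoot /var/www/webroot
    # During maintenance, serve a static placeholder instead:
    DocumentRoot /var/www/maintenance
</VirtualHost>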
Assuming you're using Apache, your .htaccess file would look something like this.
<FilesMatch ".*\.php">
    Order allow,deny
    Deny from all
    Satisfy All
</FilesMatch>

<IfModule php5_module>
    <FilesMatch ".*\.php">
        Allow from all
        Satisfy All
    </FilesMatch>
</IfModule>
The first rule denies access to all .php files. By default, the user will see a 403 (Forbidden) error.
If the PHP5 module successfully loads, the second rule will take effect, which grants access.
Many people make a backup.zip of their website on their hosting server.
The zip file is placed in the same directory where index.php exists,
so the backup can be downloaded with this link: http://www.example.com/backup.zip
If I don't share my backup filename/link, is it safe from hackers or robots?
Is there any function that would give them all my file and directory names?
How can I secure a backup.zip file on a hosting server?
I'm posting this question here because I think developers know best about
hacking, robot attacks, and how to get the directories and files of another server.
There are many ways to protect your files from the eyes of the internet.
The simplest is to place an index.html or index.php file in the directory that contains your backup.zip, but the file can still be accessed if someone guesses its name and requests it by URL, like this: www.example.com/backup.zip
To avoid this issue, most web servers provide a way to protect your files. Assuming you are running Apache2, you can create a rule in a .htaccess file (located in the same directory as your file) to prevent people from accessing your backup.zip.
For example:
<Files backup.zip>
Order allow,deny
Deny from all
</Files>
If you are not running Apache2, you can find the equivalent by checking the documentation of your HTTP server.
Your file is not safe as it is, as /backup.zip is the most obvious path that hackers can guess.
So, to protect the zip file from unauthorised access, move it to a separate folder and create a .htaccess file there with the following content:
Deny from all
# Turn off all options we don't need.
Options None
Options +FollowSymLinks
To make this work, Apache needs AllowOverride enabled for that folder (for example AllowOverride All) so that the .htaccess file is actually processed, which it usually is by default.
My university has multiple servers which have the same data mirrored across them, so I can access for instance
foo.uni.edu/file.php
bar.uni.edu/file.php
The thing is, not all of the servers have PHP installed, so anyone could download my PHP source files by connecting through a server which doesn't have PHP installed.
Is there a way, possibly with .htaccess, to avoid this? As in, only allow opening the PHP files if PHP is installed on that server?
If it's possible to store files outside of the document root, you could work around the problem by storing all sensitive data outside the docroot. You would then have your publicly accessible scripts use include to access those files.
So, if you upload to /username/public_html, and public_html is your document root (eg, foo.uni.edu/file.php is /username/public_html/file.php), then you would upload to /username/file.php instead and place another script in /username/public_html which merely contains something like include('../file.php');
This is good practice in any case, in case a configuration error on the server ever stops PHP from being parsed.
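For example, the public stub can be as small as this (a sketch based on the layout above; using __DIR__ keeps the include independent of the current working directory):

<?php
// public_html/file.php: thin wrapper around the real script,
// which lives one level above the document root.
require __DIR__ . '/../file.php';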
You could also try using IfModule and FilesMatch to deny access to PHP files if mod_php isn't enabled:
<IfModule !mod_php.c>
    <FilesMatch "\.php$">
        Order Deny,Allow
        Deny from All
    </FilesMatch>
</IfModule>
If this doesn't work, try !mod_php5.c instead.
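Since the exact module name depends on the PHP version and how it was installed, one (hedged) variant is to nest the checks so the files are only blocked when neither of the module names mentioned above is loaded; note that this approach won't help if PHP runs through CGI/FastCGI rather than mod_php:

<IfModule !mod_php.c>
<IfModule !mod_php5.c>
    <FilesMatch "\.php$">
        Order Deny,Allow
        Deny from All
    </FilesMatch>
</IfModule>
</IfModule>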
I'm working on a solution to a problem where users could potentially access images (in this case PDF files) stored in a folder off the server root. Normally, my application validates users through PHP scripts and sessions. What isn't happening right now is preventing non-logged in users from potentially accessing the PDFs.
The solution I'm looking for would (I think) need to be tied in with Apache. I saw an interesting solution using RewriteMap & RewriteRule, however the example involved putting this in an .htaccess file in the PDF directory. Can't do that with Apache (error: RewriteMap not allowed here). I believe the rewrite directives need to go in my httpd.conf, which I have access to.
So the example I found (that resulted in 'rewritemap not allowed here') is here:
RewriteEngine On
RewriteMap auth prg:auth.php
RewriteRule (.*) ${auth:$1}
auth.php just checks PHP session and redirects to a login script if needed.
I'm reading that I have to place this in my httpd.conf. How would I specify that the RewriteMap should only occur on a specific directory (including subdirectories)?
First, make sure you really have to put that directly in httpd.conf. On a Debian system, for instance, you have one file per virtual host (a virtual host usually is one website).
Second, RewriteMap itself is only allowed at the server or virtual-host level, so define the map there and then restrict the rewrite rules that use it to your PDF directory with a <Directory> block like this:
<Directory /full/path/to/your/pdfs>
    RewriteEngine on
    ...
</Directory>
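Putting it together, a rough sketch (the paths are illustrative; the map definition sits outside the <Directory> block):

# In the server config or virtual host, not inside <Directory>:
RewriteMap auth prg:/full/path/to/auth.php

<Directory /full/path/to/your/pdfs>
    RewriteEngine On
    RewriteRule (.*) ${auth:$1}
</Directory>

Keep in mind that a prg: map program is started once with the server and is expected to keep reading lookup keys from stdin and writing one result per line to stdout, so auth.php needs to be written as that kind of long-running loop rather than an ordinary web script.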
I use Minify to minify and cache all my script requests. I only want my users to be able to access the minified versions of the JavaScript files.
Minify lies at www.example.com/min and my scripts are at www.example.com/scripts. How can I block direct access to doc_root/scripts which is where my unminified JavaScript files lie. I'd rather not put them out of the document root but it's an option.
Please note that I'm using Zend Framework, so the actual root of my application is shifted to www.example.com/public. An htaccess file handles the rewrite.
Can't you just use an .htaccess file inside doc_root/scripts to prevent all access to the .js files over HTTP?
It won't stop minify, since that provides indirect access.
So in doc_root/scripts/.htaccess, something along the lines of
<Files ~ "\.js$">
order allow,deny
deny from all
</Files>
Note that the location of the .htaccess file matters in this case.
You effectively can't block end-user facing code. Even if you served it with PHP or another server-side language and blocked direct requests, it's of course still possible to read it directly with a number of tools.
You should code with this in mind and be mindful of what ends up in JavaScript comments, business knowledge, etc.
UPDATE:
However, if you're talking about code that doesn't ever need to be accessed by an end-user, you could as you mentioned move it out of the server root, or you can block the files in your directory (or an entire directory). It's easy with Apache's .htaccess.
order deny,allow
deny from all
You could also redirect the source files to the minified versions with mod_rewrite in your .htaccess file.
RewriteEngine On
RewriteRule ^scripts/(.*)$ /min/$1 [L,NC]
Depends on the server you're using. Assuming it's Apache, you can add something like this to the server configuration (a <Directory> block is not allowed inside .htaccess; from an .htaccess file in the scripts directory itself, a plain Deny from all achieves the same thing):
<Directory ~ "/scripts">
    Order allow,deny
    Deny from all
</Directory>
Or something to that effect..
The only way is to check referers, and not everyone sends them, or sends a real one. In other words, you can't block direct access to anyone who really wants something. It's impossible to determine with 100% accuracy if a request is a direct one or is being done via a <script src=....> type request.
For your JavaScript to actually run, the user's browser must ultimately be able to read it.
As such there's no real way to "block" access to your scripts folder (well, to be precise, you can, but that would break your website, since the browser would no longer be able to fetch the files in order to run them).
One solution could be obfuscation, which makes the JavaScript code harder to read and understand, but ultimately the user will still see the code, and with a bit of persistent reverse engineering it can be de-obfuscated.
Another thing I've seen someone do is create an "empty" js.html page, insert all their JavaScript into script tags in that page (embedded, not external), and from the main page make an AJAX request to js.html and embed it at the bottom of the page. It's a roundabout way, but the user will not see the JS when viewing the source unless they use developer tools such as Firebug.
Note that this option might also cause some delay depending on the amount of code you are loading. The key here is not blocking access to your scripts, but just making them harder to obtain / read / copy.
Edit: oops, I misread as well. I think the best solution in this case would be to go with an .htaccess file in your scripts folder denying all access.
This answer is a little bit newer than the question (only by several years, that's nothing).
You cannot simply deny access to the JavaScript files, because then they won't be accessible from the <script> tag either.
But I found a workaround:
RewriteEngine On
RewriteRule ^.*\.js$ /invalid.html [R=301,L]
Place it in the .htaccess file in the root folder of your web (under htdocs or public_html).
This will automatically redirect everyone away from the .js files, so they don't see them.
I'm working on a site that allows users to purchase digital content and have implemented a method that attempts to serve secure downloads.
I'm using CodeIgniter to force downloads like this:
$file = file_get_contents($path);
force_download("my_file_name.zip", $file);
Of course, I make sure the user has access to the file using a database before serving the download, but I was wondering if there was a way to make these files more secure.
I'm using some 8-10 character keys to build the file paths, so URLs to the files aren't exactly easy to figure out... something like http://mysite.com/as67Hgr/asdo0980/uth89.zip in lieu of http://mysite.com/downloads/my_file.zip.
Also, I'm using .htaccess to deny directory browsing like so: Options All -Indexes.
Other than that... I have no idea what steps to take. I've seen articles suggesting username and password methods using .htaccess, but I don't understand how to bypass the password prompt that would occur using that method.
I was hoping there might be a method where I could send a username and password combination using headers and cUrl (or something similar), but I wouldn't know where to start.
Any help would be hugely appreciated. Thanks in advance!
Make it so the web server does not serve the files under any circumstances, otherwise all the checking is pretty moot. The best way to do that is to put them somewhere outside the webroot. I.e.:
/
  webroot/         <- root web directory, maybe named www or similar
    index.php      <- your app, served normally
    …other serve-able files…
  files/           <- not part of the serve-able webroot dir
    secret_file    <- web server has no access here
Then, if the only way to access them is through your script, it's as secure as you make your script.
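A minimal sketch of such a script, assuming the layout above; has_access() and the paths are hypothetical placeholders for your own session/database check and file locations:

<?php
// has_access() is a hypothetical stand-in for your own access check.
session_start();

$path = '/full/path/to/files/secret_file.zip';  // lives outside the webroot

if (!isset($_SESSION['user_id']) || !has_access($_SESSION['user_id'], $path)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="my_file_name.zip"');
header('Content-Length: ' . filesize($path));
readfile($path);  // streams the file instead of loading it all into memory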
Why not just Deny from all in the .htaccess? Or place the files above the webroot? That would be enough. But your current setup is pretty safe already; why do you think you need any help?
.htaccess should look like this if you want the files to be downloadable only from your localhost. It also removes some handlers that could try to access any of the files, just in case. That way only you have access to them. It's also a good idea to store an index.php file in there that checks the existence of another file and, if it exists, sets the header; if not, exits.
.htaccess file:
<Files *>
    Order Deny,Allow
    Deny from all
    Allow from localhost
</Files>

RemoveHandler .php .php3 .phtml .cgi .fcgi .pl .fpl .shtml