I want to disallow access to all files and folders in a protected directory on my server.
I have used this in the .htaccess file in that directory:
Deny from all
But the problem is I also want to use curl from another website to query one file in this directory:
$curl = curl_init();
$post_data = array();
$post_data['token'] = $token;

$url = 'https://my-website.com/protected-dir/file.php';
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, $post_data);   // actually send the token
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);     // return the response body instead of printing it

$response = curl_exec($curl);
$errno = curl_errno($curl);
curl_close($curl);                                    // close the handle before returning

if ($errno) {
    return $errno;
} else {
    return $response;
}
Now I get error 22 from cURL. I want to protect against direct access to all files and folders in that directory. What are my options?
I can place this directory outside of the web root, but how would I use cURL then? (What would the URL to that file be?)
Other solutions?
The easy answer is to leave the file inside the document root and then just add an allow exception to the .htaccess file, which lists the IP address of the remote server you want to allow hits from:
allow from 1.2.3.4
deny from all
But note this will allow any user on that machine to grab the file. The secure answer is to always put sensitive content outside the document root and never rely on .htaccess to protect you. Then create an authenticated proxy page for any exceptions.
public/
index.php
secret.php
private/
secret_file
Where public is the document root, index.php is your regular site, and secret.php would be something like:
<?php
// run some checks to ensure that you're allowed to do this:
// maybe check source IP
// maybe check for a username and password
// maybe check for a specific bearer token in a header
header('Content-Type: whatever');
readfile('/path/to/private/secret_file');
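As an illustration, the bearer-token variant of that check could look something like this sketch; the token value and header handling are assumptions, not part of the original answer:
<?php // secret.php - a minimal sketch; SECRET_TOKEN and the file path are placeholders
define('SECRET_TOKEN', 'replace-with-a-long-random-string');

// read the Authorization header sent by the remote server
// (depending on the Apache/PHP setup, this header may need to be passed through explicitly)
$auth = isset($_SERVER['HTTP_AUTHORIZATION']) ? $_SERVER['HTTP_AUTHORIZATION'] : '';

if (!hash_equals('Bearer ' . SECRET_TOKEN, $auth)) {
    http_response_code(403);
    exit('Forbidden');
}

header('Content-Type: application/octet-stream');
readfile('/path/to/private/secret_file');
The matching cURL request would add the header with curl_setopt($curl, CURLOPT_HTTPHEADER, array('Authorization: Bearer ...')).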
It would seem (from comments) that this code is part of a WordPress plugin that could potentially be installed on any website, in which case this file needs to be "public".
The file shouldn't be in the protected directory to begin with. However, you can block everything except the file in question using a <FilesMatch> container and a regex/negative lookahead.
For example:
<FilesMatch "^(?!file\.php$).*">
Require all denied
</FilesMatch>
Everything is blocked except for requests to file.php. Note that the Deny directive is formally deprecated; you should be using the corresponding Require directive instead on Apache 2.4+. (But note that you should not mix old and new authentication directives.)
I'm assuming that your script contains the usual validation, for example:
fails early on any non-POST request.
fails on any POST request that does not contain a suitable token parameter.
enforces a "delay" on any failed token requests to prevent brute force attacks.
You could also set a custom User-Agent string as part of the CURL (POST) request and check for this in your script or earlier in .htaccess - but that's just smoke and mirrors and doesn't really offer any additional "protection".
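For illustration, here is a minimal sketch of those validation steps in file.php, combined with the optional User-Agent check; the token constant, delay length, and User-Agent string are assumptions, not values from the question:
<?php // file.php - sketch only; EXPECTED_TOKEN and EXPECTED_UA are placeholder values
define('EXPECTED_TOKEN', 'replace-with-a-long-random-string');
define('EXPECTED_UA', 'MyPluginClient/1.0');

// fail early on any non-POST request
if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    http_response_code(405);
    exit;
}

// optional "smoke and mirrors": check the custom User-Agent
if (!isset($_SERVER['HTTP_USER_AGENT']) || $_SERVER['HTTP_USER_AGENT'] !== EXPECTED_UA) {
    http_response_code(403);
    exit;
}

// fail on any POST request without a valid token, with a small delay to slow brute force
if (!isset($_POST['token']) || !hash_equals(EXPECTED_TOKEN, (string) $_POST['token'])) {
    sleep(2);
    http_response_code(403);
    exit;
}

// ... the real work of the script goes here
On the cURL side, the matching User-Agent would be set with curl_setopt($curl, CURLOPT_USERAGENT, 'MyPluginClient/1.0').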
Related
I am working on a website that requires blocking direct URL access to a directory that holds different types of content (images, CSS files, JS files, etc.). This content will be used on the website on certain occasions, when it is required and only for a few seconds (advertising); therefore the website will need to access this directory, but the client does not want the content of that directory to be available via URL.
I have tried doing it through .htaccess, by validating the user with a password. This keeps visitors out of the directory if they don't have the password; however, this method also blocks the website itself when it needs to use some of the content within this directory.
Thanks for reading.
You might want to put an appropriate .htaccess file inside any folder you want to restrict, and avoid mod_rewrite:
#
# Restrict access by IP address
#
Order Allow,Deny
# localhost
Allow from 127.
Be sure to confirm your host IP and update accordingly if it's not the default localhost
Hi, I'm new to web programming. I'm developing a website with a PHP backend application. I'm using Ubuntu 14.04 Server, Apache, PHP 5.5 and MySQL.
Currently this is my directory structure under /var/www/html:
example.com
app/ # this dir contains the backend
src/ # this contains some common stuff shared between the front and back ends
web/ # this is the 'public directory' which serves the frontend
I have searched a lot about .htaccess, but I can't pin down a definitive solution to secure all .php files which are not inside the web/ directory. Also, I would like to "hide" .php files through URL rewriting (for example, instead of serving mysite.org/accounts.php I would serve mysite.org/accounts, but not just by removing the .php extension; rather by redirecting mysite.org/accounts to a file called, say, youwillneverknowthis.php).
Thanks in advance.
J.
To protect your PHP files, do not put them in the public HTML directory, because it is accessible via the internet. Files that contain private code should be taken out of that directory. Put them in private directories such as /var/www/php/src and /var/www/php/app.
Now you want users to be able to get "page.php" without exposing the sensitive PHP code. One solution is to have a script in the public directory /var/www/html, called gateway.php, that looks in the private directory given by dir for a file named file and executes its contents. The values of $_GET come from the rewrite rule described further below. Before requiring the file, be sure to sanitize the input to prevent a malicious actor from making gateway.php run a bad script. For instance, you can compare the requested file against a whitelist of allowed files and stop execution if there is no match.
<?php #gateway.php
if (empty($_GET['dir']) || empty($_GET['file'])):
    // the dir or file parameter is missing
    http_response_code(400); // bad request
    exit;
else:
    // resolve the real path; realpath() also collapses ../ tricks and returns false if the file does not exist
    $file = realpath("/var/www/php/{$_GET['dir']}/{$_GET['file']}");
    // SAFE_SCRIPTS is a string constant: whitelist of allowed files
    // e.g. define("SAFE_SCRIPTS", '"/var/www/php/src/a.php","/var/www/php/app/b.php"');
    if ($file === false || strpos(SAFE_SCRIPTS, $file) === false):
        // script is not in the whitelist
        http_response_code(403); // Forbidden
        trigger_error("Forbidden script: $file");
        die("Access to this resource is forbidden");
    endif;
    require $file;
endif;
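A stricter variant of that whitelist check, sketched here with a placeholder array of allowed paths, compares against exact entries instead of substrings:
<?php
// Sketch only: the array contents are placeholders for your real private scripts.
$allowedScripts = array(
    '/var/www/php/src/youwillneverknowthis.php',
    '/var/www/php/app/another_private_script.php',
);

$file = realpath("/var/www/php/{$_GET['dir']}/{$_GET['file']}");

if ($file === false || !in_array($file, $allowedScripts, true)) {
    http_response_code(403);
    die("Access to this resource is forbidden");
}

require $file;
An exact in_array() match avoids the edge case where one whitelisted path happens to be a substring of some other string.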
Your next issue was about URL rewriting, so that mysite.org/accounts can be mapped to youwillneverknowthis.php. Place a text file named .htaccess in the public document root /var/www/html and add the following rule, which turns on URL rewriting:
RewriteEngine on
To rewrite a URL, add the following rule (on a single line):
RewriteRule ^accounts$ gateway.php?dir=src&file=youwillneverknowthis.php [QSA,L]
Adjust the dir and file arguments as required. This will internally send the request to gateway.php, which will then read and execute the private code, all behind the scenes. The Query String Append [QSA] flag ensures that any pre-existing query arguments are passed to gateway.php along with the dir and file arguments. The [L] flag tells the rewrite engine to stop processing further rules once it finds a match.
Hope that helps.
Edit: improve security by sanitizing input
I'm using PHP's virtual() function to perform a sub-request to Apache, in order to initiate a file download (the files can be heavy, so I can't use readfile()). The files in question are stored in a non-public directory, since the user's permissions need to be checked before allowing a download.
My project is structured as follows:
project/
.htaccess -----> rewrite everything to index.php
index.php -----> framework + application logic + permissions check
private/
.htaccess -----> deny from all
... -----> private files
The first problem I encountered was that the sub-request generated by virtual() was getting rewritten by the first .htaccess, so the download never started. This was easy to fix, since there is an [NS] (no sub-request) rewrite flag that prevents the rule from being applied to sub-requests:
...
RewriteRule ^ index.php [NS]
...
But I still can't make it work because of the other .htaccess (deny from all), that simply rejects all requests and sub-requests.
The question: is there any way to configure .htaccess to deny access from all, except when the request is actually a sub-request coming from the server itself?
First of all, a very interesting problem. Had to dig a bit to figure it out.
You can utilize apache_setenv function here.
Have this PHP code before virtual call:
apache_setenv('internal', '1'); // sets an Apache var with name internal
virtual ( "/private/file.txt" ); // example sub request
exit;
Now inside /private/.htaccess have this snippet:
Order deny,allow
Deny from all
Allow from env=internal
This will deny all requests except those where the environment variable internal is set to 1. That variable is only set in your PHP code, hence only sub-requests will be allowed and all others will be denied.
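Putting it together with the permissions check from the question, the download branch in index.php might look something like this sketch (the session key, file path, and user_may_download() helper are assumptions):
<?php // sketch of the download branch in index.php; session key, path and helper are placeholders
session_start();

// user_may_download() is a hypothetical helper for the application's permission logic
if (empty($_SESSION['user_id']) || !user_may_download($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Forbidden');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.txt"');

apache_setenv('internal', '1');   // mark this request so /private/.htaccess lets the sub-request through
virtual('/private/file.txt');     // Apache serves the file efficiently via a sub-request
exit;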
I have the below directory structure on my website:
/public_html
/public_html/admin/
/public_html/admin/js/
/public_html/admin/css/ ....
Basically I want to disallow all direct access to files inside the sub-folders of /admin, e.g. not allow access to the JS inside the /js/ folder directly, but allow it to be loaded from my PHP page.
I added an .htaccess file with the below code inside the /js/ folder:
Order deny,allow
Deny from all
So far so good: it won't allow me to access the files via the browser directly.
BUT when I try to access the index.php page, in which files from the /js/ folder are included using the <script> tag, it is not loading up.
So can anyone help me out?
Thanks in advance.
You are not accessing it "from your PHP page". The web server is either serving a request or it isn't. When you load your "PHP page" in a browser, the browser will then go out and request all the Javascript, CSS and image assets linked to from your page. Each of these will be a separate HTTP request to the web server. The web server has no context that this HTTP request is because the asset is linked to from "your PHP page", that's completely irrelevant to it. You can either get the file from the server using an HTTP request, or you can't. And by setting Deny from all, you can't.
You'd have to funnel all requests through a PHP file which checks the login information and only serves the file if the user is properly logged in. E.g. link to scripts.php?file=js/myscript.js and have authentication checking code in scripts.php.
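A minimal sketch of such a scripts.php, assuming session-based login and a whitelist of allowed assets (the session key, asset names, and file location are placeholders):
<?php // scripts.php - sketch; assumes it sits in /public_html/admin/ next to the js/ and css/ folders
session_start();

// only serve assets to logged-in users
if (empty($_SESSION['logged_in'])) {
    http_response_code(403);
    exit;
}

// whitelist of assets that may be served, mapped to their content types
$assets = array(
    'js/myscript.js' => 'application/javascript',
    'css/admin.css'  => 'text/css',
);

$file = isset($_GET['file']) ? $_GET['file'] : '';

if (!isset($assets[$file])) {
    http_response_code(404);
    exit;
}

header('Content-Type: ' . $assets[$file]);
// readfile() reads from the filesystem, so the "Deny from all" in /js/.htaccess does not block it
readfile(__DIR__ . '/' . $file);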
When you restrict a folder like this, you cannot include the files that are in it from your HTML page. It is basically the same request as if the person were accessing the JS directly by URL.
Usually cPanel tools like the following can help:
1. Hotlink Protection
Hotlink protection prevents other websites from directly linking to files (as specified below) on your website. Other sites will still be able to link to any file type that you don't specify below (i.e., HTML files). An example of hotlinking would be using an <img> tag to display an image from your site somewhere else on the Web. The end result is that the other site is stealing your bandwidth.
2. Index Manager
The Index Manager allows you to customize the way a directory will be viewed on the web. You can select between a default style, no indexes, or two types of indexing. If you do not wish for people to be able to see the files in your directory, choose "No Indexing".
1 & 2 are tools from the usual hosting cPanel. They probably write over the Apache conf files (not sure which ones).
However, you should also be aware of the HTTP referer. You could decide, based on this header, when not to serve your resource.
`HTTP referer is an HTTP header field that identifies the address of the webpage (i.e. the URI or IRI) that linked to the resource being requested`
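As a sketch, a referer-based check in PHP could look like the following; note that the Referer header is sent by the client and can be spoofed or omitted, so this is a weak control (the domain name is a placeholder):
<?php // sketch only: referer check; 'my-website.com' is a placeholder domain
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host    = parse_url($referer, PHP_URL_HOST);

if ($host !== 'my-website.com' && $host !== 'www.my-website.com') {
    http_response_code(403);
    exit('Hotlinking is not allowed');
}

// otherwise serve the asset as usual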
I have a script on "domain 1" which performs a simplexml_load_file on an XML file on "domain 2" like so:
$url = 'http://domain2.com/file.xml';
$xml = simplexml_load_file($url);
$xmlEmail = $xml->xpath("//users/user/email");
$userEmail = implode(", ", (array) $xmlEmail);
However, this XML file contains private info about users. I don't want it to be directly accessible. Domain 2 will also be different according to the user so I can't filter by actual domains or IP's I'm afraid.
Is it possible to set permissions to make it non accessible to the public, but somehow still readable via the simplexml call? Any alternative methods perhaps?
Set up HTTP basic authentication for your web server on "domain 2" and change the URL for the XML file like this:
$url = 'http://my_username:my_password@domain2.com/file.xml';
$xml = simplexml_load_file($url);
mod_auth_basic for Apache
Add the following to your .htaccess file, which resides in the same folder as file.xml:
<Files file.xml>
Order Deny,Allow
Deny from all
Allow from domain2.com
</Files>
This will tell your web server to disable access to this file, except if the request comes from domain2.com.
UPDATE:
According to what the question owner said, this XML file will be accessed from different domains. In this case the only reasonable option I can think of is a password. galymzhan already provided the answer, but I will try to extend it and provide a working solution.
Add the following to your .htaccess file, which resides in the same folder as file.xml:
.htaccess
<Files file.xml>
AuthType Basic
AuthName "Protected Access"
AuthUserFile /full/path/to/.htpasswd
Require valid-user
</Files>
Please note, shell access to the server is not required: you can upload the .htpasswd file via FTP and change its permissions so that it is not world-readable (the web server itself still needs read access to check passwords).
More info about password protection, and some examples, can be found here.
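On the "domain 1" side, instead of embedding the credentials in the URL, you could also send the Authorization header via a stream context; this is just a sketch, with placeholder credentials and URL:
<?php // sketch: fetch the protected XML with basic auth via a stream context
$username = 'my_username';   // placeholder
$password = 'my_password';   // placeholder
$url      = 'http://domain2.com/file.xml';

$context = stream_context_create(array(
    'http' => array(
        'header' => 'Authorization: Basic ' . base64_encode("$username:$password"),
    ),
));

$body = file_get_contents($url, false, $context);
$xml  = simplexml_load_string($body);

$xmlEmail  = $xml->xpath("//users/user/email");
$userEmail = implode(", ", (array) $xmlEmail);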