On my site I use a lot of includes. Most of the includes should only be accessible to the web server, not to the rest of the world. So if I include "../include_map/file.php" in a page on my site, it should not be possible for other users in the world to request it by URL ("website.com/include_map/file.php"). Is there a way to protect the directory with the include files so that only the web server can include them?
PHP can include files from anywhere on the server's hard drive, including non-public directories.
For example, if your htdocs is located in /var/www/domain/htdocs/, you can also include files located in /var/www/domain/include_map, while the web server won't be allowed to read from there (if configured properly).
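For illustration, a minimal sketch of such an include, reusing the example paths above (file.php is just a placeholder name):
<?php
// include_map sits next to htdocs, outside the document root,
// so the web server never serves its contents directly
require_once '/var/www/domain/include_map/file.php';

// or, resolved relative to the current script inside htdocs:
require_once __DIR__ . '/../include_map/file.php';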
You can then test whether the directory is exposed by requesting www.yourdomain.com/../include_map/file.php.
If you can still access the file like this, your web server's configuration needs some attention to prevent others from reading your logs and other private files.
Another way is to deny access to the directory via .htaccess or the Apache config. PHP can still include the files, while users can't access them from the internet.
In the Apache config you would do something like:
<Directory /include_map>
Order Deny,Allow
Deny from all
</Directory>
In a .htaccess file you could write:
Order Deny,Allow
Deny from all
The .htaccess file should be located in the directory you want to secure. Consult your server provider to find out which way is best for you. As stated in the comment, you first have to find out whether .htaccess is an option for you.
You could do as zuloo said. If you want this to work regardless of the server configuration, you could use a constant as a guard.
The file including:
define('IS_APP', true);
require_once('some/file/to/include.php');
// your code here
The included file:
if(!defined('IS_APP')) {
die('No direct access');
}
// your code here
I've been pulling my hair out trying to find a solution for denying access to a file for all AJAX/PHP/in-browser requests EXCEPT when they come from one (other) directory/file on the same server.
I have a .htaccess file in the directory of the JSON file. The file is fetched by a CMS plugin as follows (pretty standard):
JavaScript makes an AJAX call to a PHP file on the server, called handler.php.
handler.php then retrieves the contents of the file and returns it to JS.
I can't use rewrite rules, as they are not a prerequisite for the CMS. I also can't set a known domain name, as it is dynamic. Ideally I would do Allow from SELF.
I have tried the following .htaccess configs without luck:
Order Deny, Allow
Deny from all
Allow From <ownDomainName/subpath>
Or:
Order Deny, Allow
Deny from all
Allow From <ownDomainName/subpath/fileName.ext>
I tried the SetEnv directive too, and a mixture of others involving <Files>, <Directory> and <Location>. If I just Deny from all, the PHP cannot access it. Is there any solution for restricting access to a JSON file?
Also, does it have anything to do with the fact that I am testing on localhost?
There are two types of access to the file:
From the web, where .htaccess rules or the server (Apache) config apply.
By an app or script running on the server machine, where filesystem permissions apply.
So you can deny all requests in the .htaccess file, but a PHP script can still access the file via the filesystem (depending on the FS permissions).
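For illustration, a minimal sketch of that second kind of access in handler.php (the JSON path is a placeholder):
<?php
// handler.php: "Deny from all" in .htaccess only governs HTTP requests;
// reading the file through the filesystem is not affected by it
$path = __DIR__ . '/data/settings.json'; // placeholder path
header('Content-Type: application/json');
echo file_get_contents($path);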
Many people make a backup.zip of their website on their hosting server.
The zip file is placed in the same directory where index.php exists.
So if I use this link, my backup will download: http://www.example.com/backup.zip
If I don't share my backup filename/link, is it safe from hackers or robots?
Is there any function that would give someone all my file and directory names?
How do I secure the backup.zip file on my hosting server?
I post this question here because I think developers know best about
hacking / robot attacks / getting directory and file listings from another server.
There are many ways to protect your files from the eyes of the internet.
The simplest one is to have an index.html or index.php file in the directory that contains your backup.zip, but the file can still be accessed if someone guesses its name and requests it by URL, like this: www.example.com/backup.zip
To avoid this issue, most web servers provide a way to protect your files. If we assume you are under Apache2, you must create a rule in a .htaccess file (located in the same directory as your file) to prevent people from accessing your backup.zip.
e.g.:
<Files backup.zip>
Order allow,deny
Deny from all
</Files>
If you are not under Apache2, you can find the answer by checking the documentation of your HTTP server.
Your file is not safe as it is, as /backup.zip is the most obvious path that hackers will guess.
So to protect the zip file from unauthorised access, move it to a separate folder and create a .htaccess file there with the following content:
Deny from all
# Turn off all options we don't need.
Options None
Options +FollowSymLinks
To make this work, your Apache needs AllowOverride enabled for that folder so the .htaccess file is honoured (which it usually is by default).
I have several folders in my website directory which contain text files called "secure.txt".
For security reasons the URLs of these files are never shown in the web browser, but my website's PHP code reads these files, which contain sensitive information.
How can I let the PHP code read these files while blocking access through the URL, so that a potential hacker wouldn't be able to read their contents?
Put them outside your document root folder, or place the following .htaccess file in the folder you want to protect. If you don't want to grant access to a particular IP range, remove the last line.
Order Deny,Allow
deny from all
allow from 192.168.0
[EDIT:]
To allow PHP scripts etc. (requests coming from the server itself), allow localhost (127.0.0.1):
Order Deny,Allow
deny from all
allow from 127.0.0.1
You should put them in another folder and make the .htaccess deny from all, allow from 127.0.0.1
Old trick for that: prefix the files with <?php die("I'm afraid I can't do that, Jim"); ?> and call them *.php. On parsing, ignore the prefix.
Edit
Why do this? The rationale is that you avoid a dependency on special web server configuration, which can be forgotten (when moving to a new server), unavailable (many cheap hosts don't give you .htaccess), or not applicable to some web server software (IIS), etc.
So the reasoning is to trade some computational overhead against flexibility, portability and platform independence.
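For illustration, a minimal sketch of the parsing side (the file name and the exact prefix are illustrative):
<?php
// the data file is stored as secure.php with the guard prefix prepended,
// so requesting it over HTTP just executes the die() and reveals nothing
$guard = '<?php die("I\'m afraid I can\'t do that, Jim"); ?>';
$raw   = file_get_contents(__DIR__ . '/secure.php');
$data  = substr($raw, strlen($guard)); // ignore the prefix, keep the payload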
Can you move them out of your website directory altogether? If so, then make sure PHP has access to that directory! (The open_basedir value will need to include it.)
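A quick way to verify from PHP that the directory is reachable (the paths are hypothetical):
<?php
// if open_basedir does not cover the private directory, reads will fail
// even when the filesystem permissions themselves are fine
var_dump(ini_get('open_basedir'));                          // e.g. "/var/www/domain"
var_dump(is_readable('/var/www/domain/private/secure.txt'));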
I'd suggest moving the files out of the webroot, to be on the safe side.
If you use Apache, deny access to all files named secure.txt from within httpd.conf:
<Files secure.txt>
deny from all
</Files>
You may do the same via .htaccess files as well (if your server is configured to allow overriding access rights from .htaccess).
But a better solution would be to include the sensitive information into your PHP scripts themselves, or to store it in a database.
I have all of my database credentials within an include file that I wanted to place outside of my webroot.
However, my shared hosting plan does not allow me to place files outside of the webroot. Would I have to look at encrypting my file in some way to make sure my credentials are secure?
I had read about a method that produces a kind of fake 404 page, but that doesn't sound very secure to me at all.
I've also taken the step of creating a read-only user account so that if my account is compromised then at least nothing can be overwritten or dropped, but I obviously want to be as secure as I can given the limitations.
You can't.
The best you can do is create a PHP file which will be interpreted by the hosting service:
<?php
$DB_USER = 'your_user';
$DB_PASS = 'your_pass';
$DB_INSTANCE = 'your_instance';
When someone accesses your file from a web browser, they won't see anything, because the server executes the PHP instead of displaying it. When you need the credentials, just include the file.
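A hypothetical consumer, assuming the snippet above is saved as db_config.php next to it:
<?php
// pull in the credentials; the file itself never renders in a browser
require_once __DIR__ . '/db_config.php';

// host and database name here are examples
$pdo = new PDO("mysql:host=localhost;dbname=$DB_INSTANCE", $DB_USER, $DB_PASS);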
You could also add a .htaccess rule so that no one using a web browser can access the file directly at all.
Sadly, someone who has read access on the same physical host as you will still be able to read this file, and there is no way to prevent that.
If the server is running Apache and you are allowed to override directives, this can be achieved by creating a .htaccess file in the webroot with the following lines. Be sure to replace <FILENAME> (including the <>) with the name of the file you would like to deny access to.
#Deny access to the .htaccess file
<Files ~ "^\.htaccess">
Order allow,deny
Deny from all
</Files>
#Deny the database file
<Files ~ "^\<FILENAME>$">
Order allow,deny
Deny from all
</Files>
I'm in a situation wherein I have file includes, but I don't want other people going into the "includes" directory and viewing the pages individually via the browser.
I'm quite familiar with how to approach it via inside the PHP files themselves but I want to use the .htaccess for preventing it this time.
So how do I configure .htaccess to prevent users NOT coming from a certain referrer from viewing the PHP files inside the "includes" folder?
.htaccess will work, but just to suggest an alternative: why not move your include directory outside the webroot? Your scripts can still access it, and there's no need to configure Apache to deny access.
Put a .htaccess file in the directory you would like to not be viewed and put this in there:
Order allow,deny
deny from all
This is the simple block all approach. More info on how to block by referer can be found here.
Hope this helps.
A lot of web hosting solutions explicitly limit you to working within the public_html (or equivalent) hierarchy. So I use a simple convention: if I want a file or directory to be private, that is, not accessible through a URI, I prefix its name with either a "_" or a "."; for example, my PHP includes directory is called "_includes".
I use this pattern in my .htaccess files to enforce this:
SetEnvIf Request_URI "(^_|/_|^\.|/\.)" forbidden
<Files *>
Order allow,deny
Allow from all
Deny from env=forbidden
</Files>
You can use this approach, modifying the regexp to whatever suits your personal convention. One advantage is that a single copy of this template in your DOCROOT .htaccess file covers everything; you don't need .htaccess files in the restricted subdirectories.
:-)