My PHP website has multiple PHP files. Some of them are user interface pages and some are helper files (files that talk to the database and to each other to return a result). Now I need to make sure a user can't execute the helper files via their direct URLs.
e.g. mydomain.com/login.php ---------- (Interface file, must be accessible to user)
mydomain.com/login_handle.php ---- (Helper file, must not be accessible to user)
So what I need is that the user can execute and browse mydomain.com/login.php but must not be able to execute mydomain.com/login_handle.php, while login.php and login_handle.php keep communicating and can access each other. Thanks,
Edit: Sorry but I'm using shared hosting and there is no folder other than public_html.
The first things I would attempt:
Move the included files outside of the document root
Move the included files inside another folder and protect it using .htaccess. Alternatively, rename your include files to end with .inc and create a rule based on that.
Make sure the included files don't output anything; this is not really secure, but if your file only contains functions, class definitions, etc. without producing any output, it would just show an empty page.
A hackish approach to this can be accomplished by using constants:
index.php
<?php
define('MY_CONSTANT', '123');
include('helper.php');
helper.php
<?php
if (!defined('MY_CONSTANT')) { exit; }
// we were called from another file
// proceed
Edit
Approach number 2 from above can be done as follows:
Create a folder underneath public_html, e.g. includes/
Move all the files that should only be included (never accessed directly) into this folder
Add the following .htaccess inside:
<FilesMatch "\.php$">
Order allow,deny
Deny from all
</FilesMatch>
Try using .htaccess.
Instead of the 127.0.0.1 IP below, you need to put your server's IP address.
<Files login_handle.php>
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Files>
How do I deny direct access to website files when a user from outside tries to open them by URL, while still allowing the website files to access each other via include 'file.php' in PHP files?
I have a .htaccess file. Can I do this with it?
You do this by having the PHP files outside of your webroot. Your scripts that actually need to be accessible need to be in or beneath your webroot. You typically see PHP projects these days that have a structure like this:
/projectname
/app
/src
/web
The use of Composer for dependency/component library management is the state of the art these days, and if you are using it, it will create other directories like vendor.
So your front controller or other web accessible scripts go into projectname/web and this is what you set your webroot to.
Your other scripts go into /projectname/src.
Your include/require statements need a filesystem path, so you can reference them either via relative addressing or using a full path.
Typically people will have a bootstrapping include or use a front controller (everything goes through index.php) where include paths are set up. With component libraries you also want your class loader to be instantiated to resolve any libraries you might be using in your project.
Again the use of composer is highly recommended and will generate your class loader for you, and then it's just a matter of making sure it is included.
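As a minimal sketch of that idea (the bootstrap.php name and the exact paths are assumptions based on the layout above, not something Composer dictates), a front controller in projectname/web could look like this:
<?php
// projectname/web/index.php (the only script exposed by the webroot)
// Composer's generated class loader lives in vendor/ at the project root
require __DIR__ . '/../vendor/autoload.php';
// A hypothetical bootstrap file in src/ that sets up include paths, config, etc.
require __DIR__ . '/../src/bootstrap.php';
// ...route/dispatch the request from here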
This .htaccess file will only allow users to open index.php. Attempts to access any other file will result in a 403 error.
Order deny,allow
Deny from all
<Files "index.php">
Allow from all
</Files>
If you also want to use authentication for some of the files, you may simply add the content from your current file at the end of my example.
I'm in a situation where I have file includes, but I don't want other people going into the "includes" directory and viewing the pages individually via the browser.
I'm quite familiar with how to approach it via inside the PHP files themselves but I want to use the .htaccess for preventing it this time.
So how do I configure .htaccess to prevent users NOT coming from a certain referrer from viewing the PHP files inside the "includes" folder?
.htaccess will work, but just to suggest an alternative - why not move your include directory outside the webroot? Your scripts can still access it, and there's no need to configure apache to deny access.
Put a .htaccess file in the directory you don't want to be viewable and put this in there:
order allow,deny
deny from all
This is the simple block all approach. More info on how to block by referer can be found here.
Hope this helps.
A lot of web hosting solutions explicitly limit you to working within the public_html (or equivalent) hierarchy. So I use a simple convention: if I want a file or directory to be private -- that is, not accessible through a URI -- then I prefix its name with either a "_" or a ".", for example my PHP includes directory is called "_includes".
I use this pattern in my .htaccess files to enforce this:
SetEnvIf Request_URI "(^_|/_|^\.|/\.)" forbidden
<Files *>
Order allow,deny
Allow from all
Deny from env=forbidden
</Files>
You can use this approach, but modify the regexp to whatever suits your personal convention. One advantage is that it works with this template in your DOCROOT .htaccess file. You don't need to have .htaccess files in the restricted subdirectories.
:-)
On my site I use a lot of includes. Most of the includes should only be accessible to the webserver and not to the rest of the world. So if I include "../include_map/file.php" in a page on my site, it should not be possible for other users in the world to request it by URL ("website.com/include_map/file.php"). Is there a way to protect the folder with the include files so that only the webserver can include them?
PHP can include files from anywhere (also non-public directories) on the server's hard drive.
For example, if your htdocs is located in /var/www/domain/htdocs/, you can also include files located in /var/www/domain/include_map, while the webserver won't be allowed to serve anything from there (if configured properly).
You can then test access to the file with www.yourdomain.com/../include_map/file.php.
If you can still access it like this, your webserver's configuration needs some attention to prevent others from reading your logs and other things.
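As a small sketch using the example paths above (page.php is just an illustrative name), a script inside htdocs could pull in the protected file like this:
<?php
// /var/www/domain/htdocs/page.php (a publicly reachable script)
// PHP reads the helper straight from the filesystem, so Apache never
// has to serve /var/www/domain/include_map/ over HTTP.
require '/var/www/domain/include_map/file.php';
// or, relative to this script:
// require __DIR__ . '/../include_map/file.php';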
Another way is to deny access to the directory via .htaccess or the Apache config. PHP can still include the files, while users can't access them from the internet.
In the Apache config you would do something like:
<Directory /var/www/domain/include_map>
Order Deny,Allow
Deny from all
</Directory>
In a .htaccess file you could write:
Order Deny,Allow
Deny from all
The .htaccess file should be located in the directory you want to secure. Consult your hosting provider to find out which way is best for you. As stated in the comment, you first have to find out whether .htaccess is an option for you.
You could do as zuloo said. If you want this to work under any condition you could use a constant for this.
The file including:
define('IS_APP', true);
require_once('some/file/to/include.php');
// your code here
The included file:
if(!defined('IS_APP')) {
die('No direct access');
}
// your code here
For a project I'm working on, I use require("xxxx.php"); to include certain parts of my website.
For example, I have header.php and my site just uses a simple require("header.php") to display to the world.
Now, how do I make it so the page includes header.php's content, but the user can't access it via
http://mywebsite.com/header.php
Put the file somewhere outside the DocumentRoot of the webserver.
Set up the webserver so that:
You have a public directory, where your accessible files reside, say static media plus index.php and so on
You have a resources directory that is outside your public folder
Set up the webserver to serve from the public directory
Include like this:
require("../private/header.php");
You could define a constant in your index file and then check in your header.php that this constant exists.
// check if the header.php file is accessed directly
if (!defined('MY_SECRET_CONSTANT')) {
exit();
}
You can either work with .htaccess to ensure that special files are not viewable, or achieve the same through a virtual host entry.
A best practice is to create a /web folder which is "the root" for your application and store everything there that should be accessible. Other folders go anywhere else and are included by the scripts that are designed to be accessed by the user.
Your structure could look like this:
/var/www/mySite/inc/header.php
/var/www/mySite/web/index.php (including ../inc/header.php)
/var/www/mySite/web/css/style.css
Your apache virtual host would look like this:
<VirtualHost *:80>
ServerName www.mysite.com
# Basic stuff
DocumentRoot "/var/www/mySite/web"
DirectoryIndex index.php
<Directory "/var/www/mySite/web">
AllowOverride All
Allow from All
</Directory>
</VirtualHost>
Now the inc folder cannot be accessed through your domain, as your web folder is the root for everyone coming from the URL. Your scripts, on the other hand, can of course navigate below that and include scripts from anywhere.
As you can see from the answers, there are a gazillion ways to do this. Here are two pretty simple ones that do not require you to change your directory structure:
In index.php and other files that include header.php:
define('INCLUDED', true);
require 'header.php';
In header.php:
if(!defined('INCLUDED')){
die('Forbidden'); // or you could redirect to home ... whatever you want :)
}
Alternatively, you can forbid access via .htaccess. Thus, you don't even have to touch your code:
<Files header.php>
order allow,deny
deny from all
</Files>
If your includes are all in one directory, you can simply deny access to that directory.
I've made a very small CMS myself. After login a session is set.
The CMS includes certain images, php pages, etc.
These pages may also include forms to add data to the database.
Now the problem is that you can actually use an address to get to the page which shows the form, e.g.:
domain.com/mycms/includes/addpage.php
How would you suggest to protect this?
NOTE: when I am logged in everything must work; only from outside must it not show the form. I could check whether the session exists, but I wonder if there are better and easier ways.
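For reference, the session check I have in mind would be something like this at the top of each included page ('logged_in' is just an example key that my login code would set):
<?php
// includes/addpage.php (hypothetical check; 'logged_in' is an assumed
// session key set by the CMS after a successful login)
session_start();
if (empty($_SESSION['logged_in'])) {
    exit('No direct access');
}
// ...show the form below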
First of all, if you are including PHP files, you really should not place them inside your public web root.
If this is not possible, an alternative approach would be to define a constant in your index.php (assuming you use this as the main entry point) and check whether this constant is set in every include file in order to prevent direct access to these files.
For example:
// index.php:
define('INDEX_LOADED', true);
// /includes/addpage.php:
if (!defined('INDEX_LOADED')) die('no direct access allowed');
Aim to put your files in
domain.com/private/includes/addpage.php
And then from your page do something like:
include('../private/includes/addpage.php');
I always use the extension .inc.php for PHP files that should not be accessed from outside. Then I deny access to files with that extension from outside. For Apache you can do this in the .htaccess file in the main directory:
<Files ~ "\.inc\.php$">
Order allow,deny
Deny from all
</Files>
Also, if you use some framework or you have a class (or include) directory, you can deny access to the whole directory like this (Apache):
<Location ~ "^/(classes|framework)">
Order allow,deny
Deny from all
</Location>
Other web servers have other ways to forbid files. If you want it universal and portable, Aron Rotteveel's suggestion is the best.
You can leave files that only contain class declarations unprotected -- if they are requested from outside, no code will run. Make sure the PHP ini setting display_errors is off for the host.
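For example, a hypothetical include like the one below only declares a class, so requesting it directly just produces an empty page:
<?php
// classes/User.php (contains only a class declaration, so nothing is
// executed and nothing is output when the file is requested directly)
class User
{
    private $name;

    public function __construct($name)
    {
        $this->name = $name;
    }

    public function getName()
    {
        return $this->name;
    }
}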
If it is necessary to keep private files inside the public folder, you can protect them with CHMOD permissions like 700.