How to detect the path to the application root? - php

I'm trying to dynamically detect the root directory of my page in order to direct to a specific script.
echo ($_SERVER['DOCUMENT_ROOT']);
It prints /myName/folder/index.php
I'd like to use it in an HTML file to link to a certain script, like this (the href was lost in the page scrape; reconstructed from the answers below, which note the variable isn't echoed):
<a href="$_SERVER['DOCUMENT_ROOT']/lib/logout.php">log out</a>
This seems to be bad syntax; the path is not resolved successfully.
What's the proper approach to detect the path to logout.php?
The same question in different words:
How can I reliably achieve the path to the root directory (which contains my index.php) from ANY subdirectory? No matter if the html file is in /lib/subfolder or in /anotherDirectory, I want it to have a link directing to /lib/logout.php
On my machine it's supposed to be http://localhost/myName/folder (which contains index.php and all subdirectories), on someone else's it might be http://localhost/project
How can I detect the path to application root?

After some clarification from the OP it became possible to answer this question.
If you have a configuration file that is included by all PHP scripts and placed in the app's root folder, you can use this file to determine your application root:
$approot = substr(dirname(__FILE__),strlen($_SERVER['DOCUMENT_ROOT']));
The __FILE__ constant gives you the filesystem path of this file. If you strip DOCUMENT_ROOT from the front of it, the remainder is the URL path you're looking for. So it can be used in your templates:
<a href="<?php echo $approot; ?>/lib/logout.php">log out</a>
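One caveat worth adding: on Windows, __FILE__ may contain backslashes while DOCUMENT_ROOT usually uses forward slashes, so the substr() trick can fail to line up. A defensive variant (a sketch, not part of the original answer):
$docroot = str_replace('\\', '/', $_SERVER['DOCUMENT_ROOT']);
$approot = substr(str_replace('\\', '/', dirname(__FILE__)), strlen($docroot));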

Probably you are looking for the URL, not the path (link reconstructed; the original example was lost in the scrape):
<a href="http://<?php echo $_SERVER['HTTP_HOST']; ?>/lib/logout.php">log out</a>
Also, you are not echoing the variable in your example.

Your DOCUMENT_ROOT is local to your machine, so it might end up being something like c:/www. That is useful for statements like require or include, but not useful for links.
If you've got a page accessible on the web, linking back to a document on C: is going to try to get that drive from the visitor's local machine.
So for links, you should just be able to use /lib/logout.php, with the initial slash taking you right to the top of your web-accessible structure.
Your page, locally, might be in c:/www/myprojects/project1/lib/logout.php, but the site itself might be at http://www.mydomain.com/lib/project.php.

Frameworks like Symfony offer a sophisticated routing mechanism which allows you to write link urls like this:
<a href="<?php echo url_for('@logout') ?>">log out</a>
It has tons of possibilities, which are described in the tutorial.

Try this (a plain root-relative link):
<a href="/lib/logout.php">log out</a>
This jumps to the root directly.

DOCUMENT_ROOT refers to the physical path on the webserver. There is no generic way to detect the http path fragment. Quite often you can however use PHP_SELF or REQUEST_URI
Both depend on how the current script was invoked. If the current request was to the index.php in a /whatever/ directory, then try the raw REQUEST_URI string. Otherwise it's quite commonly:
<?= dirname($_SERVER["SCRIPT_NAME"]) . "/lib/logout.php" ?>
It's often best to use a configurable constant for such purposes, however. There are too many ifs going on here.
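For example, a minimal sketch of the configurable-constant approach (the file name and constant name are placeholders, not something this answer specifies):
<?php
// config.php - lives in the application root and is included by every page.
// Either hard-code the value per deployment, or derive it from this file's location:
define('APP_ROOT', rtrim(str_replace('\\', '/', substr(dirname(__FILE__), strlen($_SERVER['DOCUMENT_ROOT']))), '/'));
?>
Templates can then link with <a href="<?php echo APP_ROOT; ?>/lib/logout.php">log out</a>.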

I'm trying to figure this out for PHP as well. In asp.net, we have Request.ApplicationPath, which makes this pretty easy.
For anyone out there fluent in PHP who is trying to help, this code does what the OP is asking, but in asp.net:
public string AppUrl
{
    get
    {
        string appUrl = Request.Url.GetLeftPart(UriPartial.Authority) + Request.ApplicationPath;
        if (appUrl.Substring(appUrl.Length - 1) != "/")
        {
            appUrl += "/";
        }
        // Workaround for a sockets issue when using the VS built-in web server
        appUrl = appUrl.Replace("0.0.0.0", "localhost");
        return appUrl;
    }
}
I couldn't figure out how to do this in PHP, so what I did was create a file called globals.php, which I stuck in the root. It has this line:
$appPath = "http://localhost/MyApplication/";
It is part of the project, but excluded from source control. So various devs just set it to whatever they want and we make sure to never deploy it. This is probably the effort the OP is trying to skip (as I skipped with my asp.net code).
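For what it's worth, here is a hedged PHP sketch that computes the same value automatically, assuming globals.php sits in the application root:
// globals.php - derive the app URL instead of hard-coding it.
$scheme  = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
$relRoot = str_replace('\\', '/', substr(dirname(__FILE__), strlen($_SERVER['DOCUMENT_ROOT'])));
$appPath = $scheme . '://' . $_SERVER['HTTP_HOST'] . rtrim($relRoot, '/') . '/';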
I hope this helps lead to an answer, or provides a work-around for PHPers out there.

Related

How to keep a php file from being executed by typing the exact page url [duplicate]

I have a php file which I will be using exclusively as an include. Therefore I would like to throw an error instead of executing it when it's accessed directly by typing in the URL instead of being included.
Basically I need to do a check as follows in the php file:
if ( $REQUEST_URL == $URL_OF_CURRENT_PAGE ) die ("Direct access not permitted");
Is there an easy way to do this?
Add this to the page that you want to only be included
<?php
if (!defined('MyConst')) {
    die('Direct access not permitted');
}
?>
then on the pages that include it add
<?php
define('MyConst', TRUE);
?>
The easiest way for the generic "PHP app running on an Apache server that you may or may not fully control" situation is to put your includes in a directory and deny access to that directory in your .htaccess file. To save people the trouble of Googling, if you're using Apache, put this in a file called ".htaccess" in the directory you don't want to be accessible:
Deny from all
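(On Apache 2.4 and later, the equivalent directive is:
Require all denied
The old Order/Deny syntax only still works there if mod_access_compat is loaded.)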
If you actually have full control of the server (more common these days even for little apps than when I first wrote this answer), the best approach is to stick the files you want to protect outside of the directory that your web server is serving from. So if your app is in /srv/YourApp/, set the server to serve files from /srv/YourApp/app/ and put the includes in /srv/YourApp/includes, so there literally isn't any URL that can access them.
I have a file that I need to act differently when it's included vs. when it's accessed directly (mainly a print() vs. a return()). Here's some modified code:
if(count(get_included_files()) ==1) exit("Direct access not permitted.");
The file being accessed is always an included file, hence the == 1.
1: Checking the count of included files
if (count(get_included_files()) == (version_compare(PHP_VERSION, '5.0.0', '>=') ? 1 : 0))
{
    exit('Restricted Access');
}
Logic: PHP exits if the minimum include count isn't met. Note that prior to PHP5, the base page is not considered an include.
2: Defining and verifying a global constant
// In the base page (directly accessed):
define('_DEFVAR', 1);
// In the include files (where direct access isn't permitted):
defined('_DEFVAR') or exit('Restricted Access');
Logic: If the constant isn't defined, then the execution didn't start from the base page, and PHP would stop executing.
Note that for the sake of portability across upgrades and future changes, making this authentication method modular would significantly reduce the coding overhead as the changes won't need to be hard-coded to every single file.
// Put the code in a separate file instead, say 'checkdefined.php':
defined('_DEFVAR') or exit('Restricted Access');
// Replace the same code in the include files with:
require_once('checkdefined.php');
This way additional code can be added to checkdefined.php for logging and analytical purposes, as well as for generating appropriate responses.
Credit where credit is due: the brilliant idea of portability came from this answer. However, there is one con to this method: files in different folders may need different relative paths to reach checkdefined.php, and document-root-based addressing may not work if you're running the current website from within a subfolder of the main site.
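One hedged way around that relative-path concern, assuming PHP 5.3+ where __DIR__ is available, is to anchor the require on each file's own directory, which at least makes the path independent of the working directory:
require_once(__DIR__ . '/checkdefined.php'); // resolves relative to this file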
3: Remote address authorisation
// Call the include from the base page(directly accessed):
$includeData = file_get_contents("http://127.0.0.1/component.php?auth=token");
// In the include files (where direct access isn't permitted):
$src = $_SERVER['REMOTE_ADDR']; // Get the source address
$auth = authoriseIP($src); // Authorisation algorithm
if( !$auth ) exit('Restricted Access');
The drawback with this method is isolated execution, unless a session token is provided with the internal request. Verify against the loopback address in the case of a single-server configuration, or against an address whitelist for a multi-server or load-balanced infrastructure.
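The authoriseIP() routine referenced above is left to the implementer; a minimal sketch (the whitelist is purely illustrative):
function authoriseIP($ip) {
    // Allow loopback plus any known internal servers.
    $whitelist = array('127.0.0.1', '::1');
    return in_array($ip, $whitelist, true);
}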
4: Token authorisation
Similar to the previous method, one can use GET or POST to pass an authorization token to the include file:
// $key is the token passed via GET/POST; $dart is a redirect URL defined elsewhere.
if ($key != "serv97602") { header("Location: " . $dart); exit(); }
A very messy method, but also perhaps the most secure and versatile at the same time, when used in the right way.
5: Webserver specific configuration
Most servers allow you to assign permissions for individual files or directories. You could place all your includes in such restricted directories, and have the server configured to deny them.
For example, in Apache the per-directory configuration is stored in the .htaccess file.
Note however that server-specific configurations are not recommended by me because they are bad for portability across different web-servers. In cases like Content Management Systems where the deny-algorithm is complex or the list of denied directories is rather big, it might only make reconfiguration sessions rather gruesome. In the end it's best to handle this in code.
6: Placing includes in a secure directory OUTSIDE the site root
Least preferred because of access limitations in server environments, but a rather powerful method if you have access to the file-system.
//Your secure dir path based on server file-system
$secure_dir=dirname($_SERVER['DOCUMENT_ROOT']).DIRECTORY_SEPARATOR."secure".DIRECTORY_SEPARATOR;
include($secure_dir."securepage.php");
Logic:
The user cannot request any file outside the htdocs folder as the links would be outside the scope of the website's address system.
The php server accesses the file-system natively, and hence can access files on a computer just like how a normal program with required privileges can.
By placing the include files in this directory, you can ensure that the php server gets to access them, while hotlinking is denied to the user.
Even if the webserver's filesystem access configuration wasn't done properly, this method would prevent those files from becoming public accidentally.
Please excuse my unorthodox coding conventions. Any feedback is appreciated.
The best way to prevent direct access to files is to place them outside of the web-server document root (usually, one level above). You can still include them, but there is no possibility of someone accessing them through an http request.
I usually go all the way, and place all of my PHP files outside of the document root aside from the bootstrap file - a lone index.php in the document root that starts routing the entire website/application.
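A minimal sketch of that single-bootstrap layout (directory names are illustrative):
// public_html/index.php - the only PHP file inside the document root
require dirname(__DIR__) . '/app/bootstrap.php'; // everything else lives outside public_html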
An alternative (or complement) to Chuck's solution would be to deny access to files matching a specific pattern by putting something like this in your .htaccess file
<FilesMatch "\.(inc)$">
Order deny,allow
Deny from all
</FilesMatch>
Actually my advice is to do all of these best practices.
Put the documents outside the webroot OR in a directory denied access by the webserver
AND
Use a define in your visible documents that the hidden documents check for:
if (!defined('INCL_FILE_FOO')) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
This way if the files become misplaced somehow (an errant ftp operation) they are still protected.
I had this problem once, solved with:
if (strpos($_SERVER['REQUEST_URI'], basename(__FILE__)) !== false) ...
but the ideal solution is to place the file outside of the web-server document root, as mentioned in another answer.
I wanted to restrict access to the PHP file directly, but also be able to call it via jQuery $.ajax (XMLHttpRequest). Here is what worked for me.
if (empty($_SERVER["HTTP_X_REQUESTED_WITH"]) || $_SERVER["HTTP_X_REQUESTED_WITH"] != "XMLHttpRequest") { // || rather than &&, so a wrong header value is also rejected
    if (realpath($_SERVER["SCRIPT_FILENAME"]) == __FILE__) { // direct access denied
        header("Location: /403");
        exit;
    }
}
You'd better build the application with one entry point, i.e. all files should be reached through index.php.
Place this in index.php:
define('A', true);
This check should run in each linked file (via require or include)
defined('A') or die(header('HTTP/1.0 403 Forbidden'));
debug_backtrace() || die ("Direct access not permitted");
My answer is somewhat different in approach but includes many of the answers provided here. I would recommend a multipronged approach:
.htaccess and Apache restrictions for sure
defined('_SOMECONSTANT') or die('Hackers! Be gone!');
HOWEVER the defined or die approach has a number of failings. Firstly, it is a real pain in the assumptions to test and debug with. Secondly, it involves horrifyingly, mind-numbingly boring refactoring if you change your mind. "Find and replace!" you say. Yes, but how sure are you that it is written exactly the same everywhere, hmmm? Now multiply that with thousands of files... o.O
And then there's .htaccess. What happens if your code is distributed onto sites where the administrator is not so scrupulous? If you rely only on .htaccess to secure your files you're also going to need a) a backup, b) a box of tissues to dry your tears, c) a fire extinguisher to put out the flames in all the hatemail from people using your code.
So I know the question asks for the "easiest", but I think what this calls for is more "defensive coding".
What I suggest is:
At the top of every script, require('ifyoulieyougonnadie.php'); (not include(), and as a replacement for defined or die).
In ifyoulieyougonnadie.php, do some logic stuff - check for different constants, calling script, localhost testing and such - and then implement your die(), throw new Exception, 403, etc.
I am creating my own framework with two possible entry points - the main index.php (Joomla framework) and ajaxrouter.php (my framework) - so depending on the point of entry, I check for different things. If the request to ifyoulieyougonnadie.php doesn't come from one of those two files, I know shenanigans are being undertaken!
But what if I add a new entry point? No worries. I just change ifyoulieyougonnadie.php and I'm sorted, plus no 'find and replace'. Hooray!
What if I decided to move some of my scripts to do a different framework that doesn't have the same constants defined()? ... Hooray! ^_^
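A hedged sketch of what such a guard file could look like, using the two entry points named above:
// ifyoulieyougonnadie.php
$entry = basename($_SERVER['SCRIPT_FILENAME']);
if (!in_array($entry, array('index.php', 'ajaxrouter.php'), true)) {
    header('HTTP/1.0 403 Forbidden'); // or die(), throw new Exception, etc.
    exit;
}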
I found this strategy makes development a lot more fun and a lot less:
/**
* Hmmm... why is my netbeans debugger only showing a blank white page
* for this script (that is being tested outside the framework)?
* Later... I just don't understand why my code is not working...
* Much later... There are no error messages or anything!
* Why is it not working!?!
* I HATE PHP!!!
*
* Scroll back to the top of my 100s of lines of code...
* U_U
*
* Sorry PHP. I didn't mean what I said. I was just upset.
*/
// defined('_JEXEC') or die();
class perfectlyWorkingCode {}
perfectlyWorkingCode::nowDoingStuffBecauseIRememberedToCommentOutTheDie();
The easiest way is to set some variable in the file that calls include, such as
$including = true;
Then in the file that's being included, check for the variable
if (empty($including)) exit("direct access not permitted"); // empty() avoids a notice when the variable was never set
Besides the .htaccess way, I have seen a useful pattern in various frameworks, for example in Ruby on Rails. They have a separate pub/ directory in the application root, and the library directories live in directories at the same level as pub/. Something like this (not ideal, but you get the idea):
app/
|
+--pub/
|
+--lib/
|
+--conf/
|
+--models/
|
+--views/
|
+--controllers/
You set up your web server to use pub/ as document root. This offers better protection to your scripts: while they can reach out from the document root to load necessary components it is impossible to access the components from the internet. Another benefit besides security is that everything is in one place.
This setup is better than just adding checks to every single included file, because an "access not permitted" message is a clue to attackers, and it is better than .htaccess configuration because it is not white-list based: if you screw up the file extensions, nothing becomes visible in the lib/, conf/ etc. directories.
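For example, a hedged Apache virtual-host sketch for this layout (paths are illustrative):
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /srv/app/pub
</VirtualHost>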
What Joomla! does is define a constant in a root file and check whether it is defined in the included files:
defined('_JEXEC') or die('Restricted access');
or else
one can keep all files outside the reach of an http request by placing them outside the webroot directory as most frameworks like CodeIgniter recommend.
or even by placing an .htaccess file within the include folder and writing rules, you can prevent direct access.
<?php
$url = 'http://' . $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI'];
if (false !== strpos($url,'.php')) {
die ("Direct access not premitted");
}
?>
To be more precise, you should use this condition:
if (array_search(__FILE__, get_included_files()) === 0) {
echo 'direct access';
}
else {
echo 'included';
}
get_included_files() returns an indexed array containing the names of all included files (if a file is being executed, it counts as included, so its name is in the array).
So, when the file is accessed directly, its name is first in the array; all the other entries are files it included.
Storing your include files outside the web accessible directory has been mentioned a few times, and is certainly a good strategy where possible. However, another option I have not yet seen mentioned: ensure that your include files don’t contain any runnable code. If your include files merely define functions and classes, and have no code other than that, they will simply produce a blank page when accessed directly.
By all means allow direct access to this file from the browser: it won’t do anything. It defines some functions, but none of them are called, so none of them run.
<?php
function a() {
// function body
}
function b() {
// function body
}
The same applies to files which contain only PHP classes, and nothing else.
It’s still a good idea to keep your files outside of the web directory where possible.
You might accidentally deactivate PHP, in which case your server may send the content of the PHP files to the browser instead of running PHP and sending the result. This could result in your code (including database passwords, API keys, etc.) leaking.
Files in the web directory are squatting on URLs you may want to use for your app. I work with a CMS which cannot have a page called system, because that would conflict with a path used for code. I find this annoying.
Do something like:
<?php
if ($_SERVER['SCRIPT_FILENAME'] == '<path to php include file>') {
header('HTTP/1.0 403 Forbidden');
exit('Forbidden');
}
?>
<?php
if (eregi("YOUR_INCLUDED_PHP_FILE_NAME", $_SERVER['PHP_SELF'])) {
die("<h4>You don't have right permission to access this file directly.</h4>");
}
?>
Place the code above at the top of your included php file.
ex:
<?php
if (eregi("some_functions.php", $_SERVER['PHP_SELF'])) {
die("<h4>You don't have right permission to access this file directly.</h4>");
}
// do something
?>
The following code is used in the Flatnux CMS (http://flatnux.altervista.org):
if ( strpos(strtolower($_SERVER['SCRIPT_NAME']),strtolower(basename(__FILE__))) )
{
header("Location: ../../index.php");
die("...");
}
I found this PHP-only, environment-independent solution, which works under both HTTP and CLI:
Define a function :
function forbidDirectAccess($file) {
$self = getcwd()."/".trim($_SERVER["PHP_SELF"], "/");
(substr_compare($file, $self, -strlen($self)) != 0) or die('Restricted access');
}
Call the function in the file you want to prevent direct access to :
forbidDirectAccess(__FILE__);
Most of the solutions given above to this question do not work in CLI mode.
if (basename($_SERVER['PHP_SELF']) == basename(__FILE__)) { die('Access denied'); };
You can use the following method, although it does have a flaw: it can be faked, unless you add another check to make sure the request comes only from your server, for example by using JavaScript.
You can place this code in the Body section of your HTML code, so the error shows there.
<?php
if (!isset($_SERVER['HTTP_REQUEST'])) { include('error_file.php'); }
else { ?>
Place your other HTML code here
<?php } ?>
End it like this, so the output of the error will always show within the body section, if that's how you want it to be.
I suggest not relying on $_SERVER for this, for security reasons.
You can set a variable like $root = true; in the first file, the one that includes the other,
and check isset($root) at the beginning of the second, included file.
What you can also do is password-protect the directory and keep all your PHP scripts in there, except of course the index.php file. At include time no password is required, since it is only needed for HTTP access. This also gives you the option of reaching your scripts yourself when you want to, since you have the password for that directory. You will need to set up a .htaccess file for the directory and a .htpasswd file to authenticate the user.
You can also use any of the solutions provided above if you feel you don't need to access those files normally, because you can always reach them through cPanel etc.
Hope this helps.
The easiest way is to store your includes outside of the web directory. That way the server has access to them but no outside machine. The only down side is you need to be able to access this part of your server. The upside is it requires no set up, configuration, or additional code/server stress.
I didn't find the .htaccess suggestions that good, because they may block other content in that folder which you might want users to be able to access. This is my solution:
$currentFileInfo = pathinfo(__FILE__);
$requestInfo = pathinfo($_SERVER['REQUEST_URI']);
if($currentFileInfo['basename'] == $requestInfo['basename']){
// direct access to file
}
Earlier mentioned solution with PHP version check added:
$max_includes = version_compare(PHP_VERSION, '5', '<') ? 0 : 1;
if (count(get_included_files()) <= $max_includes)
{
exit('Direct access is not allowed.');
}
You can use phpMyAdmin Style:
/**
* block attempts to directly run this script
*/
if (getcwd() == dirname(__FILE__)) {
die('Attack stopped');
}


Difference between $_SERVER['DOCUMENT_ROOT'] and $_SERVER['HTTP_HOST']

I am back with a simple question (or related question).
The question is simple, but I have not received an answer yet. I have asked many people with different levels of experience in PHP, and the response I get is: "I don't have any idea. I've never thought about that." Using Google, I have not been able to find any article on this. I hope that I will get a satisfying answer here.
So the question is:
What is the difference between $_SERVER['DOCUMENT_ROOT'] and $_SERVER['HTTP_HOST'] ?
Are there any advantages of one over the other?
Where should we use HTTP_HOST & where to use DOCUMENT_ROOT?
DOCUMENT_ROOT
The root directory of this site, defined by the 'DocumentRoot' directive in the general section or in a <VirtualHost> section, e.g.
DOCUMENT_ROOT=/var/www/example
HTTP_HOST
The host name from the request, e.g.
HTTP_HOST=www.example.com
The document root is the local path to your website, on your server; The http host is the hostname of the server. They are rather different; perhaps you can clarify your question?
Edit:
You said:
Case 1 : header('Location: '. $_SERVER['DOCUMENT_ROOT'] . '/abc.php')
Case 2: header('Location: '. $_SERVER['HTTP_HOST'] . '/abc.php')
I suspect the first is only going to work if you run your browser on the same machine that's serving the pages.
Imagine if someone else visits your website, using their Windows machine. And your webserver tells them in the HTTP headers, "hey, actually, redirect this location: /var/www/example/abc.php." What do you expect the user's machine to do?
Now, if you're talking about something like
<?php include($_SERVER['DOCUMENT_ROOT'] . '/include/abc.php') ?>
vs
<?php include($_SERVER['HTTP_HOST'] . '/include/abc.php') ?>
That might make sense. I suspect in this case the former is probably preferred, although I am not a PHP Guru.
<?php include($_SERVER['DOCUMENT_ROOT'] . '/include/abc.php') ?>
should be used for including the files in another file.
header('Location: http://' . $_SERVER['HTTP_HOST'] . '/abc.php')
should be used for hyperlinking
Eh, what's the question? DOCUMENT_ROOT contains the path to the current web root, in my case /home/www. HTTP_HOST contains testing.local, as it runs on a local domain. The difference is obvious, isn't it?
I cannot figure out where you could interchange those two, so why should you consider advantages?
HTTP_HOST will give you URL of the host, e.g. domain.com
DOCUMENT_ROOT will give you absolute path to document root of the website in server's file system, e.g. /var/www/domain/
Btw, have you tried looking at PHP's manual, specifically the $_SERVER page? Everything is explained there.
If you want the domain, like 'example.com', use HTTP_HOST.
If you want the folder path, like '/public_html/foldername/', use DOCUMENT_ROOT.
$_SERVER['HTTP_HOST'] is defined by the client and may not even be set! You can repeat a request and withhold the header for local testing in developer tools such as those in Waterfox/Firefox. You must determine whether this header is set and whether the host being requested exists (one of the very first things you do, even before sending any of your headers); otherwise the appropriate action is to kill the entire process and respond with an HTTP 400 Bad Request. This goes for all server-side programming languages.
$_SERVER['DOCUMENT_ROOT'] is defined by the server configuration as the directory the site is served from; it is the same for every script under that host, no matter where the executing script sits. For example, for a site rooted at public_html/, both public_html/example.php and public_html/test1/example.php see a DOCUMENT_ROOT ending in public_html.
Keep in mind that if you're using Apache rewrites that there is a difference between the $_SERVER['REQUEST_URI'] (the URL requested) and $_SERVER['PHP_SELF'] (the file handling the request).
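A minimal sketch of the Host validation described above (the whitelist is an assumption; adapt it to your own domains):
<?php
$allowedHosts = array('www.example.com', 'example.com');
if (!isset($_SERVER['HTTP_HOST']) || !in_array($_SERVER['HTTP_HOST'], $allowedHosts, true)) {
    header('HTTP/1.0 400 Bad Request');
    exit;
}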
The title question is perfectly answered by John Ledbetter.
This answer is intended to expand on it and offer additional information about what seem to be the original poster's underlying concerns:
Where would it make sense to use the URL-based location, $_SERVER['HTTP_HOST']?
Where would it make sense to use the local filesystem location, $_SERVER['DOCUMENT_ROOT']?
Where both can be used, what are the advantages and disadvantages of each?
My answers follow:
By using HTTP_HOST you can abstract yourself from the machine's folder system, which means that in cases where portability is a concern and you expect to install the application on multiple servers, potentially with different OSes, this approach can be easier to maintain.
You can also take advantage of HTTP_HOST if your server is going to become unavailable and you want a different one from the cluster to handle the request.
By using DOCUMENT_ROOT you can access the whole filesystem (depending on the permissions you give to PHP); it makes sense if you want to access a program which you don't want to be reachable from the web, or when the folder system is relevant to your application.
You can also take advantage of DOCUMENT_ROOT to get the subsite root instead of the host.
$_SERVER['HTTP_HOST'] = "www.example.com";
$_SERVER['DOCUMENT_ROOT'] = "var/www/domain/subsite1" // equivalent to www.example.com/subsite1
$_SERVER['HTTP_HOST'] returns the domain, e.g. www.example.com,
while $_SERVER['DOCUMENT_ROOT'] returns the root of the current site on the server's filesystem, e.g. /var/www/domain/.
Other answers have alluded to it, but I wanted to add an answer just to be sharp as a grizzly bear tooth in one point - don't trust $_SERVER['HTTP_HOST'] as safe where following code does:
<?php
header('Location: '. $_SERVER['HTTP_HOST'] . '/abc.php');
#Or
include($_SERVER['HTTP_HOST'] . '/include/abc.php');
?>
The variable is subject to manipulation by the incoming request and could contribute to an exploit. This may depend on your server configuration, but you don't want something filling out this variable for you :)
See also:
https://security.stackexchange.com/questions/32299/is-server-a-safe-source-of-data-in-php
https://expressionengine.com/blog/http-host-and-server-name-security-issues

To convert an absolute path to a relative path in php

I would like to convert an absolute path into a relative path.
This is what the current absolute code looks like
$sitefolder = "/wmt/";
$adminfolder = "/wmt/admin/";
$site_path = $_SERVER["DOCUMENT_ROOT"]."$sitefolder";
// $site_path ="//winam/refiller/";
$admin_path = $site_path . "$adminfolder";
$site_url = "http://".$_SERVER["HTTP_HOST"]."$sitefolder";
$admin_url = $site_url . "$adminfolder";
$site_images = $site_url."images/";
so for example, the code above would give you a site url of
www.temiremi.com/wmt
and accessing a file in that would give
www.temiremi.com/wmt/folder1.php
What I want to do is this I want to mask the temiremi.com/wmt and replace it with dolapo.com, so it would say www.dolapo.com/folder1.php
Is it possible to do that with a relative path?
I'm a beginner coder. I paid someone to do something for me, but I want to get into doing it myself now.
The problem is that your question, although it seems very specific, is missing some crucial details.
If the script you posted is always being executed, and you always want it to go to delapo.com instead of temiremi.com, then all you would have to do is replace
$site_url = "http://".$_SERVER["HTTP_HOST"]."$sitefolder";
with
$site_url = "http://www.delapo.com/$sitefolder";
The $_SERVER["HTTP_HOST"] variable will return the domain for whatever site was requested. Therefore, if the user goes to www.temiremi.com/myscript.php (assuming that the script you posted is saved in a file called myscript.php) then $_SERVER["HTTP_HOST"] just returns www.temiremi.com.
On the other hand, you may not always be redirecting to the same domain or you may want the script to be able to adapt easily to go to different domains without having to dig through layers of code in the future. If this is the case, then you will need a way to figuring out what domain you wish to link to.
If you have a website hosted on temiremi.com but you want it to look like you are accessing from delapo.com, this is not an issue that can be resolved by PHP. You would have to have delapo.com redirect to temiremi.com or simply host on delapo.com in the first place.
If the situation is the other way around and you want a website hosted on delapo.com but you want users to access temiremi.com, then simply re-writing links isn't a sophisticated enough answer. This strategy would redirect the user to the other domain when they clicked the link. Instead you would need to have a proxy set up to forward the information. Proxy scripts vary in complexity, but the simplest one would be something like:
<?php
$site = file_get_contents("http://www.delapo.com/$sitefolder");
echo $site;
?>
So you see, we really need a little more information on why you need this script and its intended purpose in order to assist you.
This would be a lot easier to do in the HTTP server configuration. For example, using Apache's VHost
I'm not really sure what you're going for, because this doesn't look like absolute-to-relative path conversion, but rather like one absolute path to another.
Are you always trying to simply change "www.temiremi.com/wmt/" to "delapo.com"? If that's the case, you just want simple string replacement rather than $_SERVER variables or path functions.
$alteredPath = str_replace("www.temiremi.com/wmt/", "delapo.com", $oldPath);
OR
$alteredPath = "www.delapo.com/" . basename($oldPath);
If I misunderstand, please explain. I don't know if you need this to be more robust/generic, and you kind of threw me for a loop with "dolapo.com". (When I first read your title, I originally thought of comparing the path to a value from $_SERVER and removing the common parts.)
And as mentioned, if you are just trying to make the URL displayed to the user (in the address bar or links) look different, PHP can't do this.

Prevent direct access to a php include file

I have a php file which I will be using as exclusively as an include. Therefore I would like to throw an error instead of executing it when it's accessed directly by typing in the URL instead of being included.
Basically I need to do a check as follows in the php file:
if ( $REQUEST_URL == $URL_OF_CURRENT_PAGE ) die ("Direct access not premitted");
Is there an easy way to do this?
Add this to the page that you want to only be included
<?php
if(!defined('MyConst')) {
die('Direct access not permitted');
}
?>
then on the pages that include it add
<?php
define('MyConst', TRUE);
?>
The easiest way for the generic "PHP app running on an Apache server that you may or may not fully control" situation is to put your includes in a directory and deny access to that directory in your .htaccess file. To save people the trouble of Googling, if you're using Apache, put this in a file called ".htaccess" in the directory you don't want to be accessible:
Deny from all
If you actually have full control of the server (more common these days even for little apps than when I first wrote this answer), the best approach is to stick the files you want to protect outside of the directory that your web server is serving from. So if your app is in /srv/YourApp/, set the server to serve files from /srv/YourApp/app/ and put the includes in /srv/YourApp/includes, so there literally isn't any URL that can access them.
I have a file that I need to act differently when it's included vs when it's accessed directly (mainly a print() vs return()) Here's some modified code:
if(count(get_included_files()) ==1) exit("Direct access not permitted.");
The file being accessed is always an included file, hence the == 1.
1: Checking the count of included files
if( count(get_included_files()) == ((version_compare(PHP_VERSION, '5.0.0', '>='))?1:0) )
{
exit('Restricted Access');
}
Logic: PHP exits if the minimum include count isn't met. Note that prior to PHP5, the base page is not considered an include.
2: Defining and verifying a global constant
// In the base page (directly accessed):
define('_DEFVAR', 1);
// In the include files (where direct access isn't permitted):
defined('_DEFVAR') or exit('Restricted Access');
Logic: If the constant isn't defined, then the execution didn't start from the base page, and PHP would stop executing.
Note that for the sake of portability across upgrades and future changes, making this authentication method modular would significantly reduce the coding overhead as the changes won't need to be hard-coded to every single file.
// Put the code in a separate file instead, say 'checkdefined.php':
defined('_DEFVAR') or exit('Restricted Access');
// Replace the same code in the include files with:
require_once('checkdefined.php');
This way additional code can be added to checkdefined.php for logging and analytical purposes, as well as for generating appropriate responses.
Credit where credit is due: The brilliant idea of portability came from this answer. However there is one con to this method. Files in different folders may require different addresses to address this file. And server root based addressing may not work if you're running the current website from within a subfolder of the main site.
3: Remote address authorisation
// Call the include from the base page(directly accessed):
$includeData = file_get_contents("http://127.0.0.1/component.php?auth=token");
// In the include files (where direct access isn't permitted):
$src = $_SERVER['REMOTE_ADDR']; // Get the source address
$auth = authoriseIP($src); // Authorisation algorithm
if( !$auth ) exit('Restricted Access');
The drawback with this method is isolated execution, unless a session-token provided with the internal request. Verify via the loop-back address in case of a single server configuration, or an address white-list for a multi-server or load-balanced server infrastructure.
4: Token authorisation
Similar to the previous method, one can use GET or POST to pass an authorization token to the include file:
if($key!="serv97602"){header("Location: ".$dart);exit();}
A very messy method, but also perhaps the most secure and versatile at the same time, when used in the right way.
5: Webserver specific configuration
Most servers allow you to assign permissions for individual files or directories. You could place all your includes in such restricted directories, and have the server configured to deny them.
For example in APACHE, the configuration is stored in the .htaccess file. Tutorial here.
Note however that server-specific configurations are not recommended by me because they are bad for portability across different web-servers. In cases like Content Management Systems where the deny-algorithm is complex or the list of denied directories is rather big, it might only make reconfiguration sessions rather gruesome. In the end it's best to handle this in code.
6: Placing includes in a secure directory OUTSIDE the site root
Least preferred because of access limitations in server environments, but a rather powerful method if you have access to the file-system.
//Your secure dir path based on server file-system
$secure_dir=dirname($_SERVER['DOCUMENT_ROOT']).DIRECTORY_SEPARATOR."secure".DIRECTORY_SEPARATOR;
include($secure_dir."securepage.php");
Logic:
The user cannot request any file outside the htdocs folder as the links would be outside the scope of the website's address system.
The php server accesses the file-system natively, and hence can access files on a computer just like how a normal program with required privileges can.
By placing the include files in this directory, you can ensure that the php server gets to access them, while hotlinking is denied to the user.
Even if the webserver's filesystem access configuration wasn't done properly, this method would prevent those files from becoming public accidentally.
Please excuse my unorthodox coding conventions. Any feedback is appreciated.
The best way to prevent direct access to files is to place them outside of the web-server document root (usually, one level above). You can still include them, but there is no possibility of someone accessing them through an http request.
I usually go all the way, and place all of my PHP files outside of the document root aside from the bootstrap file - a lone index.php in the document root that starts routing the entire website/application.
An alternative (or complement) to Chuck's solution would be to deny access to files matching a specific pattern by putting something like this in your .htaccess file
<FilesMatch "\.(inc)$">
Order deny,allow
Deny from all
</FilesMatch>
Actually my advice is to do all of these best practices.
Put the documents outside the webroot OR in a directory denied access by the webserver
AND
Use a define in your visible documents that the hidden documents check for:
if (!defined(INCL_FILE_FOO)) {
header('HTTP/1.0 403 Forbidden');
exit;
}
This way if the files become misplaced somehow (an errant ftp operation) they are still protected.
I had this problem once, solved with:
if (strpos($_SERVER['REQUEST_URI'], basename(__FILE__)) !== false) ...
but the ideal solution is to place the file outside of the web-server document root, as mentioned in another anwser.
I wanted to block direct access to the PHP file, but also be able to call it via jQuery $.ajax (XMLHttpRequest). Here is what worked for me.
if (empty($_SERVER["HTTP_X_REQUESTED_WITH"]) || $_SERVER["HTTP_X_REQUESTED_WITH"] != "XMLHttpRequest") { // not an AJAX request (note: this header can be spoofed)
    if (realpath($_SERVER["SCRIPT_FILENAME"]) == __FILE__) { // direct access denied
        header("Location: /403");
        exit;
    }
}
You'd better build your application with a single entry point, i.e. all files should be reached through index.php.
Place this in index.php:
define('A', true);
Then run this check in each file pulled in via require or include:
defined('A') or die(header('HTTP/1.0 403 Forbidden'));
debug_backtrace() || die("Direct access not permitted");
This works because an included file already has a call stack, so debug_backtrace() returns a non-empty array; on a direct request the stack is empty.
My answer is somewhat different in approach but includes many of the answers provided here. I would recommend a multipronged approach:
.htaccess and Apache restrictions for sure
defined('_SOMECONSTANT') or die('Hackers! Be gone!');
HOWEVER the defined or die approach has a number of failings. Firstly, it is a real pain in the assumptions to test and debug with. Secondly, it involves horrifyingly, mind-numbingly boring refactoring if you change your mind. "Find and replace!" you say. Yes, but how sure are you that it is written exactly the same everywhere, hmmm? Now multiply that with thousands of files... o.O
And then there's .htaccess. What happens if your code is distributed onto sites where the administrator is not so scrupulous? If you rely only on .htaccess to secure your files you're also going to need a) a backup, b) a box of tissues to dry your tears, c) a fire extinguisher to put out the flames in all the hatemail from people using your code.
So I know the question asks for the "easiest", but I think what this calls for is more "defensive coding".
What I suggest is:
Before anything else, have every script require('ifyoulieyougonnadie.php'); (use require(), not include(); this replaces defined or die)
In ifyoulieyougonnadie.php, do some logic stuff - check for different constants, calling script, localhost testing and such - and then implement your die(), throw new Exception, 403, etc.
I am creating my own framework with two possible entry points - the main index.php (Joomla framework) and ajaxrouter.php (my framework) - so depending on the point of entry, I check for different things. If the request to ifyoulieyougonnadie.php doesn't come from one of those two files, I know shenanigans are being undertaken!
But what if I add a new entry point? No worries. I just change ifyoulieyougonnadie.php and I'm sorted, plus no 'find and replace'. Hooray!
What if I decide to move some of my scripts to a different framework that doesn't have the same constants defined()? ... Hooray! ^_^
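A minimal sketch of what ifyoulieyougonnadie.php could look like (the two entry-point names come from the answer above; everything else is illustrative):
<?php
// ifyoulieyougonnadie.php: single gatekeeper required by every script
$allowedEntryPoints = array('index.php', 'ajaxrouter.php');
$entry = basename($_SERVER['SCRIPT_FILENAME']);
// let CLI runs through for local testing; otherwise require a known entry point
if (PHP_SAPI !== 'cli' && !in_array($entry, $allowedEntryPoints, true)) {
    header('HTTP/1.0 403 Forbidden');
    exit('Forbidden');
}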
I found this strategy makes development a lot more fun and a lot less:
/**
* Hmmm... why is my netbeans debugger only showing a blank white page
* for this script (that is being tested outside the framework)?
* Later... I just don't understand why my code is not working...
* Much later... There are no error messages or anything!
* Why is it not working!?!
* I HATE PHP!!!
*
* Scroll back to the top of my 100s of lines of code...
* U_U
*
* Sorry PHP. I didn't mean what I said. I was just upset.
*/
// defined('_JEXEC') or die();
class perfectlyWorkingCode {}
perfectlyWorkingCode::nowDoingStuffBecauseIRememberedToCommentOutTheDie();
The easiest way is to set some variable in the file that calls include, such as
$including = true;
Then in the file that's being included, check for the variable
if (!isset($including)) exit("direct access not permitted"); // isset() avoids an undefined-variable notice on direct access
Besides the .htaccess way, I have seen a useful pattern in various frameworks, for example in Ruby on Rails. They have a separate pub/ directory in the application root, and the library directories live at the same level as pub/. Something like this (not ideal, but you get the idea):
app/
  +-- pub/
  +-- lib/
  +-- conf/
  +-- models/
  +-- views/
  +-- controllers/
You set up your web server to use pub/ as the document root. This offers better protection for your scripts: while scripts in pub/ can reach outside the document root to load the components they need, it is impossible to access those components from the internet. Another benefit besides security is that everything is in one place.
This setup is better than just adding checks to every single included file, because an "access not permitted" message is a clue to attackers, and it is better than .htaccess configuration because it does not depend on matching file extensions: even if you get the patterns wrong, nothing in the lib/, conf/ etc. directories becomes visible.
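A minimal sketch of how a script in pub/ reaches the sibling directories (the config and bootstrap file names are illustrative):
<?php
// pub/index.php: the document root is pub/, so nothing else is reachable over HTTP
$appRoot = dirname(__DIR__);              // the app/ directory
require $appRoot . '/conf/config.php';    // hypothetical configuration file
require $appRoot . '/lib/bootstrap.php';  // hypothetical library bootstrap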
What Joomla! does is define a constant in a root file and check whether it is defined in the included files:
defined('_JEXEC') or die('Restricted access');
Alternatively, one can keep all files outside the reach of an HTTP request by placing them outside the webroot directory, as most frameworks such as CodeIgniter recommend.
Or you can prevent direct access by placing an .htaccess file within the include folder and writing deny rules.
<?php
// Deny any request whose URL contains ".php" (assumes real pages are served
// through URLs without a .php extension)
$url = 'http://' . $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI'];
if (false !== strpos($url, '.php')) {
    die("Direct access not permitted");
}
?>
To be more precise, you should use this condition:
if (array_search(__FILE__, get_included_files()) === 0) {
    echo 'direct access';
}
else {
    echo 'included';
}
get_included_files() returns an indexed array containing the names of all included files (if a file is being executed, it counts as included, so its name is in the array).
So when the file is accessed directly, its name is the first element of the array; all other entries are files it included.
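Applied as a guard at the top of an include file, that condition might look like this (a sketch; the 403 handling is illustrative):
<?php
// Bail out if this file is the directly requested script
if (array_search(__FILE__, get_included_files(), true) === 0) {
    http_response_code(403);
    exit('Direct access not permitted');
}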
Storing your include files outside the web accessible directory has been mentioned a few times, and is certainly a good strategy where possible. However, another option I have not yet seen mentioned: ensure that your include files don’t contain any runnable code. If your include files merely define functions and classes, and have no code other than that, they will simply produce a blank page when accessed directly.
By all means allow direct access to this file from the browser: it won’t do anything. It defines some functions, but none of them are called, so none of them run.
<?php
function a() {
    // function body
}
function b() {
    // function body
}
The same applies to files which contain only PHP classes, and nothing else.
It’s still a good idea to keep your files outside of the web directory where possible.
You might accidentally deactivate PHP, in which case your server may send the content of the PHP files to the browser instead of running them and sending the result. This could leak your code (including database passwords, API keys, etc.).
Files in the web directory are squatting on URLs you may want to use for your app. I work with a CMS which cannot have a page called system, because that would conflict with a path used for code. I find this annoying.
Do something like:
<?php
// '<path to php include file>' stands for the include file's absolute path
if ($_SERVER['SCRIPT_FILENAME'] == '<path to php include file>') {
    header('HTTP/1.0 403 Forbidden');
    exit('Forbidden');
}
?>
<?php
// eregi() was removed in PHP 7; stripos() does the same case-insensitive search
if (stripos($_SERVER['PHP_SELF'], "YOUR_INCLUDED_PHP_FILE_NAME") !== false) {
    die("<h4>You don't have permission to access this file directly.</h4>");
}
?>
Place the code above at the top of your included php file.
e.g.:
<?php
if (stripos($_SERVER['PHP_SELF'], "some_functions.php") !== false) {
    die("<h4>You don't have permission to access this file directly.</h4>");
}
// do something
?>
The following code is used in the Flatnux CMS (http://flatnux.altervista.org):
if (strpos(strtolower($_SERVER['SCRIPT_NAME']), strtolower(basename(__FILE__))) !== false)
{
    header("Location: ../../index.php");
    die("...");
}
I found this PHP-only, environment-independent solution, which works over both HTTP and the CLI:
Define a function:
function forbidDirectAccess($file) {
    $self = getcwd() . "/" . trim($_SERVER["PHP_SELF"], "/");
    (substr_compare($file, $self, -strlen($self)) != 0) or die('Restricted access');
}
Call the function in the file you want to prevent direct access to:
forbidDirectAccess(__FILE__);
Most of the solutions given above to this question do not work in CLI mode.
if (basename($_SERVER['PHP_SELF']) == basename(__FILE__)) { die('Access denied'); }
You can use the following method, although it has a flaw: it can be faked, unless you add another check to make sure the request comes only from your server, for example by using JavaScript.
You can place this code in the Body section of your HTML code, so the error shows there.
<?php
// $_SERVER['HTTP_REQUEST'] is only present when the client sends a custom "Request" header
if (!isset($_SERVER['HTTP_REQUEST'])) { include('error_file.php'); }
else { ?>
Place your other HTML code here
<?php } ?>
End it like this, so the output of the error will always show within the body section, if that's how you want it to be.
I suggest not relying on $_SERVER for security reasons.
You can set a variable like $root = true; in the first file, the one that includes the other,
and check isset($root) at the beginning of the second (included) file.
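A minimal sketch of that pattern (the file names are illustrative):
<?php
// main.php: the file that includes the other one
$root = true;
require __DIR__ . '/included.php';
and in the included file:
<?php
// included.php: refuses to run when requested directly
if (!isset($root)) {
    exit('Direct access not permitted');
}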
What you can also do is password-protect the directory and keep all your PHP scripts in there, except, of course, the index.php file; at include time no password is required, since it is only needed for HTTP access. This also gives you the option of accessing those scripts directly when you want to, because you have the password for the directory. You will need to set up an .htaccess file for the directory and an .htpasswd file to authenticate the user.
You can also use any of the solutions provided above if you feel you don't need to access those files normally, because you can always reach them through cPanel etc.
Hope this helps
The easiest way is to store your includes outside of the web directory. That way the server has access to them but no outside machine does. The only downside is that you need to be able to access this part of your server. The upside is that it requires no setup or configuration and no additional code or server stress.
I didn't find the .htaccess suggestions so good, because they may block other content in that folder which you might want users to be able to access. This is my solution:
$currentFileInfo = pathinfo(__FILE__);
$requestInfo = pathinfo($_SERVER['REQUEST_URI']);
// note: a query string stays part of the basename here, so "file.php?x=1" would slip through
if ($currentFileInfo['basename'] == $requestInfo['basename']) {
    // direct access to file
}
The solution mentioned earlier, with a PHP version check added:
// in PHP 5+ the running script itself counts as an included file
$max_includes = version_compare(PHP_VERSION, '5', '<') ? 0 : 1;
if (count(get_included_files()) <= $max_includes)
{
    exit('Direct access is not allowed.');
}
You can use phpMyAdmin Style:
/**
 * Block attempts to run this script directly
 */
if (getcwd() == dirname(__FILE__)) {
    die('Attack stopped');
}
