I am building a web app and I am using a lot of include files to build most of my pages.
For example:
The page profile.php includes about.php, timeline.php, photos.php
Now, I want to send a user to my 404 page if they try to go to one of my include files directly. How can I do this?
Going to localhost/timeline.php should redirect the user to the 404.
My thought was to write an if statement in those include files that checks whether the file is being opened directly. Is that even possible?
Fast and dirty:
Define a constant inside of profile.php, and check if it exists inside of the included files:
if (!defined("SOME_CONSTANT")) {
    // Redirect or send a 404 header
}
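For completeness, the defining half might look something like this in profile.php (a minimal sketch; SOME_CONSTANT is just the placeholder name from the check above, and the include names come from the question):
<?php
// profile.php - define the marker constant before pulling in the partials
define("SOME_CONSTANT", true);

include "about.php";
include "timeline.php";
include "photos.php";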
Slower but better
Move the files that have no meaning as standalone pages in a web context out of the web root (usually a www or htdocs folder).
You can use a simple if statement
if (!defined("SECURE")) {
    header("HTTP/1.0 404 Not Found", true, 404);
    include("404_Not_Found.html"); // include the 404 page
    exit;
}
Make sure that the pages which include these files define the SECURE constant first.
Define a constant in your front controller and check if it is set in the other files.
In your front controller at the top:
define("IN_APP");
Then in your other pages before any processing:
if(defined("IN_APP")) { die(header('HTTP/1.0 404 Not Found', true, 404)); }
A better option, however, is to put your other files in a subdirectory and deny access to it with a .htaccess file containing the following code.
Order deny,allow
deny from all
This will still allow php to include the files, but will block anyone from accessing them from the web.
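If you are on Apache 2.4 or newer, the equivalent of those two lines is:
Require all denied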
Related
I have a PHP block of code that I use in multiple forms on my website. Instead, I want to have one separate file which I then include multiple times. I don't want it to be accessible on its own via a path in the website URL. I am thinking of creating a separate folder and redirecting from it to the home page via .htaccess. Are there any other solutions? What is the best practice here?
You could use a function and shared library.
file1.php
<?php
require_once './myLib.php';
echo myFunc();
myLib.php
function myFunc() {
return "Foo";
}
Expected output from file1.php:
Foo
There are several ways of making sure this library cannot be accessed.
The easiest way is simply storing it outside of your webroot (perhaps in /var/www/script) and requiring it with a relative path like ../script/myLib.php or an absolute path like /var/www/script/myLib.php.
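For example, a minimal sketch assuming the /var/www/script location mentioned above and a webroot directory that sits alongside it:
<?php
// file1.php, inside the webroot - pull the library in from outside the document root
require_once __DIR__ . '/../script/myLib.php';
echo myFunc();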
If you want to place it inside your webroot you could deny access to it with .htaccess
This question could help you with that: How to deny access to a file in .htaccess.
Finally, you could simply place it inside your webroot and let the script check whether it is being accessed directly with the following lines:
if (__FILE__ === $_SERVER['SCRIPT_FILENAME']) {
    header('Location: /'); // if the script is accessed directly, redirect the user
    die(); // ALWAYS DIE AFTER A REDIRECT
}
The context:
I'm building a PHP 7 web application; it uses a PHP session to log in and checks on each page whether the user is logged in. Here is the basic makeup of most pages:
<?php session_start(); ?>
<?php include 'the_header_and_menu.php'; ?>
<p>Page content</p>
<?php include 'the_footer.php'; ?>
At the top of the the_header_and_menu.php file is an include of session_check.php, which is located outside the site's directory. This PHP process does five checks; the most basic one is included below.
if (!isset($_SESSION['loggedin']) || $_SESSION['loggedin'] == 'false') { // If loggedin is not set, or is false, then run this block.
    header('Location: http://example.com/index?eject=noLogin'); // Send the user to the eject page.
    die(); // Exit the process.
}
Process summary: User logs in, which creates a session and its variables. When the user loads a page, a session check is performed to make sure that the user's account is valid and authorised. If the account or session is no longer valid/authorised, then the user is redirected to the login page (index).
The issue: When someone who's not logged in enters http://example.com/dashboard, they are ejected by the first check (featured above). However, if they enter http://example.com/process/, the checks count for nothing and the user is shown the page. This is not just a directory listing; the server serves the http://example.com/process/index.php file for it instead.
The question: How can I apply the same logic that protects individual pages like dashboard.php, to the case of protecting directory indexes?
Own answer:
The issue here was one which was simple, but overlooked.
At the top of the_header_and_menu.php file is an include to session_check.php which is located outside the site's directory.
Within the header and menu file was the session check include. However, because the session check was located outside the main directory (like much of the back-end), I had referenced it through a relative path, similar to the one below.
include_once '../mainfolder/php/subfolder/sessioncheck.php';
However, because the file was being included from a subdirectory, the path needed a further ../ operator.
include_once '../../safe/php/users/sessioncheck.php';
The solution: Instead of performing a session check through the header and menu, I am now including it on every page I want to protect. This is by no means a perfect solution and simply acts to get things working again.
Thank you to Daniel Schmidt, who got me looking in the right direction!
Directory indexes don't usually come from PHP - they are served by your webserver (nginx, Apache, ...). Today, there is usually no need to have those indexes enabled.
It looks like you're not sending each request to your PHP process(es). I suggest checking your webserver configuration.
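If the immediate goal is just to stop the webserver from generating directory listings, on Apache that is a single directive in the vhost or an .htaccess file (nginx uses autoindex off; for the same thing):
Options -Indexes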
I am looking for a way to secure my admin area, especially the folder itself, from outside access (this includes folders with images and CSS). I have read a lot of suggestions, but they all feel like a compromise or workaround rather than a bulletproof method, or I am not understanding which is best for security and for keeping the area hidden from the outside world. I want to be the only one who knows about it or can access it. I am hoping someone can shed some light on what they would use when they want the area completely hidden from the outside world, whilst still accessible to them.
Some of the methods I have come across involve:
Moving folder outside of root
Using .htaccess Deny all. (This also means I can't log in unless I use a static IP address, which I do not have.)
Another way I thought of could be to use a session variable to store the admin, recognise them and grant access based on the session ID. (This does mean all other CSS files and image folders are viewable.)
Adding an index page in the folder, which I see a lot of sites do.
I currently have my login script redirect me to my admin area, so is there any way for the whole folder to recognise it's me and grant access, serving files only if a logged-in admin PHP file is requesting them, and otherwise declining access, including to images and CSS?
I can't figure out how best to protect this area. Is using a session a secure way of identifying an admin?
The easiest way to ensure content is not exposed to the web is to place it above the site folder in your directory structure.
So, for example, in your Apache configuration mount the site at
/var/www/sites/site/content/
and place the restricted content at
/var/www/sites/site/
That way the restricted content will not be exposed, but PHP can still read it if required.
Obviously this will not stop users from seeing what is in your CSS files if PHP reads them and echoes them out, but I don't see why a CSS file should need to be secure.
Edit
Supposing you have a folder on your server at /var/www/sites/site/content/some_folder
and you enter www.yoursite.com/some_folder into a browser; assuming you have directory indexes enabled on your site, you will see a list of files in some_folder.
But how can you get to /var/www/sites/site/ from a web browser? You can't!
But what you can do is something like this, with a PHP file inside the main site folder (visible to the public):
<?php
session_start();
if (isset($_SESSION['admin_logged_in'])) {
    include '/var/www/sites/site/secret_content.php';
}
The first step would indeed be to move all files you want to keep from public access outside the document root. This way there is no way to access the files directly through your webserver.
If you are looking to prevent access for all resources (including images, scripts, stylesheets etc) you could implement a "proxy" which is responsible for serving the files (after checking whether the user is authorized).
The easiest and most flexible way to do this is to have a single entry point in the application. Using Apache, this can easily be achieved with the following rewrite rule:
RewriteEngine On
RewriteRule ^(.*)$ index.php [L,QSA]
This will make sure every request will go through your index.php file.
Now you can easily check whether you are allowed to access the resources using e.g.:
<?php
session_start();
if (!isset($_SESSION['user'])) {
    header('HTTP/1.0 403 Forbidden');
    exit; // important to prevent further execution of the script
}
// user is allowed access, do shit
The above is a very simplified example. Normally you may want to render an actual nice-looking page telling the user they are not allowed to access your stuff, or render a login page.
Now to render the protected resources you could do something like:
Directory structure
Project
    public (docroot)
        index.php
    index.php
    other protected files
index.php in docroot
<?php
require_once __DIR__ . '/../index.php';
index.php in project
<?php
session_start();

if (!isset($_SESSION['user'])) {
    header('HTTP/1.0 403 Forbidden');
    exit; // important to prevent further execution of the script
}

$file = $_SERVER['REQUEST_URI']; // important to sanitize or check against a whitelist of requested resources
$ext = pathinfo($file, PATHINFO_EXTENSION);
switch ($ext) {
    case 'jpg':
    case 'jpeg':
        header('Content-type: image/jpeg');
        readfile('/path/to/protected/resources/' . $file); // stream the protected file to the client
        break;
}
Now you will have total control over what you serve and to whom.
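As a side note on the sanitising the comment above asks for, a minimal sketch (assuming the protected files all live under one fixed base directory; the path is a placeholder) could be:
<?php
// Resolve the requested path and refuse anything that escapes the protected base directory
$base = realpath('/path/to/protected/resources');
$real = realpath($base . '/' . $file);

if ($real === false || strpos($real, $base . DIRECTORY_SEPARATOR) !== 0) {
    header('HTTP/1.0 404 Not Found');
    exit;
}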
Note that whether it is secure depends entirely on what your implementation looks like, but in general:
Always place your non public files outside of the document root
Always sanitize / whitelist user input
Always secure your data
Some generic, but related reads:
Preventing Directory Traversal in PHP but allowing paths (very much related to the $file = $_SERVER['REQUEST_URI']; point)
How can I prevent SQL-injection in PHP?
Secure hash and salt for PHP passwords
Yes, you should move the content out of the document root. You could try using .htaccess to protect your files, but allowing overrides by .htaccess can itself be a security problem. It's certainly a performance problem.
Simply point your 404 handler at something like....
<?php
session_start(); // needed so $_SESSION['authenticated'] below is available

define('REQUEST_PATH', '/secure');
define('SECURED_CONTENT', '/var/www/restricted');

$req = parse_url($_SERVER["REQUEST_URI"]);

if ((0 === strpos($req['path'], REQUEST_PATH))
    && $_SESSION['authenticated']) {
    if (is_readable(SECURED_CONTENT . $req['path'])
        && is_file(SECURED_CONTENT . $req['path'])) {
        header('Content-type: '
            . mime_content_type(SECURED_CONTENT . $req['path']));
        include(SECURED_CONTENT . $req['path']);
    } else {
        header('HTTP/1.0 404 Not Found');
    }
    exit;
}
header('HTTP/1.0 403 Forbidden');
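For reference, pointing Apache's 404 handler at such a script is usually done with an ErrorDocument directive (the script name here is only an example):
ErrorDocument 404 /secure_handler.php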
I'm trying to use the define function and the defined function in order to prevent hotlinking / direct access to a PHP script, but for some reason it will not work.
The issue I'm having is that it simply will not work and I receive the "Hotlinking is not allowed" message even if I visit index.php first and follow the link and/or submit the form.
Here is an example of what I'm trying to do:
index.php
<?php
define("ACCEPT",TRUE);
?>
<html>
...
core.php
<?php
if (defined('ACCEPT'))
{
    // ACCEPT is defined, which means the user came here via index.php
}
else
{
    // The user is most likely directly accessing core.php, abort.
    echo "Hotlinking is not allowed";
    exit;
}
Please note that the post "Preventing Direct Access, is it possible to spoof a php define?" does not answer my question nor does the post "define and defined for disallow direct access".
This is what a fair number of programs do. Just create a header that checks for the definition and redirects/exits if it isn't defined. There is nothing wrong with doing it this way, but it adds to the amount of code each page will need. The key point is that a constant only exists for the duration of a single request: defining ACCEPT in index.php and then following a link to core.php starts a brand-new request in which nothing is defined. The requested page therefore has to either be included by the page that has the define, or include that page itself. It is all about structure.
Here is something you can do:
.htaccess - redirects every request to index.php
index.php - Defines a variable, acts as a router that fetches/includes the page to be shown based on request data.
childpage.php - checks if the variable exists (meaning it was included) and then does whatever needs to be done (sketched below).
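A minimal sketch of that layout (the ?page= parameter and file names are purely illustrative; a real router should whitelist the pages it is willing to include):
index.php
<?php
$in_app = true; // the marker the child pages look for

$page    = $_GET['page'] ?? 'home';
$allowed = ['home', 'childpage']; // whitelist of pages this router is willing to include
if (!in_array($page, $allowed, true)) {
    http_response_code(404);
    exit;
}
include __DIR__ . '/' . $page . '.php';
childpage.php
<?php
// Refuse to run unless this file was pulled in by index.php
if (!isset($in_app)) {
    http_response_code(404);
    exit;
}
// ... normal page code ...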
The other option is to place the sensitive code in an .htaccess-protected directory.
You can use a framework as well that does a lot of this.
Or, if your host allows you to edit your vhost config, which they probably won't if you only have access to a public directory, you can change the document root to a higher directory.
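If you do control the vhost, the usual pattern is to point the document root at a public subfolder, so that everything else in the project stays readable by PHP but unreachable from a browser; a sketch with placeholder paths:
DocumentRoot /var/www/project/public
<Directory /var/www/project/public>
    Require all granted
</Directory>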
I have a website which fetches data from some PHP files to display on the website. However, to protect my data from being used by other people, I want to prevent my PHP files from being called by crawlers, bots, etc. to gather data.
I have prevented it by checking the referrer URL, but that can easily be bypassed. So, is there any other way to protect my data? I want only my website to be able to call those files.
Thanks!
Add Basic HTTP authentication at the top of your PHP file:
if (!isset($_SERVER['PHP_AUTH_USER']) ||
    !isset($_SERVER['PHP_AUTH_PW']) ||
    !($_SERVER['PHP_AUTH_USER'] == 'user' && $_SERVER['PHP_AUTH_PW'] == 'pw')) {
    header('WWW-Authenticate: Basic realm="Mirkwood"');
    header('HTTP/1.0 401 Unauthorized');
    die();
}
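The consuming side then has to send those credentials with every request, e.g. (the URL is a placeholder and the user/pw values match the snippet above):
<?php
// Fetch the protected endpoint with HTTP Basic credentials
$context = stream_context_create([
    'http' => [
        'header' => 'Authorization: Basic ' . base64_encode('user:pw'),
    ],
]);
$data = file_get_contents('https://example.com/data.php', false, $context);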
If you have an Apache web server, create an .htaccess file (dot htaccess, with no suffix) in the root directory of your site.
Try this syntax to prevent access to specific file types:
<FilesMatch "\.(htaccess|htpasswd|ini|php)$">
Order Allow,Deny
Deny from all
</FilesMatch>
Another way is to include something like this in all non-index PHP files:
In index.php, add an access value like this:
$access = 'my_value';
In every other file, include this check before even a single byte is echoed out by php:
if (empty($access)) {
    header("Location: index.php");
    die();
}
I have a website which fetches data from some PHP files to display it on the website.
Move the files that contain the data outside of the document root, assuming that the PHP files are just being accessed by another script inside the docroot.
As suggested by DaveRandom, I finally used a cookie-based authentication technique to prevent my PHP files being called by other websites.
The server first sets an access code for each valid client. This access code is checked at the beginning of my PHP file.
The cookie is given a maximum lifetime of 5 hours and is destroyed on window close. This is working pretty well for me.
Please mention if there are any glitches in this part!
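For reference, a rough sketch of that kind of cookie check (the names are placeholders, isValidAccessCode() is a hypothetical server-side lookup against whatever was stored at login, and the access code should be a random per-client token rather than a fixed string):
<?php
// At login: issue the access cookie with a five-hour lifetime
$token = bin2hex(random_bytes(32)); // store this server-side against the client
setcookie('access_code', $token, time() + 5 * 3600, '/', '', true, true);

// At the top of the data-serving file: reject requests without a known access code
if (!isset($_COOKIE['access_code']) || !isValidAccessCode($_COOKIE['access_code'])) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}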