I have a page called create.php. It receives POST variables and sets up accounts. I don't want that page to be accessible by a user. What's the conventional way of achieving this?
I think I remember reading something about including a page with a CONSTANT. If the CONSTANT is not present, the page has been accessed directly. I think WordPress does it too.
Place the script in a directory other than your server's web root, then include it.
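For example, a layout along these lines keeps the script out of reach of any URL (the directory names here are just an illustration):
<?php
// public_html/index.php (illustrative layout; adjust paths to your setup)
// create.php lives one level above the web root, so it can never be requested by URL.
require_once __DIR__ . '/../private/create.php';
?>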
If this page receives POST variables, why do you want it to not be accessible by a user?
Anyway, the usual way to ensure that is the following:
<?php
// index.php
define('IN_SCRIPT', true);
require_once('lib/create.php');
?>
<?php
// create.php
defined('IN_SCRIPT') or die('This page cannot be accessed directly.');
// your logic here
?>
The context:
I'm building a PHP 7 web application. It uses a PHP session to log in and checks whether the user is logged in on each page. Here is the basic makeup of most pages:
<?php session_start(); ?>
<?php include 'the_header_and_menu.php'; ?>
<p>Page content</p>
<?php include 'the_footer.php'; ?>
At the top of the_header_and_menu.php is an include of session_check.php, which is located outside the site's directory. This PHP script performs five checks, the most basic of which is included below.
if (!isset($_SESSION['loggedin']) || $_SESSION['loggedin'] == 'false') { // If loggedin is not set, or is false, run this block.
    header('Location: http://example.com/index?eject=noLogin'); // Send the user to the eject page.
    die(); // Exit the process.
}
Process summary: User logs in, which creates a session and its variables. When the user loads a page, a session check is performed to make sure that the user's account is valid and authorised. If the account or session is no longer valid/authorised, then the user is redirected to the login page (index).
The issue: When someone who's not logged in enters http://example.com/dashboard, they are ejected by the first check (featured above). However, if they enter http://example.com/process/, the checks seem to count for nothing and the user is shown the page. That URL does not produce a directory listing; the server serves http://example.com/process/index.php for it instead.
The question: How can I apply the same logic that protects individual pages like dashboard.php, to the case of protecting directory indexes?
Own answer:
The issue here was one which was simple, but overlooked.
At the top of the_header_and_menu.php file is an include to session_check.php which is located outside the site's directory.
Within the header and menu file was the session check include. However, because the session check was located outside the main directory (like much of the back end), I had referenced it through a relative path, similar to the one below.
include_once '../mainfolder/php/subfolder/sessioncheck.php';
However, because the file was being included from a subdirectory, the path needed a further ../:
include_once '../../safe/php/users/sessioncheck.php';
The solution: Instead of performing a session check through the header and menu, I am now including it on every page I want to protect. This is by no means a perfect solution and simply acts to get things working again.
Thank you to Daniel Schmidt, who got me looking in the right direction!
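As an aside, building the include path from the document root (or __DIR__) would have sidestepped the relative-path pitfall entirely; a rough sketch, reusing the safe/ layout from above:
<?php
// the_header_and_menu.php (sketch): an absolute path resolves the same way
// from every subdirectory, unlike the relative ../ version above.
include_once $_SERVER['DOCUMENT_ROOT'] . '/../safe/php/users/sessioncheck.php';
?>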
Directory indexes don't usually come from PHP; they are served by your webserver (nginx, Apache, ...). Today, there is usually no need to have those indexes enabled.
It looks like you're not sending each request to your PHP process(es). I tend to suggest checking your webserver configuration.
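On Apache, for example, turning off automatic listings while still serving index.php would look roughly like this (an assumption about your setup; nginx has equivalent directives):
# .htaccess or vhost configuration (Apache sketch)
Options -Indexes
DirectoryIndex index.php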
Suppose the page is example.com/blog/data.php. I am using file_get_contents to get its content in another script. Now, I want to:
Forbid Google Search from crawling and indexing the data.php page.
Forbid visitors from accessing it directly.
Is there a way to achieve this?
You can redirect to another page if the request URL is example.com/blog/data.php, but a far easier and more logical solution would be to move the file out of your web root.
Edit: If you really want to keep the file inside the web root, you can use something like this at the top of the script that you don't want accessed directly:
if ($_SERVER['REQUEST_URI'] === $_SERVER['SCRIPT_NAME'])
{
    header('Location: /'); // redirect to home page
    exit; // stop the script so nothing else is output
}
However, this will probably not work in combination with file_get_contents over a URL (you would need to strip these lines from the result); you could include the file instead.
Don't put data.php under the web root. Keep it in a parallel directory.
You can pass a token via GET. Overall, your approach is slightly wrong; why not incorporate the data.php logic into the script that is calling it?
Simply apply an access restriction so only authorized users can reach the page. The simplest way to do it is to access your page with a URL parameter acting as a password:
example.com/blog/data.php?secret=someblah
and at the top of your data.php file do the following:
<?php
if (!isset($_GET['secret']) || $_GET['secret'] != 'someblah') exit();
?>
However, it is not recommended to use this from a public computer because it is not secure; it is only a primitive authentication mechanism.
I have several pages inside an AJAX directory. I don't want these pages to be accessible directly, so you cannot just type in the URL of a page within the AJAX directory and access it. I "solved" this by using a PHP session on the page that calls it, as follows:
Main page:
<?php
session_start();
$_SESSION['download']='ok';
?>
and on the ajax page I have this:
<?php
session_start();
if ($_SESSION['download'] !== 'ok') {
    $redirect = '/index.php'; // URL of the page where you want to redirect.
    header("Location: $redirect");
    exit;
}
?>
The only problem is that if a user goes through the correct process once, the cookie is stored and they can now access the page directly. How do I kill the session once they leave the parent page?
thx
Why use a session?
If I understood what you want:
<?php
// Is this an AJAX request? Most JS libraries set the X-Requested-With header.
if (isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == "xmlhttprequest") {
    // do your ajax code
} else {
    // redirect the user to index.php, since we do not allow direct script access unless it is called via ajax
    $redirect = '/index.php'; // URL of the page where you want to redirect.
    header("Location: $redirect");
    exit();
}
?>
A really simple solution is to open up each of the files you want to protect from direct URL entry & add the following to the top:
<?php if (!isset($_GET['ajax'])) die(); ?>
Now you can get rid of your redirect script, since it's no longer needed. You don't need to use sessions for this. Every time you request a page, use its direct URL and just add ?ajax=1 to the end of it.
By adding ?ajax=1, PHP will set a key of 'ajax' in the $_GET superglobal with the value 1. If ?ajax=1 is omitted from the URL, PHP will not set that key, so checking it with isset() returns false and the script dies without outputting anything. Essentially, the page will only output data if ?ajax=1 is at the end of the URL.
Someone could still "spoof" the URL and add '?ajax=1' themselves, but that is not the default behavior for people or web browsers. If you absolutely need to prevent this then it will be much more complicated, e.g. using templates outside of a publicly available folder. Most other "simple" solutions will have the same "spoofing" potential.
There's really no way to accomplish this with 100% certainty. The problem is that both AJAX and regular web browser calls to your site use the same underlying protocol: HTTP. If the integrity and security of your site depend on keeping HTTP clients from requesting a specific URL, then your design is wrong.
So how do you prevent people from directly accessing files inside certain directories while still letting the site use them?
Create a controller file. Send all AJAX requests to this controller.
ajax-control.php
<?php
$is_ajax = true;
include "ajaxincludes/test.php";
// ... use the ajax classes/functions ...
ajaxincludes/test.php
<?php
if (!isset($is_ajax) || !$is_ajax) {
    exit("Hey, you're not AJAX!");
}
// ... continue with internal ajax logic ...
If clients try to access the file directly at http://mysite/ajaxincludes/test.php they'll get the error message. Accessing http://mysite/ajax-control.php will include the desired file.
I don't think there is a surefire way to do what you are asking, since HTTP request headers can be faked. However, you can use $_SERVER['HTTP_REFERER'] to see if the request appears to be coming from another page on your site.
If the rest of the security on your site is good, the failure of this method would not grant the user access to anything they were not already able to access.
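A rough sketch of such a referer check (example.com stands in for your own domain, and remember the header can be absent or forged):
<?php
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (strpos($referer, 'http://example.com/') !== 0) {
    // request does not appear to come from our own site
    header('Location: /index.php');
    exit;
}
// ... AJAX-only code here ...
?>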
I've never tried this but maybe you could do something with jQuery's .unload() and then call a PHP page to unset() the session.
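The PHP side of that idea could be a tiny endpoint that just clears the flag (a sketch; the file name is made up and the 'download' key is taken from the question):
<?php
// clear_download.php (hypothetical): call this from the page's unload handler
session_start();
unset($_SESSION['download']);
?>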
Why not (on Ajax page):
session_start();
if (!isset($_SESSION['download']) || $_SESSION['download'] !== 'ok') {
    $redirect = '/index.php'; // URL of the page where you want to redirect.
    header("Location: $redirect");
    exit;
}
// do whatever you want with the "access granted" user
// remove the download flag for this session
unset($_SESSION["download"]);
I am protecting my pages by checking the values of my sessions. Is there a more secure way of protecting my pages other than changing the header location if the sessions are not valid? Am I doing anything right?
I have the following at the top of each page:
<?php
session_start();
//VERIFY LOGIN
$validkey = 'br1ll1ant)=&';
if ($_SESSION['valid'] != hash('sha256', $validkey) && $_SESSION['tokenconfirm'] != hash('sha256', $_SESSION['tokenID'])) {
    header("location:/login/");
}
?>
Using header() is fine, but don't forget to exit() your script after calling header(). User agents don't have to respect headers, so one could write a client that simply reads the part that comes after the header call.
if (!session_is_valid()) {
    header('Location: index.php');
    exit;
}
Are you using a templating system? If you are, what you'd do is simply output the login form instead of the page content if the user isn't validated. Even if you aren't using one, you can change the output (a different set of includes, for example) if the user isn't valid. This way you aren't relying upon the end user's browser to protect the content.
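As a rough sketch of that idea (the template file names and the 'valid' session key are only assumptions):
<?php
session_start();
if (empty($_SESSION['valid'])) {
    include 'templates/login_form.php';   // not validated: show the login form
} else {
    include 'templates/page_content.php'; // validated: show the protected content
}
?>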
Headers should be fine; I haven't seen people use much of anything else.
It is always best to authenticate to gain access to the page, and then check that authentication on every page. If it fails, redirect to the login.
Using an MVC pattern, it is best to check the login status before they even get to a page, and either redirect if they are not logged in or load the logged-in view.
Using a front controller pattern, you can put all your PHP files outside the web root. That way they are not directly accessible from a URL. This is fairly common practice in PHP frameworks, including those built with Zend Framework.
If your files are in the web root, another method you might consider is to use constants. This is how CodeIgniter does it: define a constant in your front controller, and if it's not defined, send them to the web root. Here is how CI uses constants.
The constant check used everywhere:
<?php if ( ! defined('BASEPATH')) exit('No direct script access allowed');
How it is defined:
define('BASEPATH', $system_folder.'/');
with $system_folder being set a few lines above:
$system_folder = realpath(dirname(__FILE__)).'/'.$system_folder;
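Putting those pieces together, a minimal front-controller sketch might look like the following (directory and file names are made up):
<?php
// index.php - the only PHP file inside the web root
define('BASEPATH', realpath(dirname(__FILE__) . '/../application') . '/');
require BASEPATH . 'bootstrap.php';
?>
<?php
// ../application/bootstrap.php - everything else lives outside the web root
if ( ! defined('BASEPATH')) exit('No direct script access allowed');
// ... route the request to the appropriate controller ...
?>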
What is the best way to "NOT" display a page directly in PHP?
Edit
There is a page, register.php.
A user can't open register.php directly; it should only be reachable by going from index.php to register.php.
Thanks
Any PHP files containing sensitive data, such as database passwords, should be stored outside of the document root and included where needed. That way, if an admin makes a serious mistake and the web server starts sending PHP unparsed, that data will be inaccessible.
Edit
You edited your question, and it now seems you wish to prevent access to a page unless the visitor comes from a particular page. You should be able to get some ideas from these questions:
deny direct access to a php file by typing the link in the url
preventing direct access to a php page, only access if redirected
I think you want something like this:
if (!isset($_SERVER['HTTP_REFERER']) || $_SERVER['HTTP_REFERER'] != 'http://YOUR_SITE/index.php') {
    echo "Can't access this page from this referer";
    die();
}
// go on with your register.php code
You can put
die();
or
exit();
at the top of your PHP document. However, your question is unclear as to what you wish to accomplish.
You can start a session in index.php and check for a certain variable from that session in the other pages.
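For example (a sketch; the flag name is made up):
<?php
// index.php: mark that the visitor passed through the front page
session_start();
$_SESSION['came_from_index'] = true;
?>
<?php
// register.php: refuse visitors who did not come through index.php first
session_start();
if (empty($_SESSION['came_from_index'])) {
    header('Location: /index.php');
    exit;
}
// ... registration form/logic ...
?>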
Make a file index.php and in it put:
<?php
include 'register235235235235.php';
?>
Make a file register235235235235.php and put whatever you want in there.
As far as securing PHP includes goes, I only secure my database.php files, which contain usernames and passwords.