PHP: prevent direct access to page

I have some pages that I don't want users to be able to access directly.
I have this function I came up with which works:
function prevent_direct_access()
{
    if ($_SERVER['REQUEST_URI'] == $_SERVER['PHP_SELF'])
    {
        //include_once('404.php');
        header("Location: 404.php");
    }
}
This does exactly what I want: the URL does not change, but the content does. However, I am wondering if there is something I need to add to tell search engines that this is a 404 and not to index it. Keep in mind that I do not want the URL to change.
Thanks!

Don’t redirect but send the 404 status code:
header($_SERVER['SERVER_PROTOCOL'].' 404 Not Found', true, 404);
exit;

For the search engines: if you return HTTP status 404, they should not index the page, I believe. But you could always redirect to somewhere covered by a robots.txt.
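Putting that together with the original function, a minimal sketch that keeps the URL intact, sends a real 404 status, and renders the 404.php template (assumed to be a local file) could be:
function prevent_direct_access()
{
    if ($_SERVER['REQUEST_URI'] == $_SERVER['PHP_SELF'])
    {
        // Send a real 404 status so search engines will not index the page...
        header($_SERVER['SERVER_PROTOCOL'] . ' 404 Not Found', true, 404);
        // ...and render the error template without changing the URL.
        include '404.php';
        exit;
    }
}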

Just to clarify:
You have some PHP that you want available to other PHP programs on the system
You do not want anybody accessing it except by running one of the other PHP programs
(i.e. "direct" doesn't mean "except by following a link from another page on this site")
Just keep the PHP file outside the webroot. That way it won't have a URL in the first place.
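For example, a minimal sketch (the private/ directory and file names are just placeholders), with the included file living one level above the webroot:
<?php
// public_html/index.php -- publicly reachable
// secret.php sits outside the webroot, so it has no URL of its own,
// but PHP can still read it from the filesystem.
require __DIR__ . '/../private/secret.php';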

To ensure search engines don't index it, use a header() call to send a 404 like this:
header("HTTP/1.0 404 Not Found");
Or put all such files in one folder, say "includes", and add a "Disallow: /includes/" line to your robots.txt file. You can also add a ".htaccess" file in the same directory with one line - "Deny from all" - which tells Apache to block access (if Apache is configured to honour .htaccess files), for another layer of security.
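For example, assuming the folder is literally called "includes" (on Apache 2.4 the equivalent .htaccess line is "Require all denied"):
# robots.txt (in the webroot)
User-agent: *
Disallow: /includes/

# includes/.htaccess
Deny from all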


Redirection from the php file

I'm looking for a little help from someone who knows PHP.
In short, it looks like this:
I once made a website for a customer with products on the Joomla CMS, and each product was located, for example, at:
www.mysite.pl/index.php/productall/product1
www.mysite.pl/index.php/productall/product2
etc.
His site has since been rebuilt in classic HTML5 / CSS / JS without any CMS, but in the same old directory,
and a product now lives at:
www.mysite.pl/product1.html
The client created QR codes pointing to the old URLs and printed them on each product,
and now I would like to create an index.php file that redirects traffic depending on the URL requested in the browser, because when a QR code is scanned an error comes up, since the old address no longer exists.
For now, I created an index.php file that redirects to
www.mysite.pl/allproduct.html
which lists all the products, but I would prefer it to work as before,
that is, each QR code redirecting to its own product.
Can this be done somehow? PLEASE HELP
For now I have this code in index.php:
Instead of messing with PHP, you could try adding redirect commands to the .htaccess file (if you can still use one). The .htaccess file should be located in the site root,
i.e. /public_html/.htaccess
Within this file you can create redirects from the old addresses to the new ones:
Redirect 301 /oldProductLink1.html http://www.example.com/NewProductPage1.html
Redirect 301 /oldProductLink2.html http://www.example.com/NewProductPage2.html
:)
There's no easy solution: you must create a .htaccess file, place it in your root folder, and then redirect each old URL to the new URL for each product:
Redirect 301 /index.php/productall/product1 http://www.mysite.pl/product1.html
Redirect 301 /index.php/productall/product2 http://www.mysite.pl/product2.html
Redirect 301 /index.php/productall/product3 http://www.mysite.pl/product3.html
Redirect 301 /index.php/productall/product4 http://www.mysite.pl/product4.html
If the question is only about PHP and there is no way to use web server URL rewriting or redirects, then you should create files like:
/index.php/productall/product1
/index.php/productall/product2
where index.php would be a directory. Each file should contain something like:
header('Location: /product1.html');
exit();
where the product number is set according to the file name.
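Alternatively, if you stick with PHP, a single index.php can inspect the requested path and redirect to the matching product page. A rough sketch, assuming the URL pattern and the allproduct.html fallback from the question:
<?php
// index.php -- map the old Joomla-style URLs to the new static pages.
// Old URL:  /index.php/productall/product1
// New URL:  /product1.html
$path = $_SERVER['REQUEST_URI'];

if (preg_match('#/productall/(product\d+)#', $path, $m)) {
    // Permanent redirect to the matching static page.
    header('Location: /' . $m[1] . '.html', true, 301);
} else {
    // Fall back to the full product list.
    header('Location: /allproduct.html', true, 301);
}
exit;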
I still get this:
The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.
I removed my test index.php file and created a .htaccess in the root directory with your code.

Serve webpages with PHP, WordPress style

For a "classic" website, one would create a /foldername/index.php for every web page. With WordPress, however, this is not the case. For example, if a page was created with WordPress whose URI was http://myblog.org/some_page, you would not find the folder www/myblog.org/some_page in your web host's FTP.
My question, then, is: how can I serve up pages located at http://[MY_WEBSITE].com/[page_name] for any arbitrary page_name, without creating a new folder for every page_name?
One method would be to pass the page_name as a parameter to a common file and use that to serve the contents of the required page.
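A rough sketch of that idea, assuming the content of each page lives in a pages/ directory (the names are placeholders):
<?php
// index.php?page=some_page -- serve content based on the page_name parameter.
$page = isset($_GET['page']) ? $_GET['page'] : 'home';

// Whitelist the known pages so arbitrary files can't be included.
$pages = array('home', 'about', 'some_page');

if (in_array($page, $pages, true)) {
    include __DIR__ . '/pages/' . $page . '.php';
} else {
    header('HTTP/1.0 404 Not Found');
    echo 'Page not found';
}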
That behaviour is handled (in an Apache server) by a .htaccess file, wherein rewrite rules are defined. Rewrite rules basically capture incoming traffic and directs those requests to a file on the server (typically a single page which will act as a router).
The router is then responsible for taking the input URI (usually via $_SERVER["REQUEST_URI"] in PHP) and working out what to do with it, and ultimately what the output will be for that request.
As for a decent router, you could look at klein.php. Also, a brief example:
# htaccess file
RewriteEngine On
RewriteRule ^[^\.]+$ index.php
And the index.php:
$route = $_SERVER["REQUEST_URI"];
if ($route === '/home')
{
    echo 'This is the homepage';
}
You tell your server to rewrite the URL. Most servers do it in their own way, so to find out how to do it look at your server's documentation.
WordPress uses templates that make use of the require() function and a loop commonly called "The Loop" to retrieve content.
Different pages are called using different templates. If you want to know exactly how this logic is calculated, look into this.
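For reference, a minimal sketch of what "The Loop" looks like inside a WordPress template file, using the standard template tags:
<?php
// Inside a WordPress theme template, e.g. the theme's index.php.
if ( have_posts() ) {
    while ( have_posts() ) {
        the_post();                    // set up the global post data
        the_title( '<h2>', '</h2>' );  // print the post title
        the_content();                 // print the post body
    }
} else {
    echo '<p>No posts found.</p>';
}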

Is there a way using php or htaccess to prevent access to a certain part of an url

For example, a user cannot access http://example.com/here but can access anything with more added. An example would be http://example.com/here?action=blah.
The point of this is to prevent access to a file by visiting the page directly, while still allowing the file to be used with more actions/links added.
I would just like to thank everyone for the quick responses.
Based on the answers I got, I went with PHP and it worked.
I used the following:
function access_granted() {
    global $pagenow;
    if (!isset($_GET['action']) && 'wp-login.php' == $pagenow) {
        wp_redirect('http://example.com/');
        exit();
    }
}
I do have one question, however.
I changed the wp-login.php URL to http://example.com/here instead of http://example.com/wp-login.php.
My problem is that the function access_granted no longer works.
My guess is that it's because of $pagenow.
Any suggestions, such as how to check for the URL /here instead of wp-login.php?
If you want to go the .htaccess way, do this: enable mod_rewrite and .htaccess through httpd.conf and then put this code in your .htaccess under the DOCUMENT_ROOT directory:
Options +FollowSymLinks -MultiViews
# Turn mod_rewrite on
RewriteEngine On
RewriteBase /
RewriteCond %{QUERY_STRING} ^$
RewriteRule ^here/?$ - [NC,L,F]
This will throw a forbidden error for /here or /here/, but will allow /here?bar=foo type URIs.
Yes, there is.
You don't really need to deal with .htaccess; you can do it with PHP.
First you need to check whether the required options are set. For example, if you have a GET parameter named action, you can check whether it's set using the isset() function. You can then send an HTTP 403 status and show a custom error page. Here's an example that shows a forbidden error if action is undefined:
<?php
if (!isset($_GET['action'])) {
    header('HTTP/1.0 403 Forbidden');
    readfile('403.html'); // output the custom error page; a Location redirect would override the 403 status
    exit;
}
?>
You might also find the following link useful:
How to display Apache's default 404 page in PHP
Is example.com/here handled by a PHP script? If so, do your input validation at the start of the script, and if it doesn't meet the requirements, either give an error page or redirect (http://php.net/manual/en/function.http-redirect.php) to another page.
As you want it to work if anything is appended to the URL, you can use this if /here is handled by PHP:
<?php
if (!$_GET) {
    die('You are not allowed to access this URL.');
}
else {
    // PHP code that will execute when the user is allowed to access the URL
    // (you can also switch to plain HTML here by closing and reopening the PHP tags around this branch)
}
?>
Hope it helped.
Sure, you can use PHP for this. On here.php you need a block (in your example) at the top of your code that looks like this:
if (!isset($_GET['action'])) {
    header('Location: /notallowed.php');
    exit;
}
Where notallowed.php is whatever you want to show when they don't have the full link.

How can I use a .htaccess rewrite to a page that has a php header redirect?

This is my first post, so go easy on me.
Basically I am doing some rewrites in my .htaccess file to change my made-up search-friendly URLs into actual URLs, and for the most part they are working. For instance, this:
http://www.negativeworld.org/7849/news/nintendo-download-for-may-24-2012
Will turn into this:
http://www.negativeworld.org/article.php?id=7849
Just fine... IF that article exists. If the article doesn't exist, the PHP code uses this:
header("Location: boarderror.php");
exit;
To bring the user to boarderror.php. This works fine if the user gets to article.php directly and the id is bad, but when I am trying to do the .htaccess redirect from a search-friendly URL and the id is bad, the redirect just hangs for a while before giving me this message: "The page isn't redirecting properly".
What I want is for it to go to my boarderror.php page when there is a bad id. So basically I want my .htaccess rule to take a search-friendly URL, switch to the true URL, and well... just let go at that point, and the PHP will take it from there. Here is my .htaccess line that does the switch:
RewriteRule ^([0-9]+)/(news|review|editorial|podcast)/(.*)$ /article.php?id=$1 [L]
What am I doing wrong? (BTW, I realize that if I set up all of my search-friendly URLs correctly there should never be a bad id anyway, but I want to be on the safe side...)
Your thoughts aren't wrong. For a wrong ID there is a double redirection, which is OK. The problem is how the second redirection happens. Try:
header("Location: http://www.negativeworld.org/boarderror.php");
or
header("Location: /boarderror.php");
With your redirection the browser tries http://www.negativeworld.org/9999/news/boarderror.php (9999 being the wrong ID), which falls into an endless redirection loop that the browser cuts off after 10 tries.
The rewrite rule is fine; the issue is in your header() call. When you only provide a file name, the redirect sends the user to that file in the same folder, much like creating an HTML link using only the filename.
Let's say I try to load http://www.negativeworld.org/99999/news/nintendo-download-for-may-24-2012 and that id is invalid. In this case it would send the user to http://www.negativeworld.org/99999/news/boarderror.php, which triggers the redirect again and gets stuck in an infinite loop (or would, if the browser weren't smart enough to stop requesting the same URL over and over again).
Per RFC 2616, the Location header should provide an absolute URI, so you should do something like this:
header("Location: http://www.negativeworld.org/boarderror.php");
exit;

how to protect server directory using .htaccess

I have designed a website, and within it I have a range of PHP scripts which interact with my system. For example, if a user uploads an image, this is processed by the script
image.php
and if a user logs in this is processed by the script
login.php
All these scripts are stored in the folder called: scripts
How do I ensure someone cannot access these pages directly, while still ensuring they can be used by the system? I want the PHP pages to accept POST values and GET values and to be able to redirect to other pages, but not to be directly accessible via the address bar or downloadable.
I attempted to block access using .htaccess with deny from all and Limit GET, POST, but this prevented the system from working, as I could not access those files at all.
Blocking files with .htaccess makes the files inaccessible to the requester, i.e. the visitor of the page. So you need a proxy file to pass the visitor's request to the files. For that, have a look at the MVC pattern and the Front Controller pattern.
Basically, what you want to do is route all requests to a single point of entry, e.g. index.php, and decide from there which action (one of your scripts) is called to process the request. Then you can place your scripts and templates outside the publicly accessible folder or, if that is impossible (on some shared hosts), protect the folders with .htaccess (Deny from all) like you already did.
To use the upload script you'd have a URL like http://example.com/index.php?action=upload.
A super-simple front controller is as easy as:
$scriptPath = 'path/to/your/scripts/directory/';
$defaultAction = 'action404.php';
$requestedAction = isset($_GET['action']) ? $_GET['action'] : ''; // you might want to sanitize this

switch ($requestedAction) {
    case 'upload':
        $actionScript = 'image.php';
        break;
    case 'login':
        $actionScript = 'login.php';
        break;
    default:
        $actionScript = $defaultAction;
}

include $scriptPath . $actionScript;
exit;
Your actionScript would then do everything you need to do with the request, including redirection, db access, authentication, uploading stuff, rendering templates, etc - whatever you deem necessary. The default action in the example above could look like this:
<?php // action404.php
header('HTTP/1.1 404 Not Found');
readfile('path/to/template/directory/error404.html'); // readfile() takes a path; fpassthru() would need an open file handle
There are numerous implementations of the front controller pattern in PHP, some simple, some complex. The CodeIgniter framework uses a lightweight MVC/front controller implementation that might not be too overwhelming if this is new to you.
Like Atli above suggested, you could use mod_rewrite to force all requests to index.php and you could also use it to pretty up your URLs. This is common practice with MVC frameworks and has been covered extensively here and elsewhere.
You can't really prevent direct requests to the files, and still have them remain accessible to other requests. The best you can do is mask their location, and control how they are accessed.
One way you could go is to create a PHP "switch" script, which would include the scripts for you, rather than have Apache request them directly.
For example, if you had your scripts/image.php rule target switch.php?file=image.php instead, somewhat like:
RewriteRule ^([^\.]+\.(jpe?g|png|gif))$ switch.php?file=image.php&rw=1&meta=$1 [L,QSA]
You could add deny from all to the scripts/.htaccess file and do this in your switch.php file.
<?php
/** File: switch.php **/
$allowed_files = array(
    'login.php',
    'image.php'
);
$script_dir = 'scripts/';

// The rw flag is appended by the RewriteRule's query string, so it arrives via $_GET.
if (isset($_GET['rw'], $_REQUEST['file']) && in_array($_REQUEST['file'], $allowed_files, true)) {
    include $script_dir . $_REQUEST['file'];
}
else {
    header('HTTP/1.1 404 Not Found');
    include 'error404.html'; // Or something to that effect.
}
?>
The rw parameter there is a weak check to see if the request came through the RewriteRule, meant to prevent direct requests to the file. It is pretty easy to bypass if you know it is there, but effective against random requests by bots and such.
This way, direct requests to either scripts/image.php or switch.php?file=image.php would fail, but requests to any image file would trigger the scripts/image.php script.
You can set Deny from all in that folder's .htaccess and include these files from a script in an accessible directory.
I want to ensure the PHP pages will accept post values, get values and can redirect to other pages, but not be directly accessed via the address bar or downloaded?
As long as Apache is configured to associate all .php files with the PHP application, no one can download the PHP content itself. So, if someone browsed to "mysite.com/image.php", PHP will run. The user will NOT see your PHP content.
This should already be done in your httpd.conf file as:
AddType application/x-httpd-php .php .phtml
Now, image.php will be expecting certain POST parameters. Short of implementing an MVC architecture as Atli suggested above, you can gracefully and securely deal with those parameters when they aren't provided. Then users can get to the page directly but can't do anything with it.
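For example, a guard at the top of image.php could bail out early when the expected input is missing (the "image" field name here is just an assumption):
<?php
// image.php -- refuse to do anything if the expected parameters are missing.
if ($_SERVER['REQUEST_METHOD'] !== 'POST' || !isset($_FILES['image'])) {
    header('HTTP/1.0 400 Bad Request');
    exit('Nothing to see here.');
}

// ...normal upload processing continues below...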
A lot of applications don't put files like your scripts inside the public folder (like /public_html/ or /www/), but in the folder that contains the public folder.
so not
root/public_html/ and
root/public_html/scripts/
but
root/public_html/ and
root/scripts/
Anything in a folder above the public folder can't be accessed by visitors, but if, for example, /public_html/index.php includes the file '../scripts/yourscript.php', PHP can still access those files while visitors can't. (The path ../ means "go up one step in the folder hierarchy".)
