Secure files from direct access - php

I have hosted a site, and the documentation suggests putting files under the public_html folder.
I have three files, index.php (the view page), common.js, and result.php, in the root folder. Clicking a button in index.php triggers an AJAX call to result.php.
The problem is that everyone can access result.php directly...
I am trying to set up a folder structure where all PHP files (like result.php) live in a folder behind the root, so they cannot be accessed directly from the browser, using a rewrite rule or anything else.
Please help me solve this issue...

To make a file only accessible via AJAX you can use:
public static function isAjax() {
    return (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && $_SERVER['HTTP_X_REQUESTED_WITH'] == "XMLHttpRequest");
}
It returns true or false. Basically, if it returns true then let the user carry on; otherwise stop them.
Word of caution: not all JS libraries/frameworks actually set this header, but most do (jQuery, MooTools, etc.), and not all versions do either, so make sure you have an up-to-date version of a library/framework before you rely on this.
Plus, if the user spoofs your headers there is no real way to stop them.
I tend to use this as a precursor check for stopping AJAX pages from being visible publicly. I also use parameter integrity checking and a random hash stored in the session (a CSRF-type token) to check whether the user is legitimately accessing an AJAX page.
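As a rough sketch of that combination (the header check plus a session hash), assuming sessions are already in use; the token name, the POST field, and the 403 response are illustrative, not part of the original answer:
<?php
// index.php -- issue a random hash for this page view
session_start();
$_SESSION['ajax_token'] = bin2hex(random_bytes(32));
// ...echo the token into the page so the AJAX call can POST it back as 'token'...

// result.php -- accept the request only if it looks like AJAX and carries the matching hash
session_start();
$isAjax  = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';
$tokenOk = isset($_POST['token'], $_SESSION['ajax_token'])
    && hash_equals($_SESSION['ajax_token'], $_POST['token']);
if (!$isAjax || !$tokenOk) {
    http_response_code(403);   // the header is spoofable; the session hash is the real gate
    exit('Forbidden');
}
// ...carry on with the real work...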

You can't protect it by moving it around, because there is no way to distinguish whether a request to result.php was triggered by a legitimate AJAX call from index.php, except with a session (or some other type of token).
You need to use a PHP session (or something equivalent) to:
Store what the user has access to (in index.php).
Check whether they have access to it (in result.php).

You can't make a file accessible via AJAX but not accessible via regular browser requests, because the AJAX call is doing the same thing a web browser could.

Related

preventing anyone from accessing back server pages

I searched for the answer to my question but I couldn't find exactly what I wanted.
If you find a duplicate of this, please send it to me!
I have a couple of files on my website that perform background functions, and I don't want anyone to access them, not even the admin. For example, files like PHPMailer.php, login-inc.php, logout-inc.php, and more.
I need a way to prevent anyone from accessing those pages directly without preventing them from working when triggered by buttons/forms.
I'm aware that a session check can redirect users who are not logged in; here, though, I need to prevent everyone from accessing the pages directly, by redirecting them or sending them to a 404 page.
What do I need to use to do that?
Thanks!
Update: I'm very new to web coding, so sorry for the confusing question. I wanted to block users from reaching some pages by entering their location as a link, for example so users can't get at tokens/passwords...
Using .htaccess solves my problem. Thank you.
One way to protect your files from being called through the web server is to move them out of the site's webroot directory. That way there is no way someone can access them with a web browser, and you can still include them. It's a common solution.
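For example, a minimal sketch of that layout, assuming an app/ directory one level above public_html (the directory and function names are only illustrative):
<?php
// public_html/result.php -- thin public entry point; the real code lives outside the webroot
require __DIR__ . '/../app/result-logic.php';   // this file cannot be requested by URL
handle_result($_POST);                          // hypothetical function defined in the included file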
The other way is to intercept web server requests and, for example, forbid some of them, redirect others, and so on. For the Apache web server you can do that inside an .htaccess file. You have to allow that in the website settings.
For your specific case, with those buttons:
You'll have to use .htaccess (or equivalent) to intercept all requests to those files, then redirect those requests to some PHP script, while also preserving the passed parameters.
Your PHP script should then decide what to do with the request: reject it (redirect to a 404 page) or allow access.
For that, your buttons should pass some kind of pass code, so the PHP script can check, when it's called, whether a valid pass code was provided (allow access) or not (redirect to 404).
Making a pass code that can't be manipulated can be tricky, but generally you must invent some formula to generate it (based, for example, on the current time) so the PHP script can use the same formula to check its validity.
Another way is to do some JS action when the button is pressed (for example, write a cookie) and have the PHP script check for the result of that JS action (cookie exists or not).
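One hedged way to build the time-based pass code mentioned above is an HMAC over a time window with a server-side secret; this is only a sketch of the idea, not the exact formula the answer has in mind (SECRET_KEY and the 5-minute window are assumptions):
<?php
// When rendering the page with the buttons: generate the pass code and embed it in the form.
define('SECRET_KEY', 'replace-with-a-long-random-secret');   // assumed server-side secret
$window   = floor(time() / 300);                             // 5-minute validity window
$passCode = hash_hmac('sha256', (string) $window, SECRET_KEY);

// In the script that .htaccess routes the intercepted requests to: verify it.
$sent     = isset($_POST['pass_code']) ? $_POST['pass_code'] : '';
$expected = hash_hmac('sha256', (string) floor(time() / 300), SECRET_KEY);
if (!hash_equals($expected, $sent)) {
    // Note: a request sent just before the window rolls over will be rejected.
    header('HTTP/1.1 404 Not Found');   // behave as if the page does not exist
    exit;
}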

Block direct access to PHP file except from AJAX request?

I wish to have a webpage that uses AJAX to access a PHP file in ./ajax/file.ajax.php
Trouble is, I don't want people to be able to type the address in their browser to access that PHP file directly.
Is there a way I can make it so that only AJAX requests can access the file?
Is there something I can check for in the PHP file to achieve this?
If you're using jQuery to make the XHR, it will set a custom header, X-Requested-With. You can check for that and determine how to serve your response.
$isXhr = isset($_SERVER["HTTP_X_REQUESTED_WITH"])
    && strtolower($_SERVER["HTTP_X_REQUESTED_WITH"]) == "xmlhttprequest";
However, this is trivial to spoof. In the past, I've used this to decide whether to render a whole page (if not set) or a page fragment (if set, to be injected into current page).
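A sketch of that page-versus-fragment decision (render_full_page() and render_fragment() are placeholder names, not real functions):
<?php
$isXhr = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest';

if ($isXhr) {
    render_fragment();   // placeholder: output only the HTML to inject into the current page
} else {
    render_full_page();  // placeholder: output the full document, layout and all
}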
If you're not using jQuery, or you're not interested in (or can't use) custom headers (to go with what alex has offered), you can simply POST some data with your AJAX request and, in that specific file, check whether that data has been sent. If you sent it by GET it would be visible in the address bar, which is why I suggest POST.
<?php
if (empty($_POST['valid_ajax'])) {
    header('Location: /');
    exit; // stop the script so nothing below the check ever runs
}
?>
It's not solid, as you can fool it by providing handmade data, but it's better than nothing if your problem is not that critical.

proper way to craft an ajax call with php and auth

From a security standpoint, can someone give me a step-by-step (but very simple) path to securing an ajax call when logged in to PHP?
Example:
on the php page, there is a session id given to the logged in user.
the session id is placed dynamically into the javascript before pushing the page to the client.
the client clicks a "submit" button which sends the data (including the session id) back to the php processing page.
the php processing page confirms the session id, performs the task, and sends back data
I'm stuck on how (and whether) the session data should be secured before sending it through an AJAX request. I'm not building a bank here, but I'm concerned about so many AJAX calls going to "open-ended" PHP pages that can just accept requests from anywhere (given that sources can be spoofed).
PHP can get the session data without you having to send a session ID via JavaScript. Just use the $_SESSION variable. If you want to check whether a session value exists you can just do:
if (isset($_SESSION['some_val'])) {
    // do work son.
}
You'll need to use JavaScript to asynchronously pass user input back to the server, but not to keep track of a session.
Don't send your session data with javascript.
You don't need to (in most cases).
Just post the data with javascript and let PHP retrieve the session data from... the session.
It depends on how you set up your session data.
One simple example would be a session value called username.
When PHP gets the request from JavaScript you can do $_SESSION['username'] to retrieve the session data.
This is a very simple example just to show how it can be done.
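A bare-bones sketch of what the AJAX endpoint might look like under that setup; it assumes the login page already put username into the session, and the JSON payload is just an example:
<?php
// AJAX endpoint, e.g. result.php
session_start();                      // resumes the same session cookie the browser already holds

if (!isset($_SESSION['username'])) {  // no login in this session: refuse the request
    http_response_code(401);
    exit;
}

header('Content-Type: application/json');
echo json_encode(array('user' => $_SESSION['username'], 'data' => 'whatever was requested'));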
As noted above, you don't need to send any session identifiers out with your JavaScript; to the server, an AJAX request is the same as any other request and it will know your session just fine. So basically, just don't worry about it; it's already taken care of.
It's another part of your question that worries me.
i'm concerned about so many ajax calls going to "open-ended" php pages that can just accept requests from anywhere
It worries me too; you shouldn't have any "open-ended" PHP pages hanging around at all. Every public .php script should have authentication and authorisation done. The easiest and most maintainable way to achieve this, IMHO, is to have a single controller script (e.g. index.php) that does authentication and authorisation then sends the request to an appropriate controller. Aside from this controller, all other scripts should be outside the document root so that they cannot be called directly.
This means that you only ever have to worry about authentication and authorisation in one place; if you need to change it, it only changes in one place. It means you don't need to worry about accidentally leaving some executable stuff in some library PHP file that's not meant to be called directly. It means you don't need to shag around with mod_rewrite rules trying to protect .php files that shouldn't be in the doc root at all.
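A stripped-down sketch of that front-controller idea, assuming the controllers live in ../app/controllers/ outside the document root; the whitelist, the session key, and the paths are illustrative only:
<?php
// public_html/index.php -- the only PHP script inside the document root
session_start();

if (!isset($_SESSION['user_id'])) {        // authentication
    http_response_code(403);
    exit('Forbidden');
}

$action  = isset($_GET['action']) ? $_GET['action'] : 'home';
$allowed = array('home', 'result', 'mail');   // authorisation: whitelist of dispatchable actions

if (!in_array($action, $allowed, true)) {
    http_response_code(404);
    exit;
}

// The controllers sit outside the webroot, so they can only ever run through this file.
require __DIR__ . '/../app/controllers/' . $action . '.php';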

How to make sure a human doesn't view the results from a PHP script URL?

Recently when viewing the source of a site that was making an AJAX call, I tried to follow the link in the browser
www.site.com/script.php?query=value
Instead of getting the result I expected to see, I saw a message stating only scripts should view that page.
How do you restrict a page so that only a script can access it?
UPDATE:
here is the page DEMO page
Short answer: you can't.
Long answer: You can make it harder to do it by requiring special header values in the HTTP request (setting Accept to application/json is a common one). On the server side just check to make sure that header is set to the value you expect. This will make it so that regular users will get the message you mention and your scripts will work just fine. Of course advanced users will be able to easily work around that sort of limitation so don't rely on it for security.
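A hedged example of that Accept-header check; application/json is just the value the calling script is assumed to send:
<?php
$accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';

if (stripos($accept, 'application/json') === false) {
    // A browser typing the URL usually sends Accept: text/html,... so it lands here.
    exit('Only scripts should view this page.');
}

// Otherwise serve the JSON the calling script expects.
header('Content-Type: application/json');
echo json_encode(array('query' => isset($_GET['query']) ? $_GET['query'] : null));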
With PHP you can check for this and only display results if the page was called via AJAX:
function isAjax() {
    return (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && ($_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest'));
}

if (isAjax()) {
    // display content
} else {
    // not ajax, don't show
    echo 'Invalid Request';
}
You can't. A human being can always spoof the request. You can send your request with a POST variable to make sure a human doesn't end up on the page by accident.
One possible solution is to check the HTTP request for its origin.
Another solution is to send a "password" with every request. Take a look at this tutorial for how to do it.
But it's never 100% secure; it only makes it harder for possible intruders.
As Tim stated, this script is almost certainly looking for this request header, which is being sent with each request to rpc.php (found via the Net panel in Firebug, naturally):
X-Requested-With: XMLHttpRequest
As for cross-browser compatibility, the setRequestHeader method appears to be available with both the ActiveX and XMLHttpRequest connections, so this should work in all major modern browsers.
If you are calling the script by AJAX, then it MUST be accessible to you, because an AJAX call is similar to your browser actually asking for the page; thus it is not only accessible to scripts but accessible to anyone.
If it were actually called by PHP or by some other means, you could "maybe" use Apache rules or PHP scripting to diminish the accessibility.
You could set a secret value into the php session with the 'view' script and check for it with the ajax scripts.
1. Request 'index.php' with the browser.
2. PHP builds the page, saves a key into the session, and sends the content back to the browser.
3. The browser gets the page content and makes some AJAX requests to your site.
4. Those AJAX scripts also have access to the same session your main page did, which allows you to check for the key.
This ensures only authenticated browsers are allowed to make the AJAX requests.
Don't count on the AJAX requests being able to write to the session, though. With many requests being satisfied at the same time, the last one in will be the last one written back to your session storage.
http://us.php.net/manual/en/book.session.php
A lot of open source applications use a variation of this at the top of every PHP file:
if (!defined('SOMETHING')) {
    die('only scripts have direct access');
}
Then in index.php they define SOMETHING:
define("SOMETHING", "access granted.");
Edit: I'm not saying this is a good approach, by the way.
Edit 2: It seems I missed the part about it being an AJAX request. I agree that in this case this isn't a solution.

Prevent direct access to a PHP page

How do I prevent my users from accessing directly pages meant for ajax calls only?
Passing a key during the AJAX call seems like a solution, where access without the key will not be processed. But it is also easy to fabricate the key, no? Curse of View Source...
P.S.: Using Apache as the webserver.
EDIT: To answer why: I have jQuery UI tabs in my index.php, and inside those tabs are forms with scripts which won't work if they're accessed directly. Why a user would want to do that, I don't know; I just figure I'd be more user-friendly by preventing direct access to forms without their validation scripts.
There is no way of guaranteeing that they're accessing it through AJAX. Both direct access and AJAX access come from the client, so either can easily be faked.
Why do you want to do this anyway?
If it's because the PHP code isn't very secure, make the PHP code more secure. (For example, if your AJAX passes the user id to the PHP file, write code in the PHP file to make sure that is the correct user id.)
As others have said, an AJAX request can be emulated by creating the proper headers.
If you want to have a basic check to see if the request is an Ajax request you can use:
if (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && $_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest') {
    // Request identified as ajax request
}
However, you should never base your security on this check. It will eliminate direct access to the page if that is what you need.
It sounds like you might be going about things the wrong way. An AJAX call is just like a standard page request, only by convention the response is not intended for display to the user.
It is, however, still a client request, and so you must be happy for the client to be able to see the response. Obfuscating access using a "key" in this way only serves to complicate things.
I'd actually say the "curse" of view source is a small weapon in the fight against security through obscurity.
So what's your reason for wanting to do this?
If the browser will call your page, either by normal request or ajax, then someone can call it manually. There really isn't a well defined difference between normal and ajax requests as far as the server-client communication goes.
The common approach is to pass a header to the server that says "this request was made by AJAX". If you're using Prototype, it automatically sets the HTTP header X-Requested-With to XMLHttpRequest, along with some other headers, including the Prototype version. (See more at http://www.prototypejs.org/api/ajax/options under "requestHeaders".)
Added: in case you're using another AJAX library, you can probably add your own header. This is useful for knowing what type of request it was on the server side, and for avoiding simple cases where an AJAX page is requested in the browser. It does not protect your request from everyone, because you can't.
COOKIES are not secure... try $_SESSION. That's pretty much one of the few things you can actually rely on across pages that can't be spoofed, because it essentially never leaves your control.
Thanks, although I use:
define('IS_AJAX', isset($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest');

if (IS_AJAX) {
    // Request identified as ajax request
}
cheers!
Not sure about this, but possibly check for a referrer header? I think if someone manually typed in your URL it wouldn't have a referrer header, while AJAX calls do (at least in the quick test I just did on my system).
It's a bad way of checking, though. The referrer can be blank for a lot of reasons. Are you trying to stop people from using your web service as a public service or something?
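For completeness, a sketch of that referrer check; https://www.example.com/ stands in for your own site URL, and as noted the header is client-controlled and often empty, so treat this as a convenience filter, not security:
<?php
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if (strpos($referer, 'https://www.example.com/') !== 0) {
    http_response_code(403);   // no referrer, or it points somewhere else
    exit;
}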
After reading your edit comments: if the forms will be loaded via AJAX calls, then you could check window.location to see if the URL is your AJAX form's URL; if it is, go to the right page via document.location.
This definitely isn't useful for securing something, but I think it could be of use if you wanted, say, a PHP page that generates a whole page when it is not requested by AJAX, but only generates the fragment you need when AJAX is used. This would let you keep your site friendly to users without AJAX: if they click a link that is supposed to load a box of comments but they don't have AJAX, it still sends them to a page that is then generated as a whole page displaying the comments.
Pass your direct requests through index.php and your AJAX requests through ajax.php, and then don't let the user browse to any other source file directly; make sure that index.php and ajax.php have the appropriate logic to include the code they need.
In the javascript file that calls the script:
var url = "http://website.com/ajax.php?say=hello+world";
xmlHttp.open("GET", url, true);
xmlHttp.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
then in the php file ajax.php:
if (!isset($_SERVER['HTTP_X_REQUESTED_WITH']) || $_SERVER['HTTP_X_REQUESTED_WITH'] != "XMLHttpRequest") {
    header("Location: http://website.com");
    die();
}
Geeks can still call the ajax.php script by forging the header, but the rest of my script requires sessions, so execution ends when no valid session is detected. I needed this to work in order to redirect people with expired HybridAuth sessions to the main site to log in again, because otherwise they ended up being redirected to the AJAX script.
