Blocking non-AJAX requests to PHP [duplicate] - php

Possible Duplicate:
Prevent Direct Access To File Called By ajax Function
I'm creating a site that relies on AJAX calls to a PHP page. Is there a way to prevent access to the raw data (i.e. stop people from hitting the PHP file with their own POST requests)?
I would guess the best way to do this (if possible) would be to prevent PHP from sending data to anything that doesn't come from AJAX (since that has to come from the same domain). Any suggestions?

if (isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest') {
    // allow access
} else {
    die("Direct access restricted");
}
It's cheatable though

You could just not return anything when the request doesn't have the proper GET or POST variables.
That said, it's honestly not anything to worry about: anybody who actually browses to the page your ajax requests go to is probably trying to do something malicious anyway, and receiving what is normally sent via ajax doesn't gain them anything. No normal user ever views source...

Ajax libraries add an X-Requested-With: XMLHttpRequest header to their requests, so you can test for its presence:
if (!isset($_SERVER['HTTP_X_REQUESTED_WITH']) || $_SERVER['HTTP_X_REQUESTED_WITH'] != 'XMLHttpRequest') {
    // not an ajax request
}
However a malicious user can easily send this header too, so don't use this to protect sensitive data.

As @Evan linked in his comment, you can detect XmlHttpRequest requests by looking for HTTP_X_REQUESTED_WITH in $_SERVER. But this value comes from a header sent by the client, and as with any information from the user, it can be spoofed.
There's really no practical way to block non-XmlHttpRequest requests. If it's really important that you restrict the API, you can issue a unique key to the JavaScript (and store it in the session) upon a request to the main page. The key is passed back in the XmlHttpRequest, and when that page sees it and validates it, it grants access. But even that unique key can be scraped from the page.
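For illustration, a minimal sketch of that session-key idea; the key name 'ajax_token' and the file split are assumptions, not part of the original answer, and random_bytes() needs PHP 7+:

// In index.php, after session_start(), issue the key when the main page is
// rendered and print it into the page so the JavaScript can send it back:
//   $_SESSION['ajax_token'] = bin2hex(random_bytes(32));

// In the AJAX endpoint, validate the key before doing any work:
session_start();

$sent = $_POST['ajax_token'] ?? '';
if (empty($_SESSION['ajax_token']) || !hash_equals($_SESSION['ajax_token'], $sent)) {
    http_response_code(403);
    die('Invalid or missing token');
}
// ... handle the actual request; remember the key can still be scraped from the page ...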

Related

How to block users from directly accessing a page, but not HTTP requests in PHP?

Before I start, I hope that this question isn't that badly written. My last questions got negative attention due to the sheer ugliness of the question formatting. Either way, here's my question:
I'm making a program where I have to send GET requests to my domain to get information and statistics, etc.
Though, my problem is: how would I efficiently (and in PHP only) stop the typical user/person/cat/etc from accessing my page, and only let HTTP requests in?
Example: I send a GET request to "foo.php" on my domain from an external program. User knows I'm getting content from the page and tries to visit the page itself directly.
How would I stop the user from seeing the page in their browser directly, but perfectly allow HTTP requests (such as GET requests) to fetch my content?
Actually, a browser is also sending an HTTP GET request, so you need a different approach to distinguish between a GET made by your script/service and one from a browser.
There are a lot of different approaches; here are two possible solutions:
A) Use a particular user agent when you make your GET request. This is the de facto standard that monitoring services use to identify their requests.
if (!isset($_SERVER['HTTP_USER_AGENT']) || $_SERVER['HTTP_USER_AGENT'] != "your_user_agent") {
    die();
}
B) Use a special token to authorise your request
// if you want to send the token as a parameter, e.g. foo.php?auth=bar
if (!isset($_GET['auth']) || $_GET['auth'] != "your_token") {
    die();
}
// or use this if you send it as a request header named "Auth"
// (custom request headers show up in $_SERVER prefixed with HTTP_)
if (!isset($_SERVER['HTTP_AUTH']) || $_SERVER['HTTP_AUTH'] != "your_token") {
    die();
}
The easiest way would be to add a condition in foo.php that checks whether any GET parameters were passed, and stops at that point if not (note that $_GET itself is always set, so check that it isn't empty):
if (empty($_GET)) {
    die();
} else {
    // Regular programming
}
Of course, that doesn't stop someone from visiting foo.php?doesthisvariablework=1 and getting through.

How to deny ajax PHP files from browser

There is an AJAX script on my website.
Is there a method to deny direct access to the ajax PHP backend,
so that it can only be accessed when called from my ajax code?
You can try heuristics (such as examining X-Requested-With HTTP header) but NOT as any security measure. Any such difference in how the request looks can easily be duplicated by anyone who really wants to.
The answer is no.
The way your ajax calls access the php scripts is just as direct as any other method.
That said, you can limit the access to your scripts in different ways, such as requiring a valid session which is created only after a login. However, once a user has logged in, accessing the backend via an ajax script or "directly" are both fair game. In other words, you cannot count on being able to distinguish an ajax call from some other call on the server side.
The security of your backend needs to come from somewhere else.
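As a sketch of that session-based approach (the 'user_id' key and the JSON error shape are illustrative, not from the answer):

// Top of the ajax backend: require a logged-in session instead of trying
// to detect "AJAX-ness".
session_start();

if (empty($_SESSION['user_id'])) {
    http_response_code(401);
    exit(json_encode(['error' => 'Not logged in']));
}
// ... the user is authenticated; build and return the AJAX response ...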
On server-side you can add this to the top of your backend files:
if (empty($_SERVER['HTTP_X_REQUESTED_WITH']) || strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) != 'xmlhttprequest') {
    die("You need to use an AJAX request");
}
Edit: As stated by others, this is not reliable as a security measure.

Jquery $.post and PHP - Prevent the ability to use script outside of main website

I have a PHP script set up with jQuery $.post, which returns a response or performs an action in the targeted .php file.
Eg. My page has a form where you type in your Name. Once you hit the submit form button, $.post is called and sends the entered Name field value into "mywebsite.xyz/folder/ajaxscript.php"
If a user was to visit "mywebsite.xyz/folder/ajaxscript.php" directly and somehow POST the data to the script, the script would return a response / do an action, based on the submitted POST data.
The problem is, I don't want others to be able to periodically "call" an action or request a response from my website without using the website directly. Theoretically, right now you could determine what Name values my website allows without even visiting it, or you could call an action without going through the website, by simply visiting "mywebsite.xyz/folder/ajaxscript.php"
So, what measures can I take to prevent this from happening? So far my idea is to ensure that it is a $_POST and not a $_GET - so they cannot manually enter it into the browser, but they could still post data to the script...
Another measure is to apply a session key that expires, and is only valid for X amount of visits until they revisit the website. ~ Or, just have a daily "code" that changes and they'd need to grab this code from the website each day to keep their direct access to the script working (eg. I pass the daily "code" into each post request. I then check that code matches in the ajax php script.)
However, even with these measures, they will STILL have access to the scripts so long as they know how to POST the data and get the new code each day. Also, having a daily code requirement will cause issues at midnight (12:00am), as the code changes and the script breaks for anyone who is on the website and still passing the old, now-invalid code.
I have attempted to use .htaccess; however, using:
order allow,deny
deny from all
prevents legitimate access, and I'd have to add an exception so the website's IP is allowed to access it, which I think is a hassle to keep updated. Although, if it's the only legitimate solution, I guess I'll have to.
If I need to be more clear please let me know.
The problem you describe is similar to Cross-Site Request Forgery (CSRF or XSRF). To protect against it you could put a cookie into the browser and have the same value sent in the POST as well (as a hidden field, or just add it to $.post). On the server side, check both values; if they match, the request probably came from your site.
However, the problem you describe is quite hard to protect against fully, since you can easily write a script (or use cURL) to forge all kinds of requests and send them to your server. I don't know of a way to "only allow a browser and nothing else".
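A rough sketch of the cookie-plus-POST-field check described above (a "double submit" token; the name 'csrf' and the cookie parameters are assumptions, and random_bytes() needs PHP 7+):

session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Validate: the value sent by $.post must match the cookie the browser holds.
    $cookie = $_COOKIE['csrf'] ?? '';
    $field  = $_POST['csrf'] ?? '';
    if ($cookie === '' || !hash_equals($cookie, $field)) {
        http_response_code(403);
        die('Request did not originate from this site');
    }
    // ... process the POST ...
} else {
    // When rendering the page: set the cookie and echo the same value into
    // the data that $.post will send (hidden field or JS variable).
    $token = bin2hex(random_bytes(32));
    setcookie('csrf', $token, 0, '/');
}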
Use the Session variable as you say plus...
As MyGGAN said, use a value set in a cookie (CVAL1) before rendering the submit forms. If this cookie is available (a JS code check will verify this), then submit.
On the server side:
If this cookie value exists and the session variable exists, then the HTTP request came from your website.
Note: if the script (form) is to be presented under another domain, DO NOT allow the cookie value (CVAL1) to be set.
Also, do not allow HTTP requests to the server-side scripts if the extra HTTP headers are not present (like X-Requested-With: XMLHttpRequest). jQuery sends that X-* header with its requests to the server.
Read more on Cross-Site Request Forgery as MyGGAN suggests.
I am not really sure REMOTE_ADDR would work. Isn't that supposed to be the end user's IP address?
Firstly, you could make use of $_SERVER['HTTP_REFERER'], though it is not always trustworthy.
The only real bet that a valid post came from your page would be to use a captcha.
Try using the Sec-Fetch request headers:
// Sec-Fetch checks
if ($_SERVER['HTTP_SEC_FETCH_SITE'] != "same-origin")
    die();
if ($_SERVER['HTTP_SEC_FETCH_MODE'] != "cors")
    die();
if ($_SERVER['HTTP_SEC_FETCH_DEST'] != "empty")
    die();

How to make sure a human doesn't view the results from a PHP script URL?

How to make sure a human doesn't view the results from a PHP script URL?
Recently when viewing the source of a site that was making an AJAX call, I tried to follow the link in the browser
www.site.com/script.php?query=value
Instead of getting the result I expected to see, I saw a message stating only scripts should view that page.
How do you restrict a script to only allowing a script to access it?
UPDATE:
here is the page DEMO page
Short answer: you can't.
Long answer: You can make it harder to do it by requiring special header values in the HTTP request (setting Accept to application/json is a common one). On the server side just check to make sure that header is set to the value you expect. This will make it so that regular users will get the message you mention and your scripts will work just fine. Of course advanced users will be able to easily work around that sort of limitation so don't rely on it for security.
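As a sketch of that header check (using the Accept header; this is an assumption about the general technique, not the demo site's actual code):

$accept = $_SERVER['HTTP_ACCEPT'] ?? '';
if (strpos($accept, 'application/json') === false) {
    // A browser navigating here directly sends text/html, not application/json.
    http_response_code(406);
    die('Only scripts should view that page');
}
// ... return the JSON the calling script expects ...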
With PHP you can check for this and only display results if the page is called via ajax:
function isAjax() {
    return (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && ($_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest'));
}

if (isAjax()) {
    // display content
} else {
    // not ajax, don't show
    echo 'Invalid Request';
}
You can't. A human being can always spoof the request. You can send your request with a post variable, to make sure a human doesn't end up on the page by accident.
One possible solution is to check the HTTP request for its origin.
Another solution is to send a "password" with every request. Take a look at this tutorial for how to do it.
But it's never 100% secure, it only makes it harder for possible intruders.
As Tim stated, this script is almost certainly looking for this request header, which is being sent with each request to rpc.php (found via the net panel in firebug, naturally):
X-Requested-With : XMLHttpRequest
As to cross-browser compatibility, the setRequestHeader method appears to be available with both the activex and xmlhttprequest connections so this should work in all major modern browsers.
If you are calling the script by AJAX, then it MUST be accessible for you because an AJAX call is similar to your browser actually asking for the page, thus it is not only script accessible but accessible to anyone.
If it was actually called by PHP or by some other means, you could "maybe" use Apache rules or PHP scripting to diminish the accessibility.
You could set a secret value into the php session with the 'view' script and check for it with the ajax scripts.
1. Request 'index.php' with the browser.
2. PHP builds the page, saves a key into the session, and sends the content back to the browser.
3. The browser gets the page content and makes some ajax requests to your site.
4. Those ajax scripts also have access to the same session your main page did, which allows you to check for the key.
This ensures that only browsers that have loaded the main page are allowed to make the ajax requests.
Don't count on the ajax request being able to write to the session though. With many requests being satisfied at the same time, the last one in will be the last one written back to your session storage.
http://us.php.net/manual/en/book.session.php
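A short sketch of that flow; the 'page_viewed' key is illustrative, and session_write_close() is used because, as the answer notes, the ajax scripts should not rely on writing to the session:

// index.php sets the flag when the real page is rendered:
//   session_start();
//   $_SESSION['page_viewed'] = true;

// The ajax script reads it and releases the session lock immediately:
session_start();
$allowed = !empty($_SESSION['page_viewed']);
session_write_close(); // read-only access; avoids competing writes from parallel requests

if (!$allowed) {
    http_response_code(403);
    die('Load the main page first');
}
// ... answer the ajax request ...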
A lot of open source applications use a variation of this on top of every php file:
if (!defined('SOMETHING')) {
    die('only scripts have direct access');
}
Then in index.php they define SOMETHING:
define("SOMETHING", "access granted.");
edit: I'm not saying this is a good approach btw
edit2: Seems I missed the part about it being an ajax request. I agree in this case this isn't a solution.

Prevent direct access to a PHP page

How do I prevent my users from accessing directly pages meant for ajax calls only?
Passing a key during ajax call seems like a solution, whereas access without the key will not be processed. But it is also easy to fabricate the key, no? Curse of View Source...
p/s: Using Apache as webserver.
EDIT: To answer why, I have jQuery ui-tabs in my index.php, and inside those tabs are forms with scripts, which won't work if they're accessed directly. Why a user would want to do that, I don't know, I just figure I'd be more user friendly by preventing direct access to forms without validation scripts.
There is no way of guaranteeing that they're accessing it through AJAX. Both direct access and AJAX access come from the client, so it can easily be faked.
Why do you want to do this anyways?
If it's because the PHP code isn't very secure, make the PHP code more secure. (For example, if your AJAX passes the user id to the PHP file, write code in the PHP file to make sure that is the correct user id.)
As others have said, Ajax requests can be emulated by creating the proper headers.
If you want to have a basic check to see if the request is an Ajax request, you can use:
if (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && $_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest') {
    //Request identified as ajax request
}
However you should never base your security on this check. It will eliminate direct accesses to the page if that is what you need.
It sounds like you might be going about things the wrong way. An AJAX call is just like a standard page request, only by convention the response is not intended for display to the user.
It is, however, still a client request, and so you must be happy for the client to be able to see the response. Obfuscating access using a "key" in this way only serves to complicate things.
I'd actually say the "curse" of view source is a small weapon in the fight against security through obscurity.
So what's your reason for wanting to do this?
If the browser will call your page, either by normal request or ajax, then someone can call it manually. There really isn't a well defined difference between normal and ajax requests as far as the server-client communication goes.
Common case is to pass a header to the server that says "this request was done by ajax". If you're using Prototype, it automatically sets the http header "X-Requested-With" to "XMLHttpRequest" and also some other headers including the prototype version. (See more at http://www.prototypejs.org/api/ajax/options at "requestHeaders" )
Add: In case you're using another AJAX library you can probably add your own header. This is useful for knowing what type of request it was on the server side, and for avoiding simple cases when an ajax page would be requested in the browser. It does not protect your request from everyone because you can't.
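For example, a sketch of checking a custom header that your own JavaScript sets; the header name 'X-My-App' and its value are made up, and a request header "X-Foo: bar" shows up in PHP as $_SERVER['HTTP_X_FOO']:

if (!isset($_SERVER['HTTP_X_MY_APP']) || $_SERVER['HTTP_X_MY_APP'] !== 'my-frontend') {
    http_response_code(403);
    die('Not an AJAX request from this site');
}
// ... serve the AJAX response; remember anyone can forge this header too ...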
COOKIES are not secure... try the $_SESSION. That's pretty much one of the few things that you can actually rely on cross-page that can't be spoofed. Because, of course, it essentially never leaves your control.
Thanks, although I use:
define('IS_AJAX', isset($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest');

if (IS_AJAX) {
    //Request identified as ajax request
}
cheers!
Not sure about this, but possibly check for a referrer header? I think if someone manually typed in your URL, it wouldn't have a referrer header, while AJAX calls do (at least in the quick test I just did on my system).
It's a bad way of checking though. Referrer can be blank for a lot of reasons. Are you trying to stop people from using your web service as a public service or something?
After reading your edit comments: if the forms will be loaded via ajax calls, then you could check window.location to see if the URL is your ajax form's URL, and if it is, go to the right page via document.location.
This definitely isn't useful for securing something, but it could be of use if you wanted a PHP page that generates a whole page when it is not requested via ajax, and only the fragment you need when it is. That keeps your site friendly to non-ajax visitors: if, say, someone clicks a link that is supposed to load a box of comments but they don't have ajax, they are still sent to a page that is generated in full and displays the comments.
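A minimal sketch of that fallback pattern (the render_comments() and render_full_page() functions are hypothetical):

$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

if ($isAjax) {
    render_comments();   // return only the fragment the page will inject
} else {
    render_full_page();  // header + comments + footer for non-ajax visitors
}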
Pass your direct requests through index.php and your ajax requests through ajax.php, and then don't let the user browse to any other source file directly - make sure that index.php and ajax.php have the appropriate logic to include the code they need.
In the javascript file that calls the script:
var url = "http://website.com/ajax.php?say=hello+world";
xmlHttp.open("GET", url, true);
xmlHttp.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
then in the php file ajax.php:
if (!isset($_SERVER['HTTP_X_REQUESTED_WITH']) || $_SERVER['HTTP_X_REQUESTED_WITH'] != "XMLHttpRequest") {
    header("Location: http://website.com");
    die();
}
Geeks can still call the ajax.php script by forging the header, but the rest of my script requires sessions, so execution ends when no valid session is detected. I needed this to work in order to redirect people with expired hybridauth sessions to the main site so they can log in again, because they were ending up redirected to the ajax script.
