How can I detect anonymous users the way Facebook does when the site is opened through proxy websites like hidemyass.com? I think it's something related to the proxy, but beyond that I don't know anything about it, and I want to build the same thing.
The most common way to detect proxy servers is to check whether these HTTP header fields are empty (if they are not, a proxy is being used to access your web server):
HTTP_FORWARDED
HTTP_X_FORWARDED
HTTP_FORWARDED_FOR
HTTP_X_FORWARDED_FOR
In PHP you can read these values with the getenv() function.
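For example, a minimal sketch of such a check (the header list mirrors the one above, plus HTTP_VIA; no list of proxy headers is exhaustive):

<?php
// Flag the request if any common proxy-related header is present.
$proxyHeaders = array(
    'HTTP_FORWARDED',
    'HTTP_X_FORWARDED',
    'HTTP_FORWARDED_FOR',
    'HTTP_X_FORWARDED_FOR',
    'HTTP_VIA',
);

$usingProxy = false;
foreach ($proxyHeaders as $header) {
    $value = getenv($header);
    if ($value !== false && $value !== '') {
        $usingProxy = true;
        break;
    }
}

if ($usingProxy) {
    // Handle however you like: log it, warn the user, block, etc.
    error_log('Possible proxy detected for ' . $_SERVER['REMOTE_ADDR']);
}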
It is hard to tell who is trying to hide their identity and who simply has limited/restricted access because of firewall rules etc. You can also check whether the user accepts cookies by sending a special token on the first request and reading it back on the second.
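A rough sketch of that cookie test, with arbitrary cookie and parameter names:

<?php
// Minimal cookie-acceptance probe.
if (isset($_COOKIE['probe'])) {
    $acceptsCookies = true;   // the token came back: cookies are accepted
} elseif (!isset($_GET['cookiecheck'])) {
    setcookie('probe', '1');  // first request: plant the token...
    header('Location: ' . strtok($_SERVER['REQUEST_URI'], '?') . '?cookiecheck=1');
    exit;                     // ...and force a second request
} else {
    $acceptsCookies = false;  // second request arrived without the token
}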
Related
I've read a lot about .htaccess rules, checking headers, using encryption, etc., but I haven't found exactly the answer I'm after. I know that, assuming the server is set up right, you can't access my precious PHP scripts with AJAX from another domain. I tried checking whether an access variable was defined, which disallowed address-bar access but also blocked my AJAX requests.
If I have some PHP scripts that I use for AJAX calls, is there a way to prevent address-bar access and PHP POST (cURL etc.), as well as AJAX from outside my domain (assumed to be handled via cross-domain access restrictions)?
There is absolutely NO way to safely/reliably identify which part of the browser the request comes from (address bar or AJAX). There is a way to identify what is sending the request (browser, cURL, etc.) via the User-Agent header, but it is not reliable either.
A quick but far less reliable solution would be to check for the following header. Most JavaScript libraries attach it to their AJAX calls. Be sure to look into it thoroughly before implementing it.
X-Requested-With: XMLHttpRequest
NOTE: Do not trust the client if the resource is crucial. You are better off implementing some other means of access filtering. Remember, anyone can fake headers!
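For illustration, a quick (and spoofable) gate might look like this:

<?php
// Reject requests that lack the header most JS libraries add to AJAX calls.
// Any client can forge this header, so treat it as a hint, not as security.
$requestedWith = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    ? $_SERVER['HTTP_X_REQUESTED_WITH']
    : '';

if (strcasecmp($requestedWith, 'XMLHttpRequest') !== 0) {
    header('HTTP/1.1 403 Forbidden');
    exit('AJAX requests only.');
}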
You can check whether the request is an Ajax request and forbid it if it is not, but that is not really safe, because the headers can be manipulated.
What you can do is block every IP except the ones that are allowed to access those files.
What you can do instead is implement a kind of authentication, where external applications have to send credentials to your script, and the script checks whether the client is valid.
There are many ways, but none of them is really the best way to achieve maximum security.
I do not know of a definite way. However, you can do this indirectly: pass a unique and constantly changing parameter (GET or POST) that only you have access to as proof of the origin. If the request lacks this unique variable, then it's not from you. Think outside the box on this one. It could be anything you want; here are some ideas.
1) Pass the result of a mathematical equation as proof of origin. Something that you can programmatically predict, yet which is not obvious to prying header hackers, e.g. cos($dayOfYear) or, even better, base64_encode(base64_encode(cos($dayOfYear))). (A rough sketch follows after this list.)
2) Store a unique key in a database that changes every time someone accesses the page. Then pass that key along with the request, and do some checks on the end page; if they don't match up to the database key, you've found the peeping tom. (Note: there will be some logic involved in making sure the key hasn't changed between requests.)
etc..
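Here is a very rough sketch of idea 1), with the obvious caveat that anyone who reads your client-side code can reproduce the token:

<?php
// Sender side: compute a predictable-but-obscured proof-of-origin token.
$dayOfYear = (int) date('z');
$token = base64_encode(base64_encode((string) cos($dayOfYear)));
// ...append it to the request, e.g. script.php?t=<token>

// Receiver side: recompute the token and compare.
$expected = base64_encode(base64_encode((string) cos((int) date('z'))));
if (!isset($_GET['t']) || $_GET['t'] !== $expected) {
    header('HTTP/1.1 403 Forbidden');
    exit; // the request did not originate from us
}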
Try checking whether $_SERVER['HTTP_ORIGIN'] is set on the POST request; it must be identical to your domain. If so, the POST was generated by your own website and it's safe to process it.
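Something along these lines, keeping in mind that not every browser sends Origin on every request type, and that it can be forged outside a browser ('https://www.example.com' is a placeholder for your real domain):

<?php
// Only process the POST if the Origin header matches our own domain.
if (!isset($_SERVER['HTTP_ORIGIN'])
    || $_SERVER['HTTP_ORIGIN'] !== 'https://www.example.com') {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// Reasonably safe to process the POST from here on.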
I have created a simple web service for my website that generates some JSON based on the request, using PHP, but I want it to be protected so that only I can use it. I mean it should be available to my website only. No one should be able to use that JSON on their website without my permission.
What is the best method for that in PHP?
You could try using the HTTP_REFERER header field, but it's easily spoofed and therefore insecure.
How about using PHP sessions?
Set some variable in the session in your main page script, then check for its existence when processing API requests; if the variable is not set in the session, don't serve the content.
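A minimal sketch of that approach, with an arbitrary variable name (two separate files):

<?php
// main page script: mark this visitor as having loaded our own pages.
session_start();
$_SESSION['api_allowed'] = true;

<?php
// JSON endpoint (e.g. json.php): refuse requests without the session flag.
session_start();
if (empty($_SESSION['api_allowed'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
header('Content-Type: application/json');
echo json_encode(array('data' => 'your payload here'));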
Give OAuth a try; it is widely used for this purpose.
Only allow your server's IP to access the service. Or do you mean you're calling it from the browser?
Then you'd have to pass some kind of token to the service, proving that you're authenticated to call it.
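If only your own server calls the service, the IP check is nearly a one-liner; for browser calls, a pre-shared token is one option. Both values below are placeholders:

<?php
// Option 1: server-to-server: allow only a known caller IP.
$allowedIp = '203.0.113.10';
if ($_SERVER['REMOTE_ADDR'] !== $allowedIp) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

// Option 2: browser calls: require a pre-shared token (hash_equals needs PHP >= 5.6).
$sharedSecret = 'change-me';
if (!isset($_GET['token']) || !hash_equals($sharedSecret, (string) $_GET['token'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}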
Use a cookie to validate; that way you are independent of your IP address.
I have a login form where users can log in with two different kinds of accounts: one is a SolarisLDAP account and the other is an Active Directory account.
When the user tries to log in I want to find out which account he uses (which is not the problem).
If he's using the SolarisLDAP account, the authentication is done in PHP.
But if it is an AD account it must be passed to Apache (because I have to use the mod_auth_kerb to authenticate against our AD).
I wonder if this is possible in any way. Could I just set $_SERVER['PHP_AUTH_USER'] and $_SERVER['PHP_AUTH_PW'], or $_SERVER['REMOTE_USER'], and that's it?
Or would it be possible to somehow do it via the headers or a redirect?
Hope you understand what I'm trying to do..
Cheers
I wonder if this is possible in any way.
Unfortunately - no.
Also note that passing variables from PHP to Apache makes no sense in that context.
It is not PHP but the browser that you want Apache to authenticate, and you obviously have no control over that.
This depends on which environment variables Apache sets for the request. Do a
var_dump($_SERVER);
to get a list of all the available ones. Go through that list and find out which ones are related to the authentication. It's probably something non-standard; this is a general list: http://php.net/_SERVER. Compare it with your var_dump output.
This might also depend on which Server API (SAPI) you're using with PHP.
In simplest terms, I utilize external PHP scripts throughout my client's website for various purposes such as getting search results, updating content, etc.
I keep these scripts in a directory:
www.domain.com/scripts/scriptname01.php
www.domain.com/scripts/scriptname02.php
www.domain.com/scripts/scriptname03.php
etc..
I usually execute them using jQuery AJAX calls.
What I'm trying to find is a piece of code that will detect (from within the script) whether these scripts are being executed via AJAX or MANUALLY via the URL by the user.
IS THIS POSSIBLE??
I have searched absolutely everywhere and tried various methods involving the $_SERVER[] array, but still no success.
What I'm trying to find is a piece of code that will detect (from within the script) whether these scripts are being executed via AJAX or MANUALLY via the URL by the user.
IS THIS POSSIBLE??
No, not with 100% reliability. There's nothing you can do to stop the client from simulating an Ajax call.
There are some headers you can test for, though, namely X-Requested-With. They would prevent an unsophisticated user from calling your Ajax URLs directly. See Detect Ajax calling URL
Most AJAX frameworks will send an X-Requested-With: header. Assuming you are running on Apache, you can use the apache_request_headers() function to retrieve the headers and check for it/parse it.
Even so, there is nothing preventing someone from manually setting this header - there is no real 100% foolproof way to detect this, but checking for this header is probably about as close as you will get.
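A sketch of that check using apache_request_headers(); header-name casing can vary between clients, so normalise the keys first:

<?php
// Fetch the raw request headers (Apache SAPI) and lower-case the keys.
$headers = array_change_key_case(apache_request_headers(), CASE_LOWER);

$isAjax = isset($headers['x-requested-with'])
    && strcasecmp($headers['x-requested-with'], 'XMLHttpRequest') === 0;

if (!$isAjax) {
    header('HTTP/1.1 403 Forbidden');
    exit('Direct access is not allowed.');
}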
Depending on what you need to protect and why, you might consider requiring some form of authentication, and/or using a unique hash/PHP sessions, but this can still be reverse engineered by anyone who knows a bit about Javascript.
As an idea of the things that you can verify: if you check all of these before servicing the request, it will afford a degree of certainty (although not much, and none at all if someone is deliberately trying to circumvent your system); a combined sketch follows after the list:
Store a unique hash in a session value, and require it to be sent back to you by the AJAX call (in a cookie or a request parameter) so you can compare the two on the server side and verify that they match
Check that the X-Requested-With: header is set and that its value is sensible
Check that the User-Agent: header is the same as the one that started the session
The more things you check, the more chance an attacker will get bored and give up before they get it right. Equally, the longer/more system resources it will take to service each request...
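Put together, those three checks might look roughly like this (the session keys are arbitrary, and the page that embeds the AJAX call is assumed to have stored the hash and User-Agent beforehand):

<?php
session_start();
// Assumed setup on the embedding page:
//   $_SESSION['ajax_hash'] = bin2hex(random_bytes(16));  // echoed back by the JS
//   $_SESSION['ua'] = $_SERVER['HTTP_USER_AGENT'];

// 1) The hash stored in the session matches what the client sent back.
$hashOk = isset($_SESSION['ajax_hash'], $_REQUEST['hash'])
    && $_SESSION['ajax_hash'] === $_REQUEST['hash'];

// 2) The X-Requested-With header is present with a sensible value.
$headerOk = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && $_SERVER['HTTP_X_REQUESTED_WITH'] === 'XMLHttpRequest';

// 3) The User-Agent matches the one recorded when the session started.
$agentOk = isset($_SESSION['ua'], $_SERVER['HTTP_USER_AGENT'])
    && $_SESSION['ua'] === $_SERVER['HTTP_USER_AGENT'];

if (!($hashOk && $headerOk && $agentOk)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}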
There is no 100% reliable way to prevent a user, if he knows the address of your request, from invoking your script.
This is why you have to authenticate every request to your script. If your script is only meant to be called by authenticated users, check for the authentication again inside the script. Treat it as you would treat incoming user input: validate and sanitize everything.
On Edit: The same could be said for any script which the user can access through the URL. For example, consider profile.php?userid=3
I want to store the page location the user came from (on my site). Here is an example of why: say someone sent a comment without being logged in. "process_comment.php" will process it and then send a header('Location: ' . $_GET['prev_page']); Of course I'm going to filter $_GET before sending it.
Should I use a session instead?
Thanks!
It is actually exactly the same. Both methods imply that the information is passed in the HTTP query, which can easily be forged. So you can't really trust one method more than the other.
That being said, as long as you don't rely on that information for anything really important, you can accept that the referrer can be trusted, because it's a little more complex to forge than a query-string parameter. At least for the average user.
The best solution, if you need to trust that information for something important, would be to store it on the server, as a session variable for instance. Each page would store its URL, after checking what the previous value was.
If you use $_SESSION, there will be trouble if the user has multiple windows/tabs open and does different things at once. There is nothing more annoying than only being able to have one window of a site open.
You could store the value in a SESSION variable and identify it by a short key. That key goes into the GET string. That way, you can keep your URLs clean, and you don't risk hitting the 1024 byte limit many servers have for GET parameters.
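A sketch of that idea, with hypothetical key and parameter names:

<?php
session_start();

// On the page the user came from: stash its URL under a short random key.
$key = substr(md5(uniqid('', true)), 0, 8);
$_SESSION['return'][$key] = $_SERVER['REQUEST_URI'];
// Then link to the form as e.g. comment.php?r=<key>

// In process_comment.php: resolve the key back to a URL, with a safe fallback.
$target = '/';
if (isset($_GET['r'], $_SESSION['return'][$_GET['r']])) {
    $target = $_SESSION['return'][$_GET['r']];
    unset($_SESSION['return'][$_GET['r']]);  // one-time use
}
header('Location: ' . $target);
exit;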
Well, the HTTP_REFERER can be stripped out by some clients. I seem to remember some Norton Internet Security products did that, and probably others do too. So it is going to be more reliable for you to store the previous page in a session and use that for redirecting.
If you can use it, a session is the safer option. Sending the user back based on GET parameters or even headers allows crafty people to abuse any flaws in your code to do nasty things.
The header itself may also be removed by some firewall software.
I don't think there is a problem with using GET in this case. You can't always depend on being able to retrieve the referrer from the browser.