jQuery getJSON() - What server is called?

When using PHP, I can use file_get_contents or cURL to fetch a URL.
jQuery runs on the client
In jQuery there is a function called jQuery.getJSON(). JavaScript runs on the client. What server is used to download the JSON from the external URL? What information does the called URL know about? Does it know the domain? The IP of the client user? It's a client-side language.
Preferred for many requests
To make many requests, is it safer to do this with JavaScript than with PHP, because it runs on every client instead of a single server?

What server is used to download the JSON from the external URL?
The one that the domain name in the URL passed to that function resolves to.
What information does the called URL know about?
It is an HTTP request, like any other. The usual information will be available.
Does it know the domain? The IP of the client user?
Of course.
It's a client-side language.
… making an HTTP request.
To make many requests, is it safer to do this with JavaScript than with PHP, because it runs on every client instead of a single server?
You control the server. You don't control the client. JavaScript can be disabled. It is safer to make the request from your server.
(For a value of "safe" equal to "less likely to fail, assuming the service you are using doesn't impose rate limiting".)

Because of the Same Origin Policy, all requests made in JavaScript must go to the domain from which the document was loaded (unless the remote server explicitly opts in via CORS). It's a standard HTTP request, so the server will have the same information it would have if a user were just navigating around (including cookies, etc.). From the phrasing of your question it appears you need to make requests to some external site, in which case making those requests from your server, which is not subject to such a security policy, would likely be best.
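To illustrate that last point, here is a minimal sketch of such a server-side proxy in PHP; the file name api-proxy.php and the external URL are assumptions. The page's JavaScript would then call $.getJSON('/api-proxy.php', ...) against its own origin.

    <?php
    // api-proxy.php - minimal same-origin proxy sketch (names are assumptions).
    // The browser asks this script on YOUR domain; the script fetches the
    // external JSON with cURL, so the external site only sees the server's IP.
    $ch = curl_init('https://api.example.com/data.json'); // assumed external URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // fail instead of hanging
    $json = curl_exec($ch);
    curl_close($ch);
    header('Content-Type: application/json');
    echo ($json !== false) ? $json : '{"error":"upstream request failed"}';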

In jQuery there is a function called jQuery.getJSON(). JavaScript runs on the client. What server is used to download the JSON from the external URL? What information does the called URL know about? Does it know the domain? The IP of the client user? It's a client-side language.
The code that runs your web browser is only on your PC, too, yet it is perfectly capable of retrieving content via the HTTP protocol from a web server, and has done so for several decades.
AJAX requests are no different. jQuery creates an XMLHttpRequest object that performs an HTTP request in a manner uncoupled from the general page context. As far as the server's concerned, it's just an HTTP request like any other.
The text contents of the result you get back happen to be written in JSON format, but the HTTP layer neither knows nor cares about that.
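To make "what information does the called URL know about" concrete, here is a sketch of what the called endpoint might log about any incoming request, AJAX or not (header availability depends on what the client actually sends):

    <?php
    // On the server the URL resolves to: an XMLHttpRequest carries the same
    // kind of metadata as any other HTTP request.
    $who = array(
        'client_ip' => $_SERVER['REMOTE_ADDR'], // the machine that opened the connection
        'referer'   => isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '(not sent)',
        'agent'     => isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '(not sent)',
    );
    error_log(json_encode($who));
    // For a browser's getJSON call, client_ip is the visitor's IP;
    // for a PHP cURL/file_get_contents call, it is your server's IP.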

Can I make a call to http://api.example.com from https://example.com?

I am using Django as the backend API and AJAX for making the API calls. My main site runs on HTTPS but the API on HTTP, and I am unable to make API calls from the site with the SSL cert loaded onto Nginx.
Is it possible to make AJAX calls from HTTPS to HTTP?
Any leads will be appreciated. Thanks in advance!
The only difference between HTTP and HTTPS is the SSL security part. If your server is able to handle HTTPS requests, they will be sent through to the API just like any other HTTP request; only the actual data communication from the client socket to the server socket is affected, and once the data is received it's back in plain text (or its original format) again.
Your browser will stop this, however, and/or show an insecure warning instead of the padlock symbol for your HTTPS connection.
HTTPS indicates the site is secure, which gives certain guarantees to the visitor - namely that the site is for the given domain (authentication), that it's not been intercepted and changed (integrity) and that no one else is able to listen in to your messages to and from the server (confidentiality).
When you add an insecure resource like an API call, those guarantees are no longer there, and so the browser will give an "insecure" warning, typically with a yellow warning padlock (instead of green) and/or a pop-up.
Browsers used to differentiate between inactive content (e.g. images), which was seen as less of a risk and so allowed, and active content (e.g. JavaScript), which was potentially dangerous and so not allowed; however, I don't think they do any more. Even if they did, Ajax XHR calls are definitely in the latter category.
The best option is to proxy the request through your main site's domain with Nginx (e.g. forward requests for https://example.com/api to your API using Nginx config), as sketched below.
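A minimal sketch of that Nginx idea (the /api/ path and the backend address are assumptions):

    # Inside the HTTPS server block for example.com:
    location /api/ {
        # Forward /api/... to the HTTP-only backend; the browser only ever
        # talks HTTPS to example.com, so there is no mixed content.
        proxy_pass http://127.0.0.1:8000/;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }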

PHP - Protect RESTful API requests

I use a JSON API to get data for a website. I am aware of various methods that I could make it secure, but my situation is different from common methods.
Because of cross-domain issues, I had to create an API folder with various PHP files that make cURL requests to the RESTful API. I then request these local PHP files through AJAX on my site. The next release should use JSONP to avoid this issue.
Many of these JSON requests contain sensitive information, so the first thing I did was check the HTTP referrer so people can't just grab the URL when inspecting the JavaScript code and run it in their own browser. This is obviously not safe, nor should I rely on it.
Any data I may try to post with the request will be sent through JavaScript, so something like an API key or token would be visible and would defeat the whole purpose.
Is there a way I can prevent these PHP files from being run from outside the website? Basically, make them inaccessible to visitors?
This does not really have anything to do with REST. You have a server-side REST client: you call the REST service with cURL, and the browser cannot see anything of this process. Unless you want to build your own REST service for this AJAX client, this is just a regular web application (from the perspective of the browser and the AJAX client, of course). As Lorenz said in the comment, you should use sessions as you normally would. That's all. If you want to restrict access to certain pages, you can use an access control solution; role-based access control, for example, is very common.
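As a sketch, each of those local PHP proxy files could start with a session check along these lines (the session key name is an assumption):

    <?php
    // Top of every local API/proxy script: refuse callers who do not
    // already have an authenticated session on this site.
    session_start();
    if (empty($_SESSION['user_id'])) {  // assumed session key set at login
        http_response_code(403);
        exit('Forbidden');
    }
    // ...proceed with the cURL request to the real REST API...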

How to ensure the HTTP_REQUEST is coming from the right place?

I have learned that HTTP_REFERER, or any HTTP request header, can be faked and is not reliable.
REMOTE_ADDR is reliable though.
So, how can I ensure the incoming HTTP_REQUEST call is coming from a website that I whitelist?
For example, I have JS code that will send from the client site to the server (something like a sniper, cross-platform). However, I only want to allow this from several websites, not others, so that even if other people copy the code and put it onto their website, it won't work.
In the general case you simply can't do it. You are entirely at the mercy of the client. You can make it more difficult by checking the referrer, but you can't make it impossible.
The only way to do this reliably is to have all those several websites generate unique tokens for every user, similar to how you protect yourself from CSRF attacks. The tokens would then be sent along with the request by your script, and your server would need a way to check each token for authenticity against the other websites. Needless to say, this is very likely impossible unless you control all the sites.
See also this question on HTTP_REFERER
I haven't used this in practice, so there might be practicality issues I wasn't counting on, but I thought I'd contribute the idea anyway. If I interpret it correctly, this is similar to (if not the same as) the idea @Seldaek posted.
1. Your server generates a unique ID for each page-serve and embeds the ID in the page.
2. The server stores the ID and the client's IP address.
3. The JS on the client places the ID in its request to the server and sends the request.
4. When the server receives the JS request from the client, it only responds if the IP/ID pair matches one that is on file (see #2).
5. After some specified time (and/or when the browser session ends), the ID/IP entries expire.
This could perhaps be faked if a person sharing the visitor's IP address (perhaps both are behind the same NAT box) hijacks another visitor's session in real-time, but it will at least prevent someone from making another web page which piggybacks on your server's service.
There could also be issues if, for some reason, your visitor's IP address changes between when the page was served and when the js request was sent.
Basically, your server is saying "I will not service your js request unless you possess the data from a page I recently served and you are coming from (to the best of my knowledge) the place to which I served that page."
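A rough PHP sketch of steps 1-5 (names are assumptions; a real implementation would use a shared store such as a database, here the session is used for brevity):

    <?php
    session_start();

    // Steps 1-2: when serving the page, create an ID and file it with the IP.
    $id = bin2hex(random_bytes(16));        // unguessable per-page-load ID
    $_SESSION['ids'][$id] = array(
        'ip'      => $_SERVER['REMOTE_ADDR'],
        'expires' => time() + 600,          // step 5: expire after 10 minutes
    );
    // ...embed $id in the served page for the JS to send back (step 3)...

    // Step 4: when handling the JS request, respond only on an IP/ID match.
    $sent  = isset($_GET['id']) ? $_GET['id'] : '';
    $entry = isset($_SESSION['ids'][$sent]) ? $_SESSION['ids'][$sent] : null;
    if (!$entry || $entry['ip'] !== $_SERVER['REMOTE_ADDR'] || $entry['expires'] < time()) {
        http_response_code(403);
        exit;
    }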
All HTTP headers can be faked.
If you are just accepting communication from the remote server (and not having a client browser be redirected to your server), then you can either set up a VPN between that remote server and yours, or change your firewall config to only allow communication from a specific set of IP addresses. However, even the latter can be faked by people willing to go that far.
If the client browser is the one either being redirected to your server or loading the file(s) from your server then there is absolutely nothing you can do.
As @Billy says, this simply isn't possible; you're thinking about the internet's request-response mechanism incorrectly.
For example, I have JS code that will send from the client site to the server (something like a sniper, cross-platform).
I assume what you're saying is that you have some JavaScript code served up on some website on your whitelist which redirects the user to your website, and it's on your website that you want to check that the user came from the whitelisted site?
Aside from setting a cookie (which might not be possible cross-domain), you might find it tough. Have you taken a look at OpenID? If you can post more details, a solution may be more obvious.
So, how can I ensure the incoming HTTP_REQUEST call is coming from a website that I whitelist?
I think you could sign every request (from the whitelist) with a token that is valid for that request only (used once). I assume using uniqid for this is safe (enough?).
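Worth noting that uniqid() is not designed to be unguessable; a sketch of the same signing idea with an HMAC instead (the shared secret and parameter names are assumptions):

    <?php
    // On the whitelisted site: sign a one-time value with a secret that
    // only it and your server know (exchanged out of band).
    $secret = 'per-site-shared-secret';           // assumed out-of-band exchange
    $nonce  = bin2hex(random_bytes(16));          // valid for one request only
    $sig    = hash_hmac('sha256', $nonce, $secret);
    // The JS sends ?nonce=...&sig=... ; your server recomputes the HMAC,
    // compares with hash_equals(), and stores the nonce to reject replays.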

php cURL or file_get_content affect on google analytics

I'm wondering what effect loading an external page with PHP has on a site's analytics. If PHP is loading an external page, and not an actual browser, will the JavaScript that reports back to Google Analytics register the page load as a hit?
Any JavaScript within the fetched page will not be run and therefore have no effect on analytics. The reason for this is that the fetched HTML page is never parsed in an actual browser, therefore, no JavaScript is executed.
Curl will not automatically download JavaScript files the HTML refers to. So unless you explicitly download the Google Analytics JavaScript file, Google won't detect the Curl hit.
Google offers a non-JavaScript method of tracking hits. It's intended for mobile sites, but may be repurposable for your needs.
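If you actually want the server-side fetch to count as a hit, one way (an assumption on my part, using the Measurement Protocol, which may not be the exact mobile method meant above) is to report it yourself:

    <?php
    // Sketch: post a server-side "pageview" to Google Analytics via the
    // Measurement Protocol. The tracking ID is a placeholder.
    $params = http_build_query(array(
        'v'   => 1,                        // protocol version
        'tid' => 'UA-XXXXXXX-Y',           // your property's tracking ID (placeholder)
        'cid' => bin2hex(random_bytes(8)), // anonymous client ID
        't'   => 'pageview',
        'dp'  => '/fetched-by-curl',       // page path to record
    ));
    file_get_contents('https://www.google-analytics.com/collect?' . $params);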
You're misunderstanding how curl/file_get_contents work. They're executed on the server, not in the client browser. As far as Google and any regular user are concerned, they'll see the output of those calls, not the calls themselves.
e.g.
client requests page from server A
server A requests page from server B
server B replies with page data to server A
server A accepts page data from server B
server A sends page data to client
Assuming that all the requests work properly, don't issue any warnings/errors, and there are no network glitches between server A and server B, there is absolutely no way for the client to see exactly what server A is doing. It could be sending a local file. It could be executing a local script and sending its output. It could be offshoring the request to a server in India which does the hard work and then simply claiming the credit for it, etc...
Now, you CAN get the client to talk to server B directly. You could have server A spit out an HTML page that contains an iframe, image tag, script tag, css file, etc... that points to server B. But that's no longer transparent to the client - you're explicitly telling the client "hey, go over there for this content".

How to fetch a URL with JavaScript/jQuery?

I need to fetch a URL with JavaScript/jQuery, not PHP.
I've read that you could do that if you have a PHP proxy, but that means it is still going through PHP, so it's still the IP of the server that is fetching it.
Could one fetch the URL entirely with only the front end, and thus fetch it with the client's IP?
There exists a Same origin policy for AJAX requests. This prevents Javascript on, say, this site, from making a request to gmail.com (with your cookies), reading your e-mails, and uploading them to the StackOverflow server. Javascript on stackoverflow.com can only make AJAX requests to pages on that domain.
As you can see, this is essential for security. Requests must instead be made by a proxy running on your web server - PHP can be used, but there are other solutions. For example, Ajax Cross Domain is an AJAX library that communicates with a Perl script running on the server to emulate AJAX requests for other domains.
It is also possible to make requests on other domains via a javascript include (script tag), image tag, etc. but in these cases you cannot read the contents of the page.
You cannot do this with an iframe either: scripts cannot see the internals of iframes unless they are on the same domain as the script.
So in short, use a proxy.
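The script-tag route mentioned above is what JSONP formalises. For illustration, the server side of a JSONP endpoint could look like this in PHP (the file name and upstream URL are assumptions); a page on any domain could then load it with something like $.getJSON('http://your-server/jsonp-proxy.php?callback=?', ...):

    <?php
    // jsonp-proxy.php - wraps fetched JSON in the caller-supplied function
    // name so a cross-domain <script src="..."> include can receive it.
    $cb   = isset($_GET['callback']) ? $_GET['callback'] : 'callback';
    $cb   = preg_replace('/[^A-Za-z0-9_]/', '', $cb);   // keep it a safe identifier
    $json = file_get_contents('http://api.example.com/data.json'); // assumed upstream
    header('Content-Type: application/javascript');
    echo $cb . '(' . $json . ');';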
The problem is that jQuery would fetch the URL with AJAX, and AJAX won't operate cross-domain because of the potential security issues (as per the same-origin policy).
There are, however, ways to emulate this: if you load the page in an iframe you can retrieve the data by using innerHTML on the iframe. Here's an example script that uses jQuery: http://code.google.com/p/jquery-crossframe/
