I am trying to access JSON from this location which is not available to the public (unless you are within the company firewall),
http://12.34.56.789:8983/app/collection/select?q=*%3A*&wt=json&indent=true
My application is located on this web server,
http://www.mywebapp.com
I know that running an AJAX call to a different domain is out of the question, so I was wondering what techniques I could apply to get that data?
JSONP is not an option as I do not have control to attach a callback to the data that is located on that private server.
Thoughts?
Use file_get_contents(). It reads the raw data and returns it as a string.
Write a proxy script and put it on your domain. All it has to do is fetch the data and return it to you. Your domain will be the same, AJAX will work, and no one can see where you get the data from. Of course, it's slower than a direct request.
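A minimal sketch of such a proxy script in PHP, assuming allow_url_fopen is enabled and using the Solr-style URL from the question as a placeholder:

<?php
// proxy.php - lives on www.mywebapp.com; the page's AJAX call targets this
// script instead of the firewalled host.
$url = 'http://12.34.56.789:8983/app/collection/select?q=*%3A*&wt=json&indent=true';

$json = file_get_contents($url);          // fetch the raw JSON as a string
if ($json === false) {
    http_response_code(502);              // upstream request failed
    exit;
}

header('Content-Type: application/json'); // hand the data back to the browser
echo $json;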
So to give an official answer to the comment: JSONP is the thing you are looking for, but it's not without its downsides. You can find a good, short tutorial on it here:
JSON versus JSONP Tutorial (sorry, I don't care to re-write it).
I want to fetch the response from Adobe Analytics which I see in the network panel of the browser and display it in the page. Is there any way to read the response through PHP? I don't have access to the core files; I work as part of a third-party implementation team.
The short answer is you can't. PHP is server-side. Assuming you did the standard JavaScript Adobe Analytics implementation, well, that's client-side. Your server can't see that stuff.
The longer answer is redeploying Adobe Analytics through a server-side implementation. Basically you set up the code but have it point to your server, and then you forward it to Adobe (proxying), so now it is exposed to you server-side.
The alternative answer is, depending on what it is you are actually trying to accomplish, you can make use of Adobe Analytics' s.registerPostTrackCallback function. Basically it lets you register a callback function to be called after every s.t or s.tl call, and it gives you the full/final request URL sent to the AA collection server. You can then make an AJAX request to pass it to your server and do whatever you want with it. Or, since you mentioned displaying it on the page, maybe consider using JavaScript to render it on the page? But if you're looking for actual response stuff (headers, content), well, you're out of luck with this option.
The other other alternative answer is... this almost sounds like you are looking to make some kind of browser plugin? If so, then at the plugin/extension level, the request/response stuff (including headers) is exposed. But again, ultimately this is really a client-side solution.
But first step back and more clearly define what it is you are trying to do. Or if you've done that already, then try (more clearly) to convey that here.
I've run into a situation where one of my company's clients is building a website with our service, but would like to include on our site the podcasts that get posted into a table dynamically generated on a page of their main business site.
I've done a bit with ajax before, I know one of the biggest hurdles is using ajax to access content on a site hosted on a different server. From my research I gather that JSONP is the best solution in a situation like this, but for argument's sake let's say I know nothing of how their server is configured (and have no realistic way to find out) and that I don't know much about JSON (which is true).
I probably shouldn't hope for a silver bullet in a situation like this, but can someone point me at least in the right direction?
Thanks!
Create your own service with PHP that calls the remote service; that way you can call any remote service you want, but the AJAX call is to your own domain. I can provide an example if you like.
You can use curl in situations like this.
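For instance, a rough sketch of such a same-domain proxy endpoint using PHP's cURL extension; the remote URL here is only a placeholder for the page holding the podcast data:

<?php
// curl_proxy.php - your AJAX call targets this script on your own domain.
$remote = 'http://www.example-business-site.com/podcasts.json'; // placeholder

$ch = curl_init($remote);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects if any
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever

$body = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($body === false) {
    http_response_code(502);
    exit;
}

http_response_code($status);
header('Content-Type: application/json');
echo $body;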
If you can use jQuery, have a look at jQuery AJAX cross domain; otherwise, throw one of the following header functions into the script that serves the request and see if this helps.
header('Access-Control-Allow-Origin: *');
header('Access-Control-Allow-Origin: http://permitted_domain.com');
This is something the client browser has to support, so your mileage may vary.
For a non-AJAX/JavaScript solution, URL-fetching mechanisms like file_get_contents() (note this configuration) or cURL can be used to achieve similar (if not more in-line) results.
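If you happen to control the server that holds the data, a rough sketch of the header-based (CORS) approach on that side might look like the following; the permitted origin and the payload are placeholders:

<?php
// cors_endpoint.php - serves the data and tells browsers which origin may read it.
header('Access-Control-Allow-Origin: http://permitted_domain.com');
header('Access-Control-Allow-Methods: GET, OPTIONS');
header('Access-Control-Allow-Headers: Content-Type');

if ($_SERVER['REQUEST_METHOD'] === 'OPTIONS') {
    exit; // answer the preflight request with the headers alone
}

header('Content-Type: application/json');
echo json_encode(array('podcasts' => array())); // placeholder payload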
Is it wrong to use POST for everything in REST? I know POST is for updating and creating, but what if I use it for SELECTs and DELETEs as well? What I basically need is a web service that can do CRUD operations against a database. I'm using PHP. Also, what if I use query strings to do the POST request instead of using JSON or XML?
If you're using POST for all operations, you can't really call it REST anymore.
So it's not wrong, but it's not REST either :)
Yes, it's wrong. If you are using POST for anything other than updating you aren't actually following the REST convention.
You can eat soup with a fork, but why? Anyway, you can use any method, just don't call it REST.
Is it wrong to use POST for everything in REST?
You can't use POST for everything in REST (it applies only to requests, not responses), so the question of wrong or right isn't really valid.
I know POST is for updating and creating but what if I use it for SELECTS and DELETS as well?
It normally just works.
What I basically need is a web service that can do CRUD operations to a database.
That's fine.
Im using PHP.
That's fine, too. PHP has support for the HEAD, GET, and POST methods. For the PUT and DELETE methods you need to do a little bit more coding.
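For illustration, a rough sketch of routing all four verbs in plain PHP: GET and POST arrive via the superglobals, while PUT (and sometimes DELETE) bodies have to be read from php://input yourself. The comments mark where your own CRUD code would go:

<?php
switch ($_SERVER['REQUEST_METHOD']) {
    case 'GET':
        $id = isset($_GET['id']) ? $_GET['id'] : null;
        // ... fetch and echo the record ...
        break;
    case 'POST':
        // ... create a record from $_POST ...
        break;
    case 'PUT':
        parse_str(file_get_contents('php://input'), $put); // manual body parsing
        // ... update the record using $put ...
        break;
    case 'DELETE':
        // ... delete the record identified by the query string ...
        break;
    default:
        http_response_code(405); // method not allowed
}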
Also, what if I use query strings to do the POST request instead of using JSON or XML?
You use query strings instead of JSON/XML request bodies. REST does not specify the protocol, so you can do whatever pleases you. Using query strings might be a good idea with PHP, as PHP supports them out of the box. But it depends on what you use for a handler or what you want to support.
As your questions are very broad, you might want to read some basic description of REST first before so you can more specifically ask: A Brief Introduction to REST.
It's wrong to use POST for calls that are meant to be repeatable.
POST calls are never cached, but your select calls should be cacheable (and repeatable), so you shouldn't be using POST for them.
But there's no technical reason why you can't force everything through POST (particularly as DELETE isn't supported across all browsers), but it does mean that caching won't work for any of your calls, and also that your users may not be able to refresh a web page without being asked to confirm a page reload if POST calls were made to create it.
At this REST tutorial here the 4th article in the series says:
For creation, updating, and deleting data, use POST requests. (POST can also be used for read-only queries, as noted above, when complex parameters are required.)
Of course GET is usually used for read-only queries, so putting it all together: if complexity is the default assumption (which is a handy assumption when no threshold between complexity and simplicity is stated), then this author, who is enough of an authority to write a whole series on REST, is saying that it's okay to POST everything. Maybe caches make some assumptions about responses to the various HTTP verbs that might be beneficial, but if your application does not benefit from caching (indeed, it might be harmed by it, so disable it or otherwise make sure your application does not suffer from it), then I don't know if there's any problem at all.
Is it possible for a web page using JavaScript to get data from another website? In my case I want to get it for calculations and graphing a chart, but I'm not sure if this is possible due to security concerns. If it is considered a no-no but there is a workaround, I would appreciate being told the workaround. I don't want to have to gather this information on the server side if possible.
Any and all help is appreciated.
Learn about the JSONP format and cross-site requests (http://en.wikipedia.org/wiki/JSON#JSONP).
You may need to use a "PHP proxy" script on your server side which gets the information from the other websites and provides it to your JavaScript.
The only reliable way is to let "your" web server act as a proxy. In PHP you can use cURL to fire an HTTP request to an external site and then just echo the response.
You can't pull data from another server due to the same origin policy. You can do some tricks to get around it, such as putting the URL in a <script> tag, but in your case it wouldn't work for just parsing HTML.
Use simple_html_dom to parse your data server-side; it is much easier than doing it in JavaScript anyway.
A simple way you might be able to do this is to use an inline iframe. If the web page you are getting the data from has no headers, or you can isolate the data being pulled in (to say an image or SWF), this might work.
Cross-domain JavaScript requests used to be impossible; using a (PHP) proxy was a workaround for that.
JSONP changes this entirely: it allows you to request JavaScript from another server (if it has an API that supports JSONP; a lot of the bigger web players like Google, Twitter, Yahoo, ... do), specifying the callback function in your code that needs to be triggered to act on the response.
The response in JavaScript will contain:
a call to the callback function you defined
the actual payload as a JavaScript object.
Frameworks like jQuery offer easy support for JSONP out of the box.
Once you have the raw data you could tie into Google Chart Tools to create graphs on the fly and insert them in your web app.
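For reference, the serving end of a JSONP exchange is roughly this simple; a hedged PHP sketch with placeholder data (the callback parameter name is the common convention, not a fixed rule):

<?php
// jsonp_endpoint.php - wraps the JSON payload in whatever callback the client asked for.
$callback = isset($_GET['callback']) ? $_GET['callback'] : 'callback';
$callback = preg_replace('/[^A-Za-z0-9_.]/', '', $callback); // crude whitelist

$payload = json_encode(array('temperature' => 21, 'unit' => 'C')); // placeholder data

header('Content-Type: application/javascript');
echo $callback . '(' . $payload . ');';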
Also worth considering is support for XMLHttpRequest Access Control, which is supported in some modern browsers.
If the service provider you are trying to access via a web page has this set up, it is a very simple call to XMLHttpRequest and you will get access to the resources on that site without the need for JSONP (especially useful for requests that are not GET, e.g. POST, HEAD, etc.).
I have a web application that pulls data from my newly created JSON API.
My static HTML pages dynamically calls the JSON API via JavaScript from the static HTML page.
How do I restrict access to my JSON API so that only I (my website) can call from it?
In case it helps, my API is something like: http://example.com/json/?var1=x&var2=y&var3=z... which generates the appropriate JSON based on the query.
I'm using PHP to generate my JSON results ... can restricting access to the JSON API be as simple as checking the $_SERVER['HTTP_REFERER'] to ensure that the API is only being called from my domain and not a remote user?
I think you might be misunderstanding the part where the JSON request is initiated from the user's browser rather than from your own server. The static HTML page is delivered to the user's browser, then it turns around and executes the Javascript code on the page. This code opens a new connection back to your server to obtain the JSON data. From your PHP script's point of view, the JSON request comes from somewhere in the outside world.
Given the above mechanism, there isn't much you can do to prevent anybody from calling the JSON API outside the context of your HTML page.
The usual method for restricting access to your domain is to prepend the content with something that runs infinitely.
For example:
while(1);{"json": "here"} // google uses this method
for (;;);{"json": "here"} // facebook uses this method
So when you fetch this via XMLHttpRequest or any other method that is restricted solely to your domain, you know that you need to strip out the infinite loop. But if it is fetched via a script node:
<script src="http://some.server/secret_api?..."></script>
It will fail because the script will never get beyond the first statement.
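A hedged PHP sketch of the serving side of this idea, with a placeholder payload:

<?php
// The JSON is only usable by same-origin code that strips the prefix before parsing;
// a foreign <script> tag just hangs on the loop.
$data = array('json' => 'here'); // placeholder payload

header('Content-Type: application/json');
echo 'while(1);' . json_encode($data);

Your own same-origin JavaScript then slices the while(1); prefix off the response text before handing it to the JSON parser.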
In my opinion, you can't restrict the access, only make it harder. It's a bit like access restriction by obscurity. Referrers can be easily forged, and even with a short-lived key a script can get the responses by constantly refreshing the key.
So, what can we do?
Identify the weakness here:
http://www.example.com/json/getUserInfo.php?id=443
The attacker can now easily request all user info from 1 to 1,000,000 in a loop. The weak point of auto_increment IDs is their linearity and that they're easy to guess.
Solution: use non-numeric unique identifiers for your data.
http://www.example.com/json/getUserInfo.php?userid=XijjP4ow
You can't loop over those. True, you can still parse the HTML pages for all kinds of keys, but this type of attack is a different (and more easily avoidable) problem.
Downside: of course you can't use this method to restrict queries that aren't key-dependent, e.g. search.
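A minimal sketch of generating such identifiers in PHP (assuming PHP 7's random_bytes(); the function name and length are arbitrary choices):

<?php
// Store the public id alongside the auto_increment primary key and expose only
// the public id in URLs like getUserInfo.php?userid=...
function makePublicId($bytes = 8)
{
    return bin2hex(random_bytes($bytes)); // e.g. "9f1c2ab7d03e44aa"
}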
Any solution here is going to be imperfect if your static pages that use the API need to be on the public Internet. Since you need to be able to have the client's browser send the request and have it be honored, it's possible for just about anyone to see exactly how you are forming that URL.
You can have the app behind your API check the HTTP referrer, but that is easy to fake if somebody wants to.
If it's not a requirement for the pages to be static, you could try something where you have a short-lived "key" generated by the API and included in the HTML response of the first page which gets passed along as a parameter back to the API. This would add overhead to your API though as you would have to have the server on that end maintain a list of "keys" that are valid, how long they are valid for, etc.
So, you can take some steps which won't cost a lot but aren't hard to get around if someone really wants to, or you can spend more time to make it a tiny bit harder, but there is no perfect way to do this if your API has to be publicly accessible.
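If you do go the short-lived key route, one way to avoid keeping a server-side list of valid keys is to sign a timestamp instead; this is only a sketch, and the secret, lifetime, and parameter name are assumptions:

<?php
$secret = 'replace-with-a-long-random-secret';

// When rendering the page: embed this value and pass it back as ?key=... on API calls.
$issued = time();
$key    = $issued . '.' . hash_hmac('sha256', (string) $issued, $secret);

// When handling an API request: check the signature and the age of the key.
function keyIsValid($key, $secret, $maxAge = 300)
{
    $parts = explode('.', $key, 2);
    if (count($parts) !== 2) {
        return false;
    }
    list($issued, $sig) = $parts;
    $expected = hash_hmac('sha256', $issued, $secret);
    return hash_equals($expected, $sig) && (time() - (int) $issued) <= $maxAge;
}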
The short answer is: anyone who can access the pages of your website will also be able to access your API.
You can attempt to make using your API more difficult by encrypting it in various ways, but since you'll have to include JavaScript code for decrypting the output of your API, you're just going to be setting yourself up for an arms race with anyone who decides they want to use your API through other means. Even if you use short-lived keys, a determined "attacker" could always just scrape your HTML (along with the current key) just before using the API.
If all you want to do is prevent other websites from using your API on their web pages, then you could check the Referer header, but keep in mind that not all browsers send it (and some proxies strip it too!). This means you'd want to allow all requests missing a referrer, so this would only give you partial protection. Also, referrers can be easily forged, so if some other website really wants to use your API they can always just spoof a browser and access your API from their servers.
Are you using, or can you use, cookie-based authentication? My experience is based on ASP.NET forms authentication, but the same approach should be viable in PHP with a little code.
The basic idea is: when the user authenticates through the web app, a cookie with an encrypted value is returned to the client browser. The JSON API would then use that cookie to validate the identity of the caller.
This approach obviously requires the use of cookies, so that may or may not be a problem for you.
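A minimal sketch of what that check might look like in a PHP JSON endpoint, assuming a session cookie set at login (the session key name is an assumption):

<?php
session_start(); // the browser sends the session cookie along with the JSON request

if (empty($_SESSION['user_id'])) {
    http_response_code(401);
    echo json_encode(array('error' => 'not authenticated'));
    exit;
}

header('Content-Type: application/json');
echo json_encode(array('data' => 'only for logged-in users')); // placeholder payload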
Sorry, maybe I'm wrong but... can it be made using HTTPS?
You can (?) have your API accessible via https://example.com/json/?var1=x&var2=y, thus only authenticated consumers can get your data...
Sorry, there's no DRM on the web :-)
You cannot treat HTML as a trusted client. It's a plain-text script interpreted on other people's computers as they see fit. Whatever you allow your "own" JavaScript code to do, you allow anyone to do. You can't even define how long it's "yours" with Greasemonkey and Firebug in the wild.
You must duplicate all access control and business logic restrictions in the server as if none of it were present in your JavaScript client.
Include the service in your SSO, restrict the URLs each user has access to, design the service keeping wget as the client in mind, not your well behaved JavaScript code.