Javascript Cross Domain Authentication - php

I have a JavaScript file that can be included externally using <script type="text/javascript" src="http://mydomain.com/myscript.js"></script>. The script is generated dynamically with PHP, but I need to know where it is being called from (which domain). The only way I can think of is using $_SERVER["HTTP_REFERER"], but not every browser sends that header, and it is insecure because it can be changed.
Does anyone know a better way I could do it?

First of all, anything the browser provides cannot be trusted, and that includes the HTTP Referer header.
However, I don't agree that this makes your approach insecure; what exactly are you doing with this information? All the server can do is trust what the browser supplies, so if you are attempting to restrict this JavaScript you are going to have to authenticate the user first (so you can plant a cookie).
So what exactly are your intentions?

Here is my idea.
Use a PHP file to render the JS file contents, and only serve the JavaScript when the session ID matches. Hide your real JS file too.
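A minimal sketch of that idea, assuming a hypothetical session flag called 'authenticated' that your own login code sets, and a private/ directory outside the web root (both names are illustrative):

```php
<?php
// myscript.php -- include this instead of a static .js file:
//   <script type="text/javascript" src="http://mydomain.com/myscript.php"></script>
session_start();
header('Content-Type: application/javascript');

// 'authenticated' is a hypothetical flag set elsewhere by your login code.
if (empty($_SESSION['authenticated'])) {
    echo '// access denied';   // no valid session: serve nothing useful
    exit;
}

// Valid session: emit the real script, which is kept outside the web root.
readfile(__DIR__ . '/private/myscript.js');
```

Note that the session cookie must already exist for mydomain.com in the visitor's browser, so this restricts the script to users you have authenticated; it still cannot tell you which domain embedded the tag.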

Related

Get contents of DOM via PHP

I need to get the contents of a website through PHP; however, the content is only available when JavaScript is enabled. The workaround I am using now is an AppleScript that opens the website in Safari, selects all of the page content, copies it to the clipboard, and pastes it.
That will be really hard to achieve, I guess. If you look at the JS on that page that is responsible for getting the content ready, you may discover it's just another AJAX call, one that you could make directly from your PHP script.
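For example, if the page turns out to populate itself from a JSON endpoint, you can often call that endpoint straight from PHP. A rough sketch, where the endpoint URL and parameters are entirely hypothetical:

```php
<?php
// Hypothetical endpoint discovered by watching the page's network traffic
// in the browser's developer tools.
$url = 'http://example.com/ajax/load-content?page=1';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Some endpoints only answer requests that look like XHR calls.
curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-Requested-With: XMLHttpRequest'));
$response = curl_exec($ch);
curl_close($ch);

// If the endpoint returns JSON, decode it; otherwise treat it as HTML.
$data = json_decode($response, true);
var_dump($data !== null ? $data : $response);
```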
Best possible solution: ask the website owner for API/export access ;)
If that is not possible, you can only hope to analyze the requests that are initiated via JavaScript and imitate them
(possible tools: Firefox with the Firebug or Tamper Data plugin).
Warning: the owner of the website might not like this approach; in fact, automatically scraping the data may be disallowed.
What do you mean by:
the content is only available when JavaScript is enabled
Does the page pull data from somewhere via JS? Would it be easier to analyse where the data is coming from and access that place directly from PHP?

document.referrer - limitations?

I am unable to get a lot of referrer URLs using document.referrer. I'm not sure what is going on. I would appreciate it if anyone had any info on its limitations (like which browser does not support what), etc.
Is there something else I could use (in a different language perhaps) that covers more browsers, etc.?
I wouldn't put any faith in document.referrer in your JavaScript code. The value is sent in a client-side request header (Referer), and as such it can be spoofed and manipulated.
For more info see my answer to this question about the server side HTTP_REFERER server variable:
How reliable is HTTP_REFERER
Which browser are you looking in? If the referring website is sending the traffic via window.open('some link') instead of a regular <a> tag, then IE will not see a referrer. It thinks it's a new request at that point, similar to you simply going to a URL directly (in which case there is no referrer). Firefox and Chrome do not have the same issue.
This is NOT just a JavaScript limitation; HTTP_REFERER will NOT work either in this specific scenario.
Just to make sure you're on the same page: you do know that if someone types a URL directly into their web browser, the document.referrer property is empty, right? That being said, you might be interested in a JavaScript method to get all HTTP headers. If you prefer PHP (since you're using that tag), the standard $_SERVER variable will provide whatever information is available. Note that the information is only as reliable as the reporting web browser and server, as noted by Kev.
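On the PHP side that boils down to something like the sketch below; treat the value as a hint only, since it may be missing or spoofed:

```php
<?php
// The Referer header is optional and entirely client-controlled, so check
// that it exists and never base security decisions on it.
$referrer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if ($referrer === '') {
    // Direct visit, bookmark, rel="noreferrer" link, local file, or a
    // browser/proxy that stripped the header.
    echo 'No referrer sent';
} else {
    echo 'Claimed referrer: ' . htmlspecialchars($referrer);
}
```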
The document.referrer will be an empty string if:
You access the site directly, by entering the URL;
You access the site by clicking on a bookmark;
The source link contains rel="noreferrer";
The source is a local file;
Check out https://developer.mozilla.org/en-US/docs/Web/API/Document/referrer

Fetching content from Website on another Server

What I basically want to do is get content from a website and load it into a div of another website. This should be no problem so far.
The problem is that the content to be fetched is located on a different server, and I have no source access to it.
I'd prefer a solution using JavaScript or jQuery.
Can I use an .htaccess redirect to fetch the content from a remote server with client-side (JS) techniques?
I will also go with other solutions, though.
Thanks a lot in advance!
You can't execute an AJAX call against a different domain, due to the same-origin policy. You can add a <script> tag to the DOM which points at a Javascript file on another domain. If this JS file contains some JSON data that you can use, you're all set.
The only problem is you need to get at the JSON data somehow, which is where JSON-P callbacks come into the picture. If the foreign resource supports JSON-P, it will give you something that looks like
your_callback({ /* JSON data */ });
You then specify your code in the callback.
See JSONP for more.
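For context, this is roughly what a JSON-P-aware endpoint does on its side; a PHP sketch, where the callback parameter name and the payload are purely illustrative:

```php
<?php
// jsonp.php -- wraps an ordinary JSON payload in the caller-supplied
// callback name, which is what makes it loadable via a <script> tag.
header('Content-Type: application/javascript');

$data = array('message' => 'hello from the other domain');

// Whitelist the callback name so arbitrary script cannot be injected.
$callback = isset($_GET['callback']) ? $_GET['callback'] : 'callback';
if (!preg_match('/^[A-Za-z_][A-Za-z0-9_]*$/', $callback)) {
    $callback = 'callback';
}

echo $callback . '(' . json_encode($data) . ');';
```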
If JSONP isn't an option, then the best bet is probably to fetch the data server-side, say with a cron job every few minutes, and store it locally on your own site.
You can use a server-side XMLHTTP request to grab your content from the other server. You can then parse it on your server (a.k.a. screen scraping) and serve up the portion you want along with your web page.
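A sketch of that server-side approach using PHP's cURL and DOM extensions; the URL and the id of the element being extracted are made up for illustration:

```php
<?php
// Fetch the remote page server-side (the same-origin policy does not apply here).
$ch = curl_init('http://other-server.example/page.html');   // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);

// Parse it and pull out just the fragment we want to show.
$doc = new DOMDocument();
libxml_use_internal_errors(true);   // tolerate sloppy real-world markup
$doc->loadHTML($html);
libxml_clear_errors();

$xpath = new DOMXPath($doc);
$nodes = $xpath->query("//div[@id='content']");   // hypothetical element id
if ($nodes->length > 0) {
    echo $doc->saveHTML($nodes->item(0));   // serve just that portion
}
```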
If the content from the other website is just an HTML doc that you want to display on your site, you could also use an iframe to pull it in. You won't have access to any of its content, though, because of browser security rules.
You will likely have to "scrape" the data you need and store it on your server.
This is a great tutorial on how to cache data from an external site. It is actually written to fetch and store XML, so it'll need some modification. Also, if your site doesn't allow file_get_contents, you may have to modify it to use cURL.
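The gist of that caching approach, adapted to plain HTML and with a cURL fallback for hosts where URL fopen wrappers are disabled; the URL, cache path, and lifetime are placeholders:

```php
<?php
$url       = 'http://external-site.example/page.html';   // placeholder URL
$cacheFile = __DIR__ . '/cache/external.html';            // must be writable
$maxAge    = 600;                                          // ten minutes

// Serve the cached copy while it is still fresh.
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
    echo file_get_contents($cacheFile);
    exit;
}

// Otherwise fetch a fresh copy, preferring file_get_contents()...
if (ini_get('allow_url_fopen')) {
    $content = file_get_contents($url);
} else {
    // ...and falling back to cURL where URL fopen wrappers are disabled.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $content = curl_exec($ch);
    curl_close($ch);
}

if ($content !== false) {
    file_put_contents($cacheFile, $content);
    echo $content;
} elseif (is_file($cacheFile)) {
    echo file_get_contents($cacheFile);   // a stale copy beats nothing
}
```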

Is there a way to get the full contents of the address bar in php or htaccess?

Specifically: I am making an AJAX app and trying to preserve the back button. My JavaScript is working properly and registering a new URL in the address bar with an anchor-like hash in the URL:
http://t2b.localhost/#/clients/
I can catch the URL when the page loads with JavaScript and load the "clients" page, but I want to know if there is a way to read the entire URL with PHP or with .htaccess. Looking at the normal variables, I only seem to be able to get the URL up to the occurrence of the "#" (http://t2b.localhost/).
The browser doesn't send the fragment part of the URL (the text after the #) to the server.
It is intended to be used locally by the client.
In Firefox (and in Internet Explorer too) there is document.location.hash, which contains the fragment part of the URL. If you use JavaScript you can read it and send its value to the server as an ordinary variable.
Please use one of the available JavaScript libraries to track the history state or navigate via AJAX requests. There are so many problems involved (such as certain browsers not notifying scripts when the hash part changes, or not adding a pseudo-'navigation' event to the browser's history list) that you'll end up recreating an expensive wheel that wouldn't work very well. I recommend YUI's History library, although it has problems on Google Chrome.
I'm pretty sure you can't read it with PHP, because the hash part is parsed only on the client side (JavaScript).
For history I'd recommend Ben Alman's BBQ plugin.
See: Can I read the hash portion of the URL on my server-side application (PHP, Ruby, Python, etc.)?
You could use JavaScript to set a cookie containing the current hash and then read it with PHP.
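A sketch of that cookie idea as a single PHP page that also emits the small piece of JavaScript; the cookie name url_hash is arbitrary:

```php
<?php
// Read the fragment that the snippet below stored on a previous request;
// the cookie name "url_hash" is just an example.
$hash = isset($_COOKIE['url_hash']) ? $_COOKIE['url_hash'] : '';
if ($hash !== '') {
    // e.g. "#/clients/" -- decide server-side what to render.
}
?>
<script type="text/javascript">
// Store the current fragment so the next PHP request (or an AJAX call)
// can see it as a cookie.
document.cookie = 'url_hash=' + encodeURIComponent(window.location.hash) + '; path=/';
</script>
```

Keep in mind the hash only reaches PHP on the next request, so for the initial page load you still need JavaScript to act on it.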

Best Practice: Legitimate Cross-Site Scripting

While cross-site scripting is generally regarded as negative, I've run into several situations where it's necessary.
I was recently working within the confines of a very limiting content management system. I needed to include database code within the page, but the hosting server didn't have anything usable available. I set up a couple bare-bones scripts on my own server, originally thinking that I could use AJAX to import the contents of my scripts directly into the template of the CMS (thus retaining dynamic images, menu items, CSS, etc.). I was wrong.
Due to the limitations of XMLHttpRequest objects, it's not possible to grab content from a different domain. So I thought iFrame - even though I'm not a fan of frames, I thought that I could create a frame that matched the width and height of the content so that it would appear native. Again, I was blocked by cross-site scripting "protections." While I could indeed load a remote file into the iFrame, I couldn't execute JavaScript to modify its size on either the host page or inside the loaded page.
In this particular scenario, I wasn't able to point a subdomain to my server. I also couldn't create a script on the CMS server that could proxy content from my server, so my last thought was to use a remote JavaScript.
A remote JavaScript works. It breaks when the user has JavaScript disabled, which is a downside; but it works. The "problem" I was having with using a remote JavaScript was that I had to use the JS function document.write() to output any content. Any output that isn't JS causes script errors. In addition to using document.write() for every line, you also have to ensure that the content is escaped - or else you end up with more script errors.
My solution was as follows:
My script received a GET parameter ("page"), looked for the file ({$page}.php), and read its contents into a variable. However, I had to use awkward output-buffering techniques in order to actually execute the included scripts (for things like database interaction), then strip the final content of all line-break characters (\n) and escape all required characters. The end result is that my original script (which outputs JavaScript) accesses seemingly "standard" scripts on my server and converts their standard output to JavaScript for display within the CMS template.
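A condensed sketch of that buffering approach; the file layout, the pages/ directory, and the whitelisting via basename() are simplifications for illustration:

```php
<?php
// js-proxy.php?page=foo  ->  emits the output of pages/foo.php as JavaScript.
header('Content-Type: application/javascript');

// Restrict the GET parameter so it cannot point at arbitrary files.
$page = isset($_GET['page']) ? basename($_GET['page']) : 'default';
$file = __DIR__ . '/pages/' . $page . '.php';
if (!is_file($file)) {
    exit;
}

// Run the included script and capture everything it prints.
ob_start();
include $file;
$content = ob_get_clean();

// Strip line breaks and escape quotes so the output is a legal JS string.
$content = str_replace(array("\r", "\n"), '', $content);
$content = addslashes($content);

echo "document.write('" . $content . "');";
```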
While this solution works, it seems like there may be a better way to accomplish the same thing. What is the best way to make cross-site scripting work specifically for the purpose of including content from a completely different domain?
You've got three choices:
Create a server-side proxy script (a sketch follows this list).
Create a remote script to read in the remote dynamic HTML. Use a library like jQuery to make this easier; you can use the load function to inject HTML where needed. EDIT: What I originally meant for option #2 was utilizing JSONP, which requires the server-side script to recognize the "callback=?" param.
Use a client-side Flash proxy and set up a crossdomain.xml file on your server's web root.
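For the first option, a minimal pass-through proxy might look like the sketch below; the whitelist of remote URLs is an assumption you would adapt to your own case:

```php
<?php
// proxy.php?url=...  -- fetches a whitelisted remote URL and relays it,
// so the browser only ever talks to your own domain.
$allowed = array(
    'http://other-domain.example/data.json',   // hypothetical remote resource
);

$url = isset($_GET['url']) ? $_GET['url'] : '';
if (!in_array($url, $allowed, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
$type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

if ($type) {
    header('Content-Type: ' . $type);   // relay the original content type
}
echo $body;
```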
Personally, I would call to that other domain on the server and get and parse the data there for use in your page. That way you avoid any problems and you get the power of a server-side language/platform for getting and parsing the data.
Not sure if that would work for your specific scenario...hard to know even with your verbose description...
You could try easyXDM; by including very little code, you can pass data or method calls between documents on different domains.
I've come across that YDN server side proxy script before. It says it's built to work with Yahoo's Search APIs.
Will it work with any domain, if you simply trim the Yahoo API code out? Or do you need to replace it with the domain you want it to work with?
The iframe's remote content can be accessed by local JavaScript, but only if Site B is a subdomain of Site A: document.domain can only be relaxed to a parent of the page's own domain.
The remote server just has to set document.domain in the page it serves (and the page on Site A should set the same value).
E.g.:
Site A contains an iframe with src='Site B/home.php'
home.php looks like this:
<?php ...php stuff... ?>
<script type='text/javascript'>document.domain = 'Site A';</script>
