Best Practice: Legitimate Cross-Site Scripting - php

While cross-site scripting is generally regarded as a security problem, I've run into several situations where it's necessary.
I was recently working within the confines of a very limiting content management system. I needed to include database code within the page, but the hosting server didn't have anything usable available. I set up a couple of bare-bones scripts on my own server, originally thinking that I could use AJAX to import the contents of my scripts directly into the template of the CMS (thus retaining dynamic images, menu items, CSS, etc.). I was wrong.
Due to the browser's same-origin policy, an XMLHttpRequest can't grab content from a different domain. So I turned to iframes - even though I'm not a fan of frames, I figured I could create a frame matching the width and height of the content so that it would appear native. Again, I was blocked by cross-site scripting "protections": while I could indeed load a remote file into the iframe, I couldn't execute JavaScript to modify its size from either the host page or from inside the loaded page.
In this particular scenario, I wasn't able to point a subdomain to my server. I also couldn't create a script on the CMS server that could proxy content from my server, so my last thought was to use a remote JavaScript.
A remote JavaScript works. It breaks when the user has JavaScript disabled, which is a downside; but it works. The "problem" I was having with using a remote JavaScript was that I had to use the JS function document.write() to output any content. Any output that isn't JS causes script errors. In addition to using document.write() for every line, you also have to ensure that the content is escaped - or else you end up with more script errors.
My solution was as follows:
My script received a GET parameter ("page"), looked for the corresponding file ({$page}.php), and read its contents into a variable. However, I had to use awkward output-buffering techniques in order to actually execute the included scripts (for things like database interaction), then strip the final content of all line-break characters (\n) and escape all required characters. The end result is that my original script (which outputs JavaScript) accesses seemingly "standard" scripts on my server and converts their standard output to JavaScript for display within the CMS template.
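A minimal sketch of that approach (file and parameter names here are placeholders, not my actual code):

<?php
// remote.php - converts a normal PHP page's output into a document.write() call.

header('Content-Type: text/javascript');

// Whitelist the "page" parameter so arbitrary files can't be included.
$allowed = array('news', 'events');
$page    = isset($_GET['page']) ? $_GET['page'] : '';
if (!in_array($page, $allowed, true)) {
    exit('/* unknown page */');
}

// Buffer the included script so its normal output (database results, etc.)
// is captured instead of being sent straight to the client.
ob_start();
include $page . '.php';
$html = ob_get_clean();

// Strip line breaks and escape quotes/backslashes so the string is valid JS.
$html = str_replace(array("\r", "\n"), '', $html);
echo "document.write('" . addslashes($html) . "');";

The CMS template then pulls it in with <script src="http://your-server.example/remote.php?page=news"></script>.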
While this solution works, it seems like there may be a better way to accomplish the same thing. What is the best way to make cross-site scripting work specifically for the purpose of including content from a completely different domain?

You've got three choices:
1. Create a server-side proxy script.
2. Create a remote script to read in remote dynamic HTML. A library like jQuery makes this easier; you can use its load() function to inject HTML where needed. EDIT: What I originally meant for choice #2 was using JSONP, which requires the server-side script to recognize a "callback=?" param (see the sketch after this list).
3. Use a client-side Flash proxy and set up a crossdomain.xml file in your server's web root.
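A minimal JSONP sketch for choice #2 (the endpoint name and payload here are assumptions):

<?php
// data.php - wraps a JSON payload in whatever callback the client names.

header('Content-Type: text/javascript');

// Sanitize the callback name so the response can't be abused for injection.
$callback = isset($_GET['callback'])
    ? preg_replace('/[^A-Za-z0-9_.]/', '', $_GET['callback'])
    : 'callback';

$payload = json_encode(array('html' => '<p>Cross-domain content</p>'));

echo $callback . '(' . $payload . ');';

On the client, something like jQuery's $.getJSON('http://other-server.example/data.php?callback=?', function (data) { ... }) fills in the callback name automatically and receives the payload, which you can then insert into the page.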

Personally, I would call the other domain from the server and fetch and parse the data there for use in your page. That way you avoid any cross-domain problems, and you get the power of a server-side language/platform for getting and parsing the data.
Not sure if that would work for your specific scenario... hard to know even with your verbose description...
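A rough sketch of that idea, assuming the page's server can run cURL and that content.php on the other domain is the script to fetch (both assumptions):

<?php
$ch = curl_init('http://other-domain.example/content.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // don't let a slow remote hang the page
$content = curl_exec($ch);
curl_close($ch);

echo ($content !== false) ? $content : '<!-- remote content unavailable -->';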

You could try easyXDM; by including very little code, you can pass data or method calls between documents on different domains.

I've come across that YDN server-side proxy script before. It says it's built to work with Yahoo's Search APIs.
Will it work with any domain, if you simply trim the Yahoo API code out? Or do you need to replace it with the domain you want it to work with?

Remote iframe content can be accessed by local JavaScript if both pages set document.domain to a shared parent domain. Note that this only works between subdomains of the same base domain, not between completely unrelated domains.
Eg:
Site A (http://www.example.com/) contains an iframe with src='http://sub.example.com/home.php', plus its own document.domain line, and home.php looks like this:

<?php
// ... PHP stuff ...
?>
<script type="text/javascript">document.domain = 'example.com';</script>

Related

PHP include file based on screen size

<?php
include 'components/server.php';
Is it possible to make it include server.php for desktops and server-mobile.php for mobile devices?
While technically possible, it's absolutely not the best way of doing things.
Why?
Because PHP runs on the server, and only the output of that PHP execution is given to the browser. What you probably want is something in JavaScript, which runs in the browser and can then react seamlessly to browser conditions such as screen size and/or dimensions.
If you're trying to change which PHP script runs based on browser criteria (as mentioned above), that sounds very much like your program logic is simply wrong.
If you somehow really do need to change PHP script execution based on end-client (browser) characteristics, you could do it by requesting a different script via JavaScript (AJAX), or by using the mechanisms mentioned in the comments above, but as said, you're almost certainly "doing it wrong".
Alternative
It would be far better to load everything you need in PHP and then pass all of that content to the browser (as output: HTML, CSS, JavaScript, etc.), letting the code in the browser decide which parts of the data to use and which to ignore.
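A sketch of that alternative: send both variants and let CSS pick one. The 768px breakpoint is an assumption, and server-mobile.php is the file named in the question:

<?php
// page.php - both components are included server-side; CSS hides one client-side.
?>
<style>
  @media (max-width: 767px) { #desktop-content { display: none; } }
  @media (min-width: 768px) { #mobile-content  { display: none; } }
</style>
<div id="desktop-content"><?php include 'components/server.php'; ?></div>
<div id="mobile-content"><?php include 'components/server-mobile.php'; ?></div>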

cURL PHP - load a full page

I am currently trying to load an HTML page via cURL. I can retrieve the HTML content, but part of it is loaded later via scripting (an AJAX POST), and I cannot retrieve that part of the HTML (it's a table).
Is it possible to load a page entirely?
Thank you for your answers
No, you cannot do this.
cURL does nothing more than download a file from a URL -- it doesn't care whether it's HTML, JavaScript, an image, a spreadsheet, or any other arbitrary data; it just downloads. It doesn't run anything or parse anything or display anything, it just downloads.
You are asking for something more than that. You need to download, parse the result as HTML, then run some JavaScript that downloads something else, then run more JavaScript that parses that result into more HTML and inserts it into the original HTML.
What you're basically looking for is a full-blown web browser, not cURL.
Since your goal involves "running some JavaScript code", it should be fairly clear that this is not achievable without a JavaScript interpreter available. That means it is obviously not going to work inside a PHP program (*). You're going to need to move beyond PHP. You're going to need a browser.
The solution I'd suggest is to use a very specialised browser called PhantomJS. This is actually a full Webkit browser, but without a user interface. It's specifically designed for automated testing of websites and other similar tasks. Your requirement fits it pretty well: write a script to get PhantomJS to open your URL, wait for the table to finish rendering, and grab the finished HTML code.
You'll need to install PhantomJS on your server, and then use a library like this one to control it from your PHP code.
I hope that helps.
(*) yes, I'm aware of the PHP extension that provides a JS interpreter inside of PHP, and it would provide a way to solve the problem, but it's experimental, unfinished, would be still difficult to implement as a solution, and I don't think it's a particularly good idea anyway, so let's not consider it for the purposes of this answer.
No, the only way you can do that is to make a separate cURL request to the AJAX endpoint and put the two results together afterwards.
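A sketch of that two-request approach. The endpoint URL, POST fields, and placeholder markup below are assumptions - use the browser's network inspector to find the real AJAX request the page makes:

<?php
function fetchUrl($url, $postFields = null) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if ($postFields !== null) {
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $postFields);
        // Many endpoints use this header to distinguish AJAX calls.
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-Requested-With: XMLHttpRequest'));
    }
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

$page  = fetchUrl('http://example.com/page.html');
$table = fetchUrl('http://example.com/ajax/table.php', 'id=42');

// Splice the table in where the site's own JavaScript would have inserted it.
$html = str_replace('<div id="table"></div>', '<div id="table">' . $table . '</div>', $page);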

How to check for a mobile version of a website?

I would like to check whether a mobile version exists for a specific website. To my understanding, we cannot be sure that every website has a mobile version located at http://m.example.com/, so I am testing with a cURL request. Here is how I am doing it:
* I send mobile browser headers in the cURL request, which returns the contents of the resulting URL.
* If the site has a mobile version, the response contains the mobile version of the site.
* I then check whether the content includes the @media keyword; if it does, I assume the site has a mobile version.
The problem is that if the CSS is loaded externally, I would have to send further cURL requests for the CSS files as well, which makes it even slower. Is there any specific solution to my problem, or can I speed this process up a bit?
Any help would be appreciated. Thanks.
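A sketch of the check described above (the URL and user-agent string are placeholders, and both signals are heuristics, not proof of a mobile version):

<?php
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // catch redirects to m.example.com
curl_setopt($ch, CURLOPT_USERAGENT,
    'Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 '
    . '(KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25');
$html     = curl_exec($ch);
$finalUrl = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
curl_close($ch);

// Two weak signals: a redirect to a mobile host, or an inline @media rule.
$hasMobile = (strpos($finalUrl, '//m.') !== false) || (strpos($html, '@media') !== false);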
The problem with your approach, which smells a bit like an XY Problem, is that it is simply unreliable.
A website has many ways to implement a mobile version, including:
1. Using CSS media queries
The problem with this method is twofold. For starters, you would have to scan every single CSS file and <link> declaration. Secondly, the site can dynamically introduce stylesheets to the page using JavaScript, which you will never see using cURL because it lacks a JavaScript parser.
2. Browser sniffing using (client side) JavaScript, or screen width sniffing using JavaScript
Again, this JavaScript will never get executed, so you will never see that result.
3. Browser sniffing using server side code
Well, I guess you could try to use a mobile user-agent string with your cURL request, and see where that takes you, but all of these methods are hackish and unreliable.
4. The page could be mobile friendly from the get-go (credit to @Quentin)
As @Quentin mentioned in the comments, the page could be mobile friendly without any additional checks on the client/server side (responsive design without media queries, by simply using percentage-based values, for example).

Get contents of DOM via PHP

I need to get the contents of a website through PHP; however, the content is only available when JavaScript is enabled. The workaround I am using now is an AppleScript that opens the website in Safari, selects all of the page content, copies it to the clipboard, and pastes it.
That will be really hard to achieve, I guess. If you observe the JS on that page that is responsible for getting the content ready, you may discover it's just another AJAX call that you may be able to make directly from your PHP script.
Best possible solution: ask the website owner for API/export access ;)
If that is not possible, you can only pray that you can analyze the requests that are initiated via JavaScript and imitate them
(possible tools: Firefox with the Firebug or Tamper Data plugins).
Warning: the owner of the website might not like this approach; in fact, scraping the data automatically may be disallowed.
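A sketch of imitating such a request once you've identified it with those tools. The endpoint URL and parameters here are assumptions:

<?php
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n"
                   . "X-Requested-With: XMLHttpRequest\r\n",
        'content' => http_build_query(array('section' => 'articles', 'page' => 1)),
    ),
));
$response = file_get_contents('http://example.com/ajax/content.php', false, $context);

// If the endpoint returns JSON, decode it; otherwise it's HTML ready to use.
$data = json_decode($response, true);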
What do you mean by:
the content is only available when JavaScript is enabled
Does the page pull data from somewhere via JS? Would it be easier to analyse where the data is coming from and access that place directly from PHP?

Parse PHP, then parse ASP.NET

Is it possible with IIS to serve a .php page that can also host an ASP.NET control?
Something like two-pass parsing: the first pass parses the PHP, then the ASP.NET parser transforms the controls.
The variables declared in each layer do not need to be shared.
Is it possible, or is it a sin?
Ultra hack and untested... but maybe....
Use output buffering and have a PHP script that generates your control. Then save the output to an .aspx file (with a unique name) and fetch it with file_get_contents() over HTTP so that IIS evaluates the ASPX file.
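An untested sketch of that hack, reading it as: buffer the PHP that emits the control markup, write it to a uniquely named .aspx file in the IIS webroot, then request that file by URL so the ASP.NET engine evaluates it. All paths, URLs, and file names here are assumptions:

<?php
ob_start();
include 'generate_control.php';   // hypothetical PHP that emits ASPX markup
$aspx = ob_get_clean();

$name = uniqid('ctl_') . '.aspx';
file_put_contents('C:/inetpub/wwwroot/tmp/' . $name, $aspx);

// Requesting over HTTP (not the filesystem) is what triggers the ASP.NET parser.
echo file_get_contents('http://localhost/tmp/' . $name);

unlink('C:/inetpub/wwwroot/tmp/' . $name); // clean up the temporary file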
Regardless of whether or not ZenGeneral's creative hack works, it's a sin and really, just don't go there!
Possible, but not recommended.
Pick one or the other.
In the worst-case scenario, you could use PHP's cURL to grab the ASPX output, or even (GASP!) an iframe.
This puts a lot of unnecessary strain on the server, because it has to get the PHP file, parse it, then get the ASPX file, parse that, then hand the result back to PHP (or PHP grabs it), then render the rest of the PHP file once the request goes through. It would also make load times slower.
