How can I load an external page (ex: http://www.google.pt) to a div in my page?
I've tried it like this:
document.getElementById(id).innerHTML = "<iframe src='http://www.google.com' height='100%' width='100%'></iframe>";
but it doesn't load.
When I put a page from my domain, it properly loads.
How can I load an external page (from another domain) ?
There are some cross-domain restrictions because of which you cannot load any external site directly into an iframe on your page.
However, you can try the following jQuery plugin for making cross-domain AJAX requests:
https://github.com/padolsey/jQuery-Plugins/blob/master/cross-domain-ajax/jquery.xdomainajax.js
Google does not let itself be loaded from within an iframe.
Loading this iframe gives this error: Refused to display document because display forbidden by X-Frame-Options.
Which means that Google doesn't allow you to do this. If you want to use Google Search on your site, you can use Google Custom Search.
You should really do this server-side, because including a page on the client-side can have bad effects. For a start, Google won't read a page loaded with an iframe.
It's a pretty cheap way to do things. Does your server not support PHP?
I have a PHP page where I'm trying to load and then echo an external page (which sits on the same server but under a completely different path/domain, if that matters).
I've tried using both file_get_contents() and curl. They both correctly load the HTML of the target page; the problem is that it doesn't display correctly, because the target page uses relative links to several files (images, CSS, JavaScript).
Is there any way I can accomplish this with PHP? If not, what would be the next best way? The target site must look like it's being loaded from the initial page (URL-wise), I don't want to do a redirect.
So, the browser would show http://example.com/initial-page.php even though its contents come from http://example2.com/target-page.php
EDIT:
This is something that could easily be done with an iframe, but I want to avoid that too for several reasons, one of them being that an iframe breaks the responsiveness of the target site. I can't change the code of the target site to fix that either.
In the end, the solution was a combination of what I was trying to do (using curl) and what WebRookie suggested: using the HTML base tag in the page being loaded via curl.
In my particular case, I pass the base URL as a parameter in curl and I echo it in the loaded page, allowing me to load that same page from different websites (which was another reason why I wanted to do this).
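As a rough sketch of that base-tag trick (the injectBaseTag helper and the sample HTML below are hypothetical illustrations, not the poster's actual code): after fetching the target page server-side, you inject a base href right after the opening head tag so the page's relative links keep resolving against the original site.

```javascript
// Hypothetical sketch: inject a <base> tag into HTML fetched from
// another site, so relative links (images, CSS, JS) keep resolving
// against the original domain instead of the proxying page's domain.
function injectBaseTag(html, baseUrl) {
  // Place <base href="..."> immediately after the opening <head> tag.
  return html.replace(/<head([^>]*)>/i, '<head$1><base href="' + baseUrl + '">');
}

const fetched = '<html><head><title>Target</title></head>' +
                '<body><img src="img/logo.png"></body></html>';
const rewritten = injectBaseTag(fetched, 'http://example2.com/');
// rewritten now contains <head><base href="http://example2.com/"><title>...
```

Passing the base URL as a curl parameter, as described above, just means the server generating the page echoes it into that base tag instead of hard-coding it.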
I have created an ajax driven website which can load any page when given the correct parameters. For instance: www.mysite.com/?page=blog&id=7 opens a blog post.
If I create a sitemap with links to all pages within the website will this be indexed?
Many thanks.
If you provide a URL for each page that actually displays the full page, then yes. If those requests just respond with JSON, or with only part of a page, then no. In reality this is probably poor design SEO-wise. Each page should have its own URL, e.g. www.mysite.com/unicorns instead of www.mysite.com/?page=blog&id=1, and the links on the page should point to those URLs. Then use JavaScript to capture the click events for the AJAX links and update the page however you like. Better yet, try PJAX, which loads just the content of a page instead of doing a full page refresh, speeding things up a little without really any changes to your normal site setup.
Be aware that if you build the sitemap that way, all your links in search-engine results will be ugly.
As Google has said, a page can still be crawled with a nice URL if you use a fragment identifier:
<meta name="fragment" content="!"> // for meta fragment
and when you generate your page via AJAX, append the fragment to the URL:
www.mysite.com/#!page=blog-7 //(and split them)
The page should then load the content directly in PHP using $_GET['_escaped_fragment_'].
From what I've read, Bing and Yahoo have started crawling with the same process.
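For illustration, this is roughly the URL rewriting a crawler performs under that (now-deprecated) AJAX-crawling scheme; toCrawlerUrl is a hypothetical name for the mapping, which turns the #! fragment into the _escaped_fragment_ query parameter your PHP reads:

```javascript
// Sketch of the AJAX-crawling scheme's URL mapping: the crawler
// rewrites the #! fragment into an _escaped_fragment_ query
// parameter, and the server answers that URL with full HTML.
function toCrawlerUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;            // no hash-bang, nothing to rewrite
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

toCrawlerUrl('http://www.mysite.com/#!page=blog-7');
// → 'http://www.mysite.com/?_escaped_fragment_=page%3Dblog-7'
```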
So the latest version of Chrome, and maybe Safari too, won't let you load a page in an iframe unless you're on the same domain, at least for sites like google.com that don't want to be framed.
<iframe src="http://www.google.com" style="width:600px;height:500px;" frameborder="0"></iframe>
The error I get in Chrome is:
Refused to display document because display forbidden by X-Frame-Options.
Is there a workaround to avoid this error and display the iframe?
From what I've read, the X-Frame-Options header is meant to prevent clickjacking via iframes, but I'm not trying to do that; I just want the site to load in the iframe the way I thought an iframe was supposed to work.
Is there a way around this using PHP?
No, there is no way to bypass this restriction.
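The restriction is enforced by the browser, not by anything you can change in PHP. As a sketch of why, this hypothetical helper mimics, in simplified form, how a browser evaluates the X-Frame-Options response header before rendering a framed page (real browsers differ on edge cases such as invalid values and the obsolete ALLOW-FROM directive):

```javascript
// Simplified, hypothetical model of how a browser applies the
// X-Frame-Options header when deciding whether to render a page
// inside a frame hosted on frameOrigin.
function framingAllowed(headerValue, pageOrigin, frameOrigin) {
  if (!headerValue) return true;                 // no header: framing permitted
  const value = headerValue.trim().toUpperCase();
  if (value === 'DENY') return false;            // never frameable
  if (value === 'SAMEORIGIN') {                  // only same-origin framing
    return pageOrigin === frameOrigin;
  }
  return true;                                   // simplification: other values ignored here
}

// Google sends SAMEORIGIN, so a third-party page cannot frame it:
framingAllowed('SAMEORIGIN', 'http://www.google.com', 'http://mysite.com');
// → false
```

Since the decision happens inside the visitor's browser after the response headers arrive, server-side PHP never gets a say in it.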
So I understand this may come across as an open question, but I need a way to solve this issue.
I can either make the site use AJAX requests that load the body content, or have links that reload the whole page.
I need the site to be SEO-compliant, and I would really like the header not to reload when the content changes; the reason is that we have a media player that plays live audio.
Is there a way to serve normal href links to Googlebot, or to anyone without JavaScript, but use AJAX when JavaScript is available?
Build the website without JS first and ensure it works as intended, with each link pointing to a unique page. Google parses your site without JS, so what you see with JS off is what it sees.
Then add the JS, with click handlers that prevent the default page reload and run your AJAX logic instead. You could use jQuery and .load() to do this quite easily.
Another solution: you could use the recommended Google method ( https://developers.google.com/webmasters/ajax-crawling/ ), but it's more work and less effective SEO-wise.
Or you can put your audio player in an iframe...
I want to load external websites inside a div and scale them down a bit so they fit inside the div more neatly,
just like Google search does.
I tried this:
$("#targetDiv").load("www.google.com");
but it is not working.
I tried an iframe, but it still has 2 problems:
scrolling is still enabled by pressing the arrow keys and PgUp/PgDn
I don't know how to make the contents inside the iframe smaller
I don't know which method I should use, which is more optimized, or whether there's a better alternative.
What you're trying to do is not going to work. Unfortunately, JavaScript isn't allowed to make cross-domain requests for security reasons (reference: http://en.wikipedia.org/wiki/Same%5Forigin%5Fpolicy).
If you create a PHP script on your own server that makes the request, that could work, but the user wouldn't have a valid session, and there's a risk that links from the other site won't work if they're relative.
Example:
$('#targetDiv').load('load.php?url=www.google.com')
You could also have a look at jquery-crossframe. I've never used it but it claims to do what you're looking for.
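To illustrate the relative-link risk mentioned above (the absolutize helper below is a hypothetical sketch, not a drop-in fix): a proxy like the load.php idea can compensate by resolving each src/href against the original page's URL before returning the HTML.

```javascript
// Hypothetical sketch: relative links in proxied HTML resolve against
// YOUR domain, not the source's. Rewriting src/href attributes to
// absolute URLs against the source page URL avoids broken assets.
function absolutize(html, sourceUrl) {
  return html.replace(/(src|href)="([^"]+)"/g, (match, attr, link) => {
    // new URL(relative, base) resolves the link against the source page.
    return `${attr}="${new URL(link, sourceUrl).href}"`;
  });
}

const sample = '<img src="img/logo.png"><a href="/about">About</a>';
absolutize(sample, 'http://www.google.com/search/');
// → '<img src="http://www.google.com/search/img/logo.png"><a href="http://www.google.com/about">About</a>'
```

Already-absolute links pass through unchanged, since resolving an absolute URL against any base returns the URL itself.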
The best option is to use an iframe element.
You are not going to be able to make a cross-domain AJAX call like that with jQuery. From http://api.jquery.com/load/:
Additional Notes:
Due to browser security restrictions, most "Ajax" requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, or protocol.
If an iframe is not an option, you can retrieve the data via an AJAX call to a PHP page that fetches it with curl.
Francois is right in that your AJAX requests are restricted by the same-origin policy. That means you cannot load content from other websites directly. What you are trying to achieve is possible, however, if your source supports JSONP. If you specifically want to load Google search results, check out the Google Custom Search API.