I'm aware that this question has been asked a number of times on SO, and I have looked over many of the posts, but none seems to involve constraints as restrictive as mine. I may not be able to do anything about this, but I want to ask.
I need the content within an iframe to grow and shrink with the content. The main problem is that JavaScript cannot be used, because the site providing the framed content will only supply an iframe tag with a link to its content. There is, of course, no guarantee that the site where the iframe is placed will allow JavaScript, so I don't see that as an option.
Is there another alternative to an iframe that will allow me to import dynamically growing and shrinking content that doesn't rely on javascript?
You can use PHP on the server side to retrieve the content and then include it within the page. You can strip out the metadata, and adjust the CSS and HTML with XPath if you need to.
If it's a completely different piece of content for which adjusting the CSS and HTML won't work, then an iframe remains just about the only solution. If you have access to a server with a GUI, you could load the page, store its height in a database, and then set the height of the iframe on load.
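A minimal sketch of that retrieve-and-clean step, shown in JavaScript for brevity (the same idea maps directly onto PHP's cURL plus DOMDocument/XPath); the regexes are illustrative, not production-grade:

```javascript
// Keep only the <body> contents of a fetched page and strip <script>
// tags so the embedded content stays inert. Naive regex approach;
// a real DOM/XPath pass is more robust on messy markup.
function extractBodyContent(html) {
  const m = html.match(/<body[^>]*>([\s\S]*)<\/body>/i);
  const body = m ? m[1] : html;
  return body.replace(/<script[\s\S]*?<\/script>/gi, '').trim();
}
```

The cleaned fragment can then be echoed straight into your own page server-side, so nothing depends on the visitor having JavaScript enabled.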
Try this:
http://kinsey.no/blog/index.php/2010/02/19/resizing-iframes-using-easyxdm/
I believe it still works.
I'd say that with current technology and browsers, this just cannot be done. Not without JavaScript.
And it's not pretty even with JavaScript.
Related
I'm currently building a practice responsive website: I'm taking an existing website and rebuilding it with Twitter Bootstrap's JS and CSS, so that it will be fully responsive for mobile.
The issue is that there are some large carousels and images on the site. Ideally I would like to just completely remove certain elements, like a carousel for instance, and instead have the options within the carousel as a standard list menu.
It seems the main option is display:none based on media queries, but I am starting to foresee big problems with loading time if the entire desktop site is still loaded on mobile, with some elements merely hidden.
Are there ways to completely exclude HTML based on browser size? If anyone has any good links or articles, that would be great. Or even just opinions on whether there is actually a need to exclude HTML or not.
Thank you
First off, it is really good to see that although you're talking about display:none;, you actually still want to display the content without the bells and whistles of the image. Well done you.
The next thing I would look at: if you don't want to load images for mobile, then why are you adding them for larger screens? If an image isn't providing a function, or assisting in explaining the content better, then why not just drop it for the desktop size as well?
If it does in fact help tell a story, then you can include the images via one of the popular responsive-image solutions like Adaptive Images, HiSRC, or Picturefill, which serve the mobile version of the image first and swap in a larger image at higher viewports (but remember, there's no bandwidth test).
Finally, if you do want to serve some different content, then take the advice of fire about including more content with AJAX. The South Street toolbox from Filament Group can help you out; pay particular attention to the AjaxInclude pattern (it also links to Picturefill).
You could consider storing the heavy data JSON-encoded, then creating elements and loading them on demand, like so:
var heavyImage = new Image();
heavyImage.src = imageList[id];
Then you can append the image element to the desired block. In my experience with mobile devices, this is more robust than requesting <img> markup via AJAX, since AJAX can be pretty slow at times.
You may also 'prefetch' images with this method (say, the 2-3 adjacent to the currently visible one), thus improving the UX.
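That prefetch idea might look like the sketch below; imageList, the array of ids, and the radius are all assumptions for illustration:

```javascript
// Pure helper: ids within `radius` positions of currentIndex,
// excluding the current one and anything out of bounds.
function adjacentIds(ids, currentIndex, radius) {
  const out = [];
  for (let i = currentIndex - radius; i <= currentIndex + radius; i++) {
    if (i !== currentIndex && i >= 0 && i < ids.length) out.push(ids[i]);
  }
  return out;
}

// Browser-only: creating an Image() and setting src triggers the
// download, warming the cache for the neighbouring pictures.
function prefetchAround(imageList, ids, currentIndex, radius) {
  adjacentIds(ids, currentIndex, radius).forEach(function (id) {
    const img = new Image();
    img.src = imageList[id];
  });
}
```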
You could pull in the heavy elements via AJAX so they don't sit on the page initially, making it load faster. You could choose to make the AJAX call only if the screen size is larger than X.
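A sketch of that screen-size gate; the 768px breakpoint, the /fragments/carousel.html endpoint, and the carousel-slot id are all made up for the example:

```javascript
// True when the viewport is at least minWidth pixels wide.
// Falls back to innerWidth where matchMedia is unavailable,
// and to false outside a browser.
function isWideScreen(minWidth) {
  if (typeof window === 'undefined') return false;
  if (window.matchMedia) {
    return window.matchMedia('(min-width: ' + minWidth + 'px)').matches;
  }
  return window.innerWidth >= minWidth;
}

// Browser-only: fetch the heavy markup only on large screens.
function loadHeavyContent() {
  if (!isWideScreen(768)) return;
  fetch('/fragments/carousel.html')
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('carousel-slot').innerHTML = html;
    });
}
```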
If you want you can use visibility:hidden, or if you use jQuery you can use
$(element).remove() //to remove completely
$(element).hide() //to hide
$(element).fadeOut(1) //to fadeout
I am looking for a way to create functionality similar to what happens when you post a link to an existing website on Facebook. If this sounds ambiguous, I will try to elaborate.
When you paste your link and submit your post, Facebook shows, together with your link, a small preview of the page you are posting (text and maybe a small image).
What are the ways to achieve this?
I read a similar post, but the thing is that I do not need the image so much; the text will be sufficient.
I am working in PHP, but the language is not important, because I am looking for a high-level idea.
Previously I was thinking about parsing the content of the link with cURL, but the thing is that in a lot of situations the text returned by Facebook is not available on the page itself.
Are there other ways?
From what I can tell, Facebook pulls from the meta name="description" tag's content attribute on the linked page.
If no meta description tag is available, it seems to pull from the beginning of the first paragraph <p> tag it can find on the page.
Images are pulled from available <img> tags on the page, with a carousel selection available to pick from when posting.
Finally, the link subtext is also user-editable (start a status update, include a link, and then click in the link subtext area that appears).
Personally I would go this route: cURL the page; parse it for a meta description tag, and failing that grab some likely text with a basic algorithm or just take the first paragraph tag; then allow the user to edit whatever was presented (it's friendlier to the user and also sidesteps pages returning different content per user-agent). Do the user-facing control as AJAX so that you don't have issues with however long it takes your site to access the link you want to preview.
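The meta-description-with-paragraph-fallback step could be sketched like this (the question says language doesn't matter, so JavaScript here; note the regex assumes the name attribute precedes content, and a DOM parser is the sturdier choice):

```javascript
// Return preview text for an HTML page: prefer the meta description,
// fall back to the first paragraph's text, else an empty string.
function extractPreviewText(html) {
  const meta = html.match(
    /<meta\s[^>]*name=["']description["'][^>]*content=["']([^"']*)["']/i
  );
  if (meta) return meta[1];
  const p = html.match(/<p[^>]*>([\s\S]*?)<\/p>/i);
  if (p) return p[1].replace(/<[^>]+>/g, '').trim();
  return '';
}
```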
I'd recommend using a DOM library (you could even use DOMDocument, if you're comfortable with it and know how to handle possibly malformed HTML pages) instead of regex to parse the page for the <meta>, <p>, and potentially also <img> tags. Building a regex that properly handles all of the myriad cases you will encounter "in the wild", versus from a known set of sites, can get very rough. QueryPath usually comes recommended, and there are Stack Overflow threads covering many of the available options.
Most modern sites, especially larger ones, are good about populating the meta description tag, especially for dynamically generated pages.
You can scrape the page for <img> tags as well, but you'll want to host the images locally: either host all of the images and then delete all except the one chosen, or host thumbnails (assuming you have an image processing library installed and enabled). Which you choose depends on whether bandwidth and storage matter more than the one-time processing cost of running imagecopyresampled, imagecopyresized, Gmagick::thumbnailimage, etc. (pick whatever you have at hand / your favorite). You don't want to hotlink to the images on the page, both because of the bandwidth ethics and because of the likelihood of ending up with broken images when linking to any site with hotlink prevention (referrer checks, etc.) or when the images expire. Personally, I would probably go for storing thumbnails.
You can wrap the entire link entity up as an object for handling expiration/etc if you want to eventually delete the image/thumbnail files on your own server. I'll leave particular implementation up to you since you asked for a high level idea.
but the thing is that in a lot of situations the text returned by facebook is not available on the page.
Have you looked at the page's meta tags? I've tested with a few pages so far and this is generally where content not otherwise visible on the rendered linked pages is coming from, and seems to be the first choice for Facebook's algorithm.
Full disclosure upfront, I'm a developer at ThumbnailApp.com.
It's a JSON API service with an optional JavaScript SDK which I think does exactly what you're after: it will parse a string to detect any URLs and return the title, description, and thumbnail of the asset. If the page has OpenGraph tags, it will use those for the image thumbnail. It's currently in private beta, but we're adding more accounts each week.
If you feel that you really need a do-it-yourself solution:
Check out the Python-based webkit2png and the headless browser PhantomJS. They can render webpages to an image (the default size is 800x600); then you'll have to write some code to resize and crop the image, as taswyn mentioned. Ideally you would then upload the resized image to Amazon S3 and have it served from a CDN such as CloudFront.
To get the title and description, first fetch the URL content (cURL or whatever); you will need to check the Content-Type header to make sure it's a webpage. If it is, you can then use an HTML parser such as the SimpleHTMLDOM PHP library to grab the title and description metadata. If you want it exactly like Facebook, you will also need to check for any OpenGraph tags, specifically the og:image tag.
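Those two checks, sketched with the response body and headers already in hand (again a rough regex sketch; SimpleHTMLDOM or another parser is preferable on real pages):

```javascript
// True when the Content-Type header says the response is HTML.
// `headers` is assumed to be a plain object with lowercased keys.
function isHtmlResponse(headers) {
  return (headers['content-type'] || '').indexOf('text/html') !== -1;
}

// Collect OpenGraph properties (og:title, og:image, ...) into an object.
function extractOpenGraph(html) {
  const og = {};
  const re = /<meta\s[^>]*property=["'](og:[^"']+)["'][^>]*content=["']([^"']*)["']/gi;
  let m;
  while ((m = re.exec(html)) !== null) og[m[1]] = m[2];
  return og;
}
```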
Also don't forget about caching. The first render and description parse can take a long time. Even if your site is fast, the webpage you're rendering could be slow, and the best approach is to render/parse it once, then save and return the resized image and metadata for subsequent requests. Depending on your requirements, you may need to refresh the cached data every hour, or you could get away with refreshing it once a day.
To do it yourself takes quite a bit of work and lots of server configuration. I feel using a 3rd party service is a better way to go, but obviously I have a biased opinion :)
I apologize for my English. Correct the title if it's necessary, please.
I want different pages of my web site to have different images in some background divs.
I'm using one CSS file for all the site.
For example, the supportingText div on the "about me" page has a different image than on the "my project" page.
Right now, inside each page (aboutMe.html, myProject.html), I use a style attribute on every supportingText div for this task.
For example: when I want to let surfers change the entire site's style, I can of course switch to a different CSS file.
But since my pages have some different images, as I explained before, do I have to change the other HTML files to do it?
I know there is probably a solution for this in something like PHP.
Consider that I don't know anything about PHP, but if you explain it in a very easy way it won't be a problem to understand!! ;)
Is there a solution in JavaScript?
I'm using only XHTML and CSS.
EDIT
Thank you!
You have been very kind.
I understood everything except one concept that I hope you can clarify.
I have many divs with background images, such as:
title - myPicture - navigation - supportingText
All of those have greyscale images.
I would like users to be able to change the look of the web site.
Something like three options:
black and white pages
green pages
orange pages
So in this case I need to set the divs' images in a JavaScript script inside the HTML file, as in your example!
So the CSS file is then only helpful for dimensions and all the other stuff, but no longer for the background images. Is that right?
Thank you a lot!
I agree with John that you really shouldn't hardcode style attributes into the HTML. But if for whatever reason you have to, you can easily change them using JavaScript.
document.getElementById('supportingText').style.backgroundImage = 'url("image.jpg")';
or if you're using jQuery
$('#supportingText').css('background-image', 'url("image.jpg")');
Update:
Assuming you're using a button to trigger the change, the code (which you would put in your HTML document) would look something like this:
<script type="text/javascript">
  function changeBackground(divId, newImage) {
    document.getElementById(divId).style.backgroundImage = 'url("' + newImage + '")';
  }
</script>
<button onclick="changeBackground('supportingText', 'image.jpg');">Change Background</button>
Essentially what you're doing here is creating a button which calls a JavaScript function (changeBackground()), passing in the ID of the div that you want to change and the name of the image file that you want to change it to. You could have multiple buttons with different values for the 'supportingText' and 'image.jpg' parameters.
Answer to Part 2:
You can apply styles via CSS or JS (as above). However, anything you set from JS becomes an inline style, which overrides your CSS. It's really up to you how you divide it up.
If you have hardcoded your styles, even just some of them, into your HTML, and you want those values to change when someone chooses a new stylesheet, then you're in a tough position. Hardcoding styles into your HTML makes your code inflexible and makes things like switching stylesheets dynamically, whether with PHP or JavaScript, much harder.
Your best bet is to remove the hardcoded style rules from your HTML and place them in your stylesheet. Then switching the stylesheet can be done easily with PHP or JavaScript.
If you can't remove them, then you'll need to write your own JavaScript to do this, which will be tedious, as it will need to dynamically alter each page when it loads. Not only will this probably take a long time and be error-prone, but it can hurt your website's performance if it isn't done well.
Basically, by hardcoding style rules into your HTML you've tied your own hands and have no good options available to you. I recommend removing the hardcoded styles and making sure you avoid doing it again in the future. Not only to avoid issues like this but to make your site faster by allowing your CSS to be cached.
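Once the styles live in stylesheets, the JavaScript side of a theme switch can be as small as swapping one <link>'s href. The theme-css id and the css/<name>.css layout are assumptions for this sketch:

```javascript
// Pure helper so the theme-to-file mapping is easy to test.
function themeHref(theme) {
  return 'css/' + theme + '.css';
}

// Browser-only: point the single theme <link> at the chosen file,
// e.g. <link id="theme-css" rel="stylesheet" href="css/default.css">.
function setTheme(theme) {
  document.getElementById('theme-css').href = themeHref(theme);
}
```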
I am scraping a website with PHP, retrieving a page and removing certain elements so that only a photo gallery is shown. It works flawlessly in every browser BUT any version of IE (typical ;)). We can fix the problem by rewriting the .css file, but we cannot implement the fix in the head of the PHP output, as it is overridden by the .css file from the website's server. How would we go about hosting our own version of the .css file so that our website is displayed using OUR version? Would we swap something out with a filter?
Cheers!
You do realize that it may not really be a scraping problem? It sounds like a straightforward page-display problem.
Worrying about scraping might be a red herring. After you have scraped, you have some HTML (and possibly some CSS)... does that validate at W3C? I realize that is no guarantee, but it is an indicator (I know that IE doesn't always display valid pages properly, but sometimes it's a "gotcha" when other browsers seem to display invalid HTML/CSS properly).
If it's valid, then maybe you should look back at your scraping. If you already remove certain elements to show only a photo gallery, then maybe you can also remove the CSS from the HTML header (or wherever) and replace it with your own?
If you're already scraping the website, why not just use PHP to omit their CSS file and write your own in its place? Alternatively, you could link your own CSS file just below theirs in the <head> so it overrides their styles.
This is just another thing to check, but if one of the elements you're removing is comments, you could unwittingly be pulling out IE-only stylesheets that sit between conditional comments. Another thing to look at is paths: maybe one of their stylesheets has a relative path that you can't reach from your server. You would need to make that an absolute path for it to work.
Really, you should probably take a close look at the source of the original page and your formatted source side by side. You could be pulling out something that should be left in.
You ask how you could remove their CSS... you do it the same way you remove the other elements you're pulling out: just pull out the style tags and the tags that link to stylesheets.
Aside from that, I would just write some styles to fix it and stick them anywhere after the existing CSS is loaded (like everyone else here mentioned).
Just add another CSS header and mark your styles as !important to override the original ones?
I'm looking for a way to display a web page inside a div of another web page.
I can fetch the webpage with cURL, but since it has an external stylesheet, when I try to display it, it appears without all of its style properties.
I remember Facebook used this technique with shared links (you used to see the linked page with a Facebook header).
I did some unsuccessful jQuery tests, but I'm pretty much clueless about how to continue.
I know this can be done with frames, but I always hear that it's good practice to avoid frames, so I'm a bit confused.
Any ideas how to work this out?
If you want to display the other website's contents exactly as they are rendered in that site then frames are, in this case, the best (easiest) way to go.
Facebook and Google both use this technique to display pages while maintaining their branding / navigation bar above the other site.
I am going to guess that Facebook still used an iframe, just with no borders and a well-placed header outside of it. The reason I guess that is that if the outside page has its own stylesheet, there is a high probability that your styles and their styles will clash and things won't show properly.
In order for the styles not to clash, everything on both ends would have to be extremely specific, not just generic styles applied to all paragraphs etc...
I agree that using frames would probably be the best solution for your problem.
But if you still want to avoid frames and put the contents into a div with the id externalContent, you could request the stylesheets the same way you get the other contents and prefix every rule in them with "#externalContent ". Save these stylesheets to your server and include them in your page. With a few more customizations, that should work.
I have to admit this solution does sound quite strange... well, it is.
But it's the only way I see to do what you're asking for.
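The rule-prefixing step could be sketched as below. This is a deliberately naive string rewrite (it ignores @media blocks and comments), so treat it as a starting point rather than a parser:

```javascript
// Prefix every selector list in a stylesheet string with `prefix`,
// scoping the rules to one container (e.g. '#externalContent').
function prefixCss(css, prefix) {
  return css.replace(/(^|\})\s*([^{}@]+)\{/g, function (m, brace, selectors) {
    const prefixed = selectors
      .split(',')
      .map(function (s) { return prefix + ' ' + s.trim(); })
      .join(', ');
    return brace + ' ' + prefixed + ' {';
  });
}
```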
If you are unable to use a frame or iframe, try:
extract the HTML inside the BODY and insert it into the destination DIV
extract the LINK and STYLE sections from the header
It's not very clean, though, but it will definitely work; you can insert a phpBB forum into another dynamic page using this technique. Take a look at http://www.clearerimages.com/forum/ for an example.