I'm optimizing my website according to Google's site optimization standards:
http://code.google.com/speed/page-sp...mageDimensions
For those who are familiar with it, I'm using Firebug -> "Page Speed" tool to analyse my site's 'weak' areas.
Regarding the above link about image dimensions, the question is this: I have lots of dynamic images on my site that are uploaded via a CMS and therefore vary in height/width. How important is it to specify image dimensions? If a page had, say, 5-10 images from the CMS, which option below is better:
a) Don't specify image dimensions
b) Use PHP's getimagesize function to get the image dimensions dynamically, and put them in the IMG tag as the "width" and "height" attributes
c) Update our database to store the width/height per image (I'm already querying other info from the image table in my database, such as the text for the "alt" attribute) and then access those columns for the IMG tag on the front end?
I think "c" is the best option but I'd like to hear if anyone has any recommendation or statistics on which option is better. Obviously "c" means we need to query more data from the database.
We're also using a separate database server, and making use of browser caching. (http://code.google.com/speed/page-speed/docs/caching.html)
Thanks
c.

Setting the image size will give you better page-load behaviour. If you do not set the size, some parts of the page will resize while the page loads, and that is just plain ugly.

If you already query your database for the alt text, I doubt you will notice any performance problems if you load two more columns.
I actually tried option B on a large page as a test and found it created massive performance issues. I would say you're best to run with option C. Update your CMS to add these dimensions with each image added, and maybe run a cron job during a quiet time to backfill all existing images without dimensions.
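For what it's worth, a minimal sketch of option C, assuming a hypothetical images table with width and height columns and PDO for database access:

<?php
// At upload time: measure the image once and store the dimensions.
// The `images` table and its columns are a hypothetical schema.
function storeImageDimensions(PDO $db, int $imageId, string $path): void
{
    $size = getimagesize($path); // [0] => width, [1] => height
    if ($size === false) {
        throw new RuntimeException("Not a readable image: $path");
    }
    $stmt = $db->prepare('UPDATE images SET width = ?, height = ? WHERE id = ?');
    $stmt->execute([$size[0], $size[1], $imageId]);
}

// At render time: the dimensions come back with the row you already
// fetch for the alt text, so no extra query is needed.
function imgTag(array $row): string
{
    return sprintf(
        '<img src="%s" alt="%s" width="%d" height="%d">',
        htmlspecialchars($row['src']),
        htmlspecialchars($row['alt']),
        $row['width'],
        $row['height']
    );
}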
Since multiple requests can slow down the speed at which a site loads, I was thinking that in the case of a gallery, would one large image containing all the thumbnails be better than loading individual thumbnails?
The large image would then use PHP to "chop up" the thumbnails and place them in the relevant locations on the page.
My main concern is would this have a negative impact on SEO? Since Google would only see one large image file, instead of many smaller ones. Would a way around this be to set the src of all thumbnails to redirect to the script that handles the thumbnail generation, where the image file name refers to a particular set of coordinates for that image?
As a rule of thumb: for buttons, icons and stuff like that, use image sprites (one large image combining all the small ones, with CSS showing only the part between specific coordinates); for 'real' content images, just use separate images.
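For illustration, a minimal sprite sketch in CSS, assuming a hypothetical icons.png holding two 16x16 icons stacked vertically:

/* One HTTP request fetches icons.png; CSS shows only the relevant slice. */
.icon {
    display: inline-block;
    width: 16px;
    height: 16px;
    background-image: url('icons.png'); /* hypothetical combined image */
}
.icon-home   { background-position: 0 0; }     /* slice at the top */
.icon-search { background-position: 0 -16px; } /* slice 16px down  */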
There are several reasons for this rule. Icons, buttons and so on are images that appear on almost every page of your site, often multiple times on the same page, so it is really useful to combine them: it is very inefficient to open a new HTTP connection just to download an icon of 1kb (or less), and imagine what happens if you use hundreds. Furthermore, this type of image is not important at all for your SEO rank, only for the look of your site (and Google doesn't care whether your site is ugly as hell or beautiful as a princess).
On the other hand, 'content' images, such as thumbnails or photos of your holiday or baseball tournament, are often big enough to rule out the efficiency argument. As you can see in the Chrome developer tools or Firebug, the browser starts the download of all images simultaneously, so downloading one image is pretty much as fast as downloading a hundred. But if you combine a hundred images, the download will be slower, as you have to download a larger piece of data in one go. In comparison: pushing 2 gallons of water through one hose will take longer than pushing the same 2 gallons through 10 hoses. (Of course this metaphor has its holes, but it illustrates my point.)
More importantly: Google reads the img tags and uses the filename (src), the title and (less importantly) the alt attributes to determine how your image should relate to your SEO rank. Images do have a relevant effect on your SEO rank! But Google also knows whether it is the same image showing or a different one, so a sprite wouldn't help you here. A script with parameters saying which part of the image has to be loaded wouldn't help you at all; I believe if you think it over you can figure out why ;)
So don't bother merging the thumbnails and such. If you want to improve speed, move your attention to caching and speeding up transmission. Some really simple improvements can be made by using, for example, gzip compression (google '.htaccess gzip' for that), proper caching headers, etc.
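For example, a minimal .htaccess sketch along those lines, assuming an Apache server with mod_deflate and mod_expires enabled:

# Compress text-based responses before sending them (mod_deflate).
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static images for a month (mod_expires).
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
</IfModule>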
You got it right: it is always better to download one large image and get all the images from there. I guess you meant JavaScript for the "chop up" part, because you have to do that on the client side. In terms of performance this is a very good idea, and a lot of sites do it. Another idea is to use smaller images and resize them on the client side; just be careful that the resizing doesn't hurt the resolution of the image.
I'm not so sure about this being negative for SEO, and as far as I know Google doesn't execute any JavaScript, so I don't think that workaround would work. But in all honesty I'm not sure about this; in my last job we never considered images as having a big impact on SEO.
I'm building a web app where users can store links, with 200x200 pictures associated. By default, I'd like to crawl the link for images and then return thumbnails of the biggest ones (from which the user can select the "official" thumbnail). I want this all to happen via AJAX. My question is: what is the best way to do this?
Currently, I'm using the PHP Simple HTML DOM Parser to scan a URL. I then find the src attribute of all the <img> tags, use getimagesize to store the size of the image located at each URL, sort the array from biggest to smallest, and return the 5 biggest image URLs via AJAX to the client. The client then sends a separate AJAX request for each one, which makes a server-side ImageMagick script download the image, cut it down to a thumbnail, save it in a temporary folder, and return the URL of the thumbnail, which the client finally loads in his browser.
Needless to say, this is a little complicated and probably really inefficient. Running this process on http://en.wikipedia.org takes about 10-15 seconds from start to finish. I'm not certain there are any more efficient ways, however.
I'd do it in one AJAX request, with the script automatically resizing the biggest 5 images on the first pass, saving them, and returning a JSON array with the resized image URLs for the client.
You should probably use PHP's DOMDocument class to grab/parse the html page.
getimagesize() means you have to download and process each image. Perhaps you should consider simply showing the user ALL the images, by placing img tags whose src points back to the images on the original page; you can size these however you like with the width/height attributes. This way you do not have to download or process a single image until the user has actually selected one for the thumbnail.
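Putting those two suggestions together, a rough sketch of a single endpoint (hypothetical, with validation and error handling omitted) that parses the page with DOMDocument and returns the image URLs as JSON without downloading any of them:

<?php
// Fetch the remote page and collect the img src values in one pass.
$url  = $_GET['url'] ?? '';
$html = file_get_contents($url); // real code should validate $url first

$doc = new DOMDocument();
libxml_use_internal_errors(true); // tolerate real-world malformed HTML
$doc->loadHTML($html);

$sources = [];
foreach ($doc->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src');
    if ($src !== '') {
        $sources[] = $src;
    }
}

header('Content-Type: application/json');
echo json_encode(array_values(array_unique($sources)));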
Interested if / how you solved this?
In the end I looped through the images doing getimagesize() until both height and width were over a certain size, and then broke the loop.
This way it's slightly more efficient, as it only downloads as many images as it needs.
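A sketch of that early-exit loop, assuming $sources holds the candidate image URLs and 200x200 is the minimum acceptable size:

<?php
// Stop downloading as soon as one image is big enough for a 200x200 thumbnail.
$chosen = null;
foreach ($sources as $src) {
    $size = @getimagesize($src); // fetches the remote image to measure it
    if ($size !== false && $size[0] >= 200 && $size[1] >= 200) {
        $chosen = $src;
        break; // no need to fetch the remaining images
    }
}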
I have read a few similar questions and answers, but none completely address my issue.
Here is My Scenario:
I have what is similar to a TinyMCE kind of editor (a home-brew version, though). It lets users enter some text, an image or two, etc. I have code that takes the items in there and renders them into a smaller div (essentially a thumbnail) in real time.
Here is What I Want to Do
Ultimately, the user may want to use their 'page' somewhere else, so I would like to let them go to a screen, view thumbnails of each page, and pick one.
Here is the Problem
Obviously, I could just use the same thumbnail code to render each page thumbnail. However, it can be bandwidth intensive (each page could have several images, not to mention the calculation would have to be performed many times - we are talking perhaps 40 to 50 thumbnails on a preview page).
So, I wanted to try to take the thumbnail div, and somehow create a png or jpg when they save the page in the editor (so the code for the page, and also a thumbnail image), and push it up to my PHP script to save the image to the server.
My first thought was that maybe canvas could do that, but there is the issue of translating the text and images onto the canvas first, which may or may not be possible.
So there it is. I am interested in any and all options, including commercial libraries if available that will do this -- the only thing is, I would like it to be in JavaScript.
You may want to look at:
http://html2canvas.hertzen.com/
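A minimal sketch with html2canvas's promise-based API (newer versions; older releases used an onrendered callback instead), posting the PNG to a hypothetical save-thumb.php:

// Render the thumbnail div to a canvas, then POST the PNG to the server.
// '#thumbnail' and 'save-thumb.php' are hypothetical names.
html2canvas(document.querySelector('#thumbnail')).then(function (canvas) {
    fetch('save-thumb.php', {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: 'image=' + encodeURIComponent(canvas.toDataURL('image/png')),
    });
});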
A similar question was already asked:
Screen Grab with PHP and/or Javascript?
So I just put up a website for my high school and realized a lot of images are stretched, especially on this page:
www.eriesd.org/central/central2/staff.php
What would be the best way to make the images not so stretched?
I was thinking of adding a div and setting the image as a background with "center center" or "50% 50%". Also, on the Career and Tech pages, I noticed the info page doesn't load in IE, but the other pages load fine. Has anyone else ever had this problem?
I'm basically getting the location and option from the menu and making an AJAX request which loads one of the three layouts; that layout connects to my database and gets information depending on the option and location.
Assuming the pictures are uploaded through some kind of PHP CMS, the first thing I would do is process the images correctly at the moment they are uploaded: apart from the bigger image, you would need to generate a thumbnail that fits the size you need for that page.
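A minimal GD sketch for that upload-time step, assuming JPEG uploads and a hypothetical 200px thumbnail width:

<?php
// Create a proportional thumbnail when the image is uploaded (GD, JPEG assumed).
function makeThumbnail(string $srcPath, string $destPath, int $thumbWidth = 200): void
{
    [$width, $height] = getimagesize($srcPath);
    $thumbHeight = (int) round($height * $thumbWidth / $width);

    $src   = imagecreatefromjpeg($srcPath);
    $thumb = imagecreatetruecolor($thumbWidth, $thumbHeight);
    imagecopyresampled($thumb, $src, 0, 0, 0, 0, $thumbWidth, $thumbHeight, $width, $height);
    imagejpeg($thumb, $destPath, 85); // 85 = quality: much smaller file, little visible loss

    imagedestroy($src);
    imagedestroy($thumb);
}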
I would also recommend adding a notice for people uploading a picture that this specific picture needs to have a landscape format, as that is what you are using on the page.
CSS solutions would be my last resort to iron out small issues.
Edit: Apart from that, I would seriously reconsider publishing all the e-mail addresses like that, and I would add some pagination, as the page now takes a long time to load (especially with all the images being a lot bigger than you need them to be...).
They are stretched because you specified both the width and height attributes on the <img> tag. If the actual image has different dimensions, the browser has no option but to distort the image to make it fit the specified height and width.
Just omit the height, the width, or both, and the images will keep their natural proportions.
You should set only the height on the img, and add display:inline-block;width:200px;text-align:center CSS to the anchor if you want the white area on either side (anchors are inline by default, so the width only applies once the display is changed). Omitting the width will shrink the whitespace around the image.
<a class="image" style="display:inline-block;width:200px;text-align:center">
<img src="http://www.eriesd.org/central/central2/images/staff/kranking.jpg" alt="missing photo" height="112">
</a>
I'll answer your first question, concerning images. The real problem is that your images are not sized to fit the space you want them to fill. One of them that I inspected was a 6MP (2848x2144) image weighing in at 1.5MB, and there were many more of this approximate size and dimension. Any one of those images is larger than the entire page should be, by quite a lot. The first step is getting images to the size you need them to be. Your page is nearly 19MB: not only do most browsers do a lousy job of scaling images, you're also sending a ton of extra data and making the page load very slowly for users without very fast connections. Imagine a user on a mobile browser waiting on this and chewing through their data plan! A user with DSL might need several minutes; dial-up could require hours.
If you're uploading them manually, resize them first: figure out a size constraint, then resize and crop to it. If you're using a CMS, find the settings or plugins (or customize it) to make a smaller thumbnail version, and use that.
To keep the layout looking nice and equal, the only thing you can do is either stretch them, as it is now, or, even better, crop the images a bit and resize them. You can probably do it programmatically for most of the pictures; just assume that the top center is where the head will be. You have stretched-picture issues all over the site, though.
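If you go the cropping route, a rough GD sketch that scales each picture to cover the target box and keeps the top-center slice (JPEG input assumed; the target size is whatever your layout needs):

<?php
// Crop-to-fill: scale so the target box is covered, then keep the top-center slice.
function cropTopCenter(string $srcPath, string $destPath, int $tw, int $th): void
{
    [$w, $h] = getimagesize($srcPath);
    $scale = max($tw / $w, $th / $h);   // scale factor that covers the target box
    $cropW = (int) round($tw / $scale); // size of the source window to keep
    $cropH = (int) round($th / $scale);
    $x = (int) (($w - $cropW) / 2);     // center horizontally
    $y = 0;                             // anchor at the top, where the head usually is

    $src = imagecreatefromjpeg($srcPath);
    $dst = imagecreatetruecolor($tw, $th);
    imagecopyresampled($dst, $src, 0, 0, $x, $y, $tw, $th, $cropW, $cropH);
    imagejpeg($dst, $destPath, 85);
    imagedestroy($src);
    imagedestroy($dst);
}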
As for the Career & Tech pages, they're still actually being loaded (at least in the latest IE) if you look at the source, but they're not being shown for some reason, so, either you have some CSS or JavaScript issues with .post or .content. It even pops up for a second sometimes and then disappears.
If you specify only a width, the height will be set proportionally and thus prevent stretching of your images.
I'm a big fan of Yahoo's recommendations for speeding up websites. One of the recommendations is to combine images where possible to cut down on size and the number of requests. However, I've noticed that while it can be easy to use CSS sprites for layouts, other image uses aren't as easily combined. The primary example I'm thinking of is a blog or article list, where each blog or article also has an image associated with it. Those images can greatly affect load time and page size, especially if they aren't optimized. What I'm looking for, in concept or in practice, is a way to dynamically combine those images while running them through lossless compression using PHP.
A few added thoughts or concerns:
Combining the images and generating a dynamic CSS stylesheet to position the backgrounds of the images might be one way to go about it, but I also worry about accessibility and semantics. As far as I understand, CSS images should be used for layout elements, and the img tag (with the alt attribute) should be used for images that are meant to convey information. I could set the image as a background to a div element and substitute a title attribute for the alt attribute, but I'm unsure about the accessibility and semantic implications of doing so.

Might the GD library be a good candidate for something like this?

Can you recommend other options?
I wouldn't go down this route if I were you. Sure, you may save a few bytes of protocol overhead by reducing the number of requests, but this would more than likely end up being self-defeating.
Imagine this scenario:
A blog site whose front page has 10 articles at a time. Each article has its own image associated with it. To save a byte or two of transfer time, you programmatically create a composite image of all 10 article images. You now have one of two problems:
1. You must update the composite image each time a new post is made, as the set of the 10 most recent images will have changed.
2. You decide to create a new composite on each request, on the fly.
Obviously, #1 is preferable here, and would not be difficult to implement. However, what if a user searches for all posts tagged with the word "SQL"? You are unlikely to have a composite image of the first 10 results already created for this simple query, let alone a more complex one. Also, what happens if you want to update or delete an image? Once again you'd have to trigger the background creation of the composite.
How about an RSS aggregator, like Google Reader? It wouldn't have the required logic to figure out which portion of a composite image it needed to display, and would probably display the full image. (I mention Google Reader because I very rarely visit blog sites directly, tending to trust an RSS aggregation service like Reader.)
If it were me, I'd leave the single images alone. With modern connection speeds, the tradeoff between additional bandwidth overhead and on-server processing time is unlikely to win you any great gains.
Having said that, if you decide to go down this route anyway, I'd say the GD library is an excellent place to start.
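If you do experiment with it anyway, a bare-bones GD sketch that stacks equally-sized article thumbnails into one vertical strip (all names hypothetical):

<?php
// Stack a list of equally-sized JPEG thumbnails into one vertical composite.
// Each image is then shown with background-position: 0 -(index * $h)px in CSS.
function buildVerticalComposite(array $paths, string $outPath, int $w, int $h): void
{
    $composite = imagecreatetruecolor($w, $h * count($paths));
    foreach ($paths as $i => $path) {
        $img = imagecreatefromjpeg($path);
        imagecopy($composite, $img, 0, $i * $h, 0, 0, $w, $h);
        imagedestroy($img);
    }
    imagepng($composite, $outPath); // PNG keeps the assembled strip lossless
    imagedestroy($composite);
}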
You'd almost certainly be better off reducing the filesize of the images in articles than combining them. I'd agree that there might be accessibility issues with the method you suggest. Also, I suppose it depends on what you mean by "dynamic": if you're thinking of combining those images and generating CSS on each page load, you might well find that this results in slower page load times for users with average connection speeds.
As to your second point, GD could certainly handle that. A better use of GD for reducing page load times might be reducing the image quality of your article images to shrink their filesizes, at article creation time rather than at page load.
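That could be as little as re-saving each upload through GD at a lower JPEG quality (the path and the quality of 75 here are arbitrary choices):

<?php
// Recompress an uploaded JPEG at article-creation time to cut its filesize.
$img = imagecreatefromjpeg('uploads/article-image.jpg'); // hypothetical path
imagejpeg($img, 'uploads/article-image.jpg', 75);        // quality 0-100; lower = smaller
imagedestroy($img);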