Since multiple requests can slow down how quickly a site loads, I was wondering whether, in the case of a gallery, one large image containing all the thumbnails would be better than loading individual thumbnails.
PHP would then "chop up" the large image into individual thumbnails and place them in the relevant locations on the page.
My main concern is whether this would have a negative impact on SEO, since Google would only see one large image file instead of many smaller ones. Would a way around this be to set the src of each thumbnail to the script that handles the thumbnail generation, where the image file name refers to a particular set of coordinates within the large image?
As a rule of thumb: for buttons, icons, and stuff like that, use image sprites (one large image combining all the images, with CSS showing only the part between specific coordinates); for 'real' content images, just use separate images.
There are several reasons for this. Icons, buttons, and so on are images that appear on practically every page of your site, often multiple times on the same page, so it is really useful to combine them: it is very inefficient to open a new HTTP connection to download an icon of 1 kB (or less), and imagine what happens if you use hundreds of them. Furthermore, these images are not important at all for your SEO rank, only for the look of your site (and Google doesn't care whether your site is ugly as hell or beautiful as a princess).
'Content' images, on the other hand, such as thumbnails or photos of your holiday or baseball tournament, are usually big enough to rule out the efficiency argument. As you can see in the Chrome developer tools or Firebug, the browser starts downloading all images simultaneously, so downloading one image is pretty much as fast as downloading a hundred. But if you combine a hundred images, the download will be slower, because you have to download a larger piece of data in one go. By comparison: pushing 2 gallons of water through one hose takes longer than pushing the same 2 gallons through 10 hoses. (Of course this metaphor has its holes, but it illustrates the point.)
More importantly, Google reads the img tags and uses the filename (src), the title, and (less importantly) the alt attribute to determine how each image should contribute to your SEO rank. Images do have a relevant effect on your SEO rank! But Google also knows whether it is the same image showing or a different one, so a sprite wouldn't help you here. A script with parameters saying which part of the image has to be loaded wouldn't help you at all; if you think it over, I believe you can figure out why ;)
So don't bother merging the thumbnails and the like. If you want to improve speed, turn your attention to caching and speeding up transmission. Some really simple improvements can be made by using, for example, gzip compression (google '.htaccess gzip' for that), proper caching headers, etc.
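If you can't touch .htaccess and want to set this from PHP instead, a minimal sketch might look like this (the one-week max-age is just an illustrative value):

    <?php
    // Minimal sketch: compress the response and send long-lived cache headers
    // straight from PHP (the usual place for this is .htaccess via mod_deflate
    // and mod_expires, so treat this as a fallback).
    ob_start('ob_gzhandler');                        // gzip the output if the client accepts it

    $week = 7 * 24 * 3600;
    header('Cache-Control: public, max-age=' . $week);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $week) . ' GMT');

    // ...render the gallery page as usual...
    echo '<img src="/images/thumbs/photo1.jpg" alt="Photo 1">';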
You've got it right: it is generally better to download one large image and pull all the thumbnails from it. I guess you meant JavaScript for the "chop up" part, because you have to do that on the client side. In terms of performance this is a very good idea, and a lot of sites do it. Another idea is to use smaller images and resize them on the client side; just be careful that the resizing doesn't hurt the resolution of the image.
I'm not so sure about this being negative for SEO, and as far as I know Google doesn't execute any JavaScript, so I don't think that workaround would work. But in all honesty I'm not sure about this; at my last job we never considered images to have a big impact on SEO.
I am developing a CakePHP application in which I want to show a PowerPoint slideshow to the end user, but with the condition that the user can only view the show, not download it.
Can anyone please suggest the best way to do this?
If the slideshow is based on images, you can split each image into 9, 16, or more squares and display the tiled result. That way, if a user decides to 'Save as' the image, he will get only 1/9th or 1/16th of the real slide. If the slideshow is quite big, it will be a pain to put all the pieces back together, which will discourage users from trying to save the slides.
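A rough GD sketch of that tiling idea, assuming JPEG slides and a 3x3 grid (the file names are just placeholders):

    <?php
    // Split slide.jpg into a 3x3 grid of tiles with GD.
    $src   = imagecreatefromjpeg('slide.jpg');
    $cols  = 3;
    $rows  = 3;
    $tileW = (int) (imagesx($src) / $cols);
    $tileH = (int) (imagesy($src) / $rows);

    for ($row = 0; $row < $rows; $row++) {
        for ($col = 0; $col < $cols; $col++) {
            $tile = imagecreatetruecolor($tileW, $tileH);
            // copy one rectangle of the source slide into this tile
            imagecopy($tile, $src, 0, 0, $col * $tileW, $row * $tileH, $tileW, $tileH);
            imagejpeg($tile, "slide_tile_{$row}_{$col}.jpg", 85);
            imagedestroy($tile);
        }
    }
    imagedestroy($src);

The page would then lay the tiles out edge to edge (for example in a 3x3 CSS grid or table) so they appear as one image.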
You can see such an implementation here - http://whatismycar.com/info/16540/ - the 4 images below the header are in Fancybox, and if you try to 'Save as' one of them you will save only a small tile of the original image.
Hope this helps.
It is impossible to prevent images from being downloaded from the internet, but this approach makes it harder for users. You can also hide the image's source path in the HTML by serving it through PHP; see the sketch below.
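A minimal sketch of that path-hiding idea, assuming a script called image.php and images stored outside the web root (both names are just assumptions):

    <?php
    // image.php?id=42 - serve an image without exposing its real filesystem path.
    // The id-to-file map is hard-coded for illustration; in practice it would
    // come from a database or a whitelist.
    $map = array(
        42 => '/var/private/slides/slide_01.jpg',
    );

    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
    if (!isset($map[$id]) || !is_file($map[$id])) {
        header('HTTP/1.1 404 Not Found');
        exit;
    }

    header('Content-Type: image/jpeg');
    header('Content-Length: ' . filesize($map[$id]));
    readfile($map[$id]);

The markup then uses <img src="image.php?id=42">, so the real path never shows up in the HTML.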
While I am no expert on this subject, something worth noting is what YouTube seems to be doing.
Ever notice how the whole video never loads if you pause it?
If you monitor the network tab during a video, you will see that they actually make hundreds or even thousands of requests for video segments from their servers, most likely using JS to clear the parts you've already watched from the cache.
That is also why going back to an earlier point in the video causes it to stall for a bit while it re-downloads the segment you want to see.
At the end of the day, PrtScn will trump all of your efforts, because the web browser does not have the privilege to control the keyboard outside of its own environment.
I've been on a project for the past few days and hit a problem displaying large quantities of images (20+ GB total, ~1-2 GB per directory) in a gallery on one area of the site. The site is built on the Bootstrap framework. I've been trying to make massive carousels, but they ultimately do not function fluidly due to the combined size of /images. Question A: in this situation, should I do I/O from a database and store the images there -- is that faster than keeping them in the /images folder on the front end?
And B) in my PHP script I need to set the directories to variables, iterate through them, and display the images in <li> elements, but how do I go about putting controls on memory usage so as not to overload the browser? Any additions, suggestions, or alternatives would be greatly appreciated. I'm looking for the most direct means to an end here.
Though the question is a little generic, here are some thoughts regarding your two questions:
A) No, performance pulling images from a database would most likely be worse than pulling straight from the file system. In general, it is not a good idea to store images or other binary data in databases unless you absolutely have to, because databases can't do much with this information and you are just adding an extra layer on top of the file system that doesn't need to be there. You would, however, want to store paths to images in your database, potentially along with other characteristics such as image dimensions, thumbnail paths, keywords, etc. Then your application would read the entries for the images to return the correct paths to the images.
B) You will almost certainly want to implement some sort of paging if you are displaying many hundreds or thousands of photos. If the final display must be a carousel, you will want to investigate the Javascript that drives it to determine how you could hook in a function that retrieves more results from your PHP application via an AJAX call when it reaches the end or near end of the current listing of images. If you are having problems with the browser crashing due to too many images, you will also want to remove images from the first part of the list of <li>s when you load new ones so that it keeps the DOM under control.
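A minimal sketch of the PHP side of such an AJAX endpoint (the images table, its columns, and the page size of 50 are all assumptions):

    <?php
    // images.php?page=2 - return one page of image paths as JSON so the
    // carousel can append them via AJAX instead of loading everything at once.
    $perPage = 50;
    $page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
    $offset  = ($page - 1) * $perPage;

    $pdo  = new PDO('mysql:host=localhost;dbname=gallery', 'user', 'pass');
    $sql  = sprintf('SELECT path, thumb_path FROM images ORDER BY id LIMIT %d OFFSET %d',
                    $perPage, $offset);
    $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);

    header('Content-Type: application/json');
    echo json_encode($rows);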
A) It's a bad idea to store that much binary data in a database. Even if the DB allows it, you shouldn't: it gives you much higher memory consumption, because all your data sits in the database's memory space and is then copied into PHP's memory space for you to handle, which eats up twice the memory, plus the overhead of running a database server, querying it, etc. So no, it's slower to use a database; accessing the filesystem directly is faster, and if you also use Varnish or another front-end caching system, you'll be able to serve content much faster still.
What I would do is store the files on the filesystem. The best servers for static serving like that are G-WAN or NGINX, but do your own reading and decide for yourself what suits you best. The point is: stay away from Apache, and preferably host all those static files on a separate server running a lightweight HTTP server.
ProTip: save multiple copies of the same image at scaled-down sizes, for example 50% and 25% of the original, so you can send the thumbnails first for quick browsing; then, when a user decides to view an image, serve the 50% or 100% version depending on their screen size. That way you save bandwidth and memory, and you also save mobile users a big 3G bill.
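A quick GD sketch of generating those scaled-down copies (imagescale() needs PHP 5.5+, and the file names are just placeholders):

    <?php
    // Save 50% and 25% copies of an original image alongside it.
    $src   = imagecreatefromjpeg('photo.jpg');
    $width = imagesx($src);

    foreach (array(50, 25) as $percent) {
        $scaled = imagescale($src, (int) ($width * $percent / 100)); // height keeps the aspect ratio
        imagejpeg($scaled, "photo_{$percent}.jpg", 85);
        imagedestroy($scaled);
    }
    imagedestroy($src);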
B) This is where it makes sense to use a database: you can index all the directories into a database and use it to store the location of each image in the filesystem, plus perhaps some tags, maybe even the number of views, etc.
On the front end you'd implement a script that fetches, for example, 50 thumbnails per page; the user can scroll around using some fancy jQuery, and when you need to fetch more, you simply get a new result set with 50 more thumbs, and so on.
This way you save yourself memory and bandwidth, and the users will even thank you for such a lightweight browsing experience!
Another tip:
If you want to be able to handle bigger traffic, consider using a CDN; there are many CDN services that aren't as expensive as Amazon S3, and a simple search will give you tons of resources!
Happy hacking!
I'm working on a site that displays images from a variety of other sites by having elements that link directly to those sites. Some of the images are high resolution and some are not. When many high-res images are displayed, the website's performance goes down. Is there any way to display the externally linked images at a consistent (reasonable) resolution so that less bandwidth is consumed and site performance does not falter?
No, not really. If these are images out of your control and you're simply linking to them, there is not much you can do.
One option is to create some kind of pre-loader with a progress bar. You'd have to download all the images, process them into smaller images, and then display them. However, you're not really saving any time doing this, since you'll still be downloading the full images in the first place.
You just need a class name on the elements that the images go into. You probably have a bunch of divs with different-sized images in them. You need <div class="mySize">...</div> and then .mySize {width:100px; height:100px;}.
On the listing pages of my website I have to show small thumbnails of images that are used at larger sizes on the detail pages. To display the thumbnails in the listing I am scaling them down with the height and width attributes of the <img> tag.
I know this is never a good idea, because it makes the page heavy and it takes time to load.
Is there any way to have the images automatically cropped to a given height and width?
If you have PHP available, you can try phpThumb, which does all of that for you and much more. It can crop, zoom-crop, transform, blur, adjust contrast, etc., and it auto-creates the thumbnails and keeps them in a cache so it doesn't have to re-crop each time the image is loaded.
It's also VERY simple to install and use, which is a big plus.
You can't crop things client-side to make them lightweight, because by then all the heavy lifting (transferring the files) has already been done. Not to mention it would be very intensive for your end users to be doing image manipulation. You will need to create the thumbnails server side. You should post a question detailing what server-side technology you are using (C#, PHP, etc). Ideally you would cache them or create them ahead of time, so that you only do it once and save your server from unneeded work too.
Actually, don't post a follow-up question about which server-side tech you are using; this has been asked a ton of times on SO. Search for how to do it, e.g. "php thumbnail creation".
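For reference, a rough sketch of what a plain-GD, cached thumbnail script might look like (the directories and the fixed 150x150 crop are just assumptions):

    <?php
    // thumb.php?img=photo.jpg - return a cached 150x150 centre-cropped thumbnail.
    $name  = basename($_GET['img']);          // drop any path components
    $src   = __DIR__ . '/images/' . $name;
    $cache = __DIR__ . '/thumbs/' . $name;
    $w     = 150;
    $h     = 150;

    if (!is_file($cache)) {                   // generate once, then reuse the cached file
        $img  = imagecreatefromjpeg($src);
        $side = min(imagesx($img), imagesy($img));        // largest centred square
        $x    = (int) ((imagesx($img) - $side) / 2);
        $y    = (int) ((imagesy($img) - $side) / 2);

        $thumb = imagecreatetruecolor($w, $h);
        imagecopyresampled($thumb, $img, 0, 0, $x, $y, $w, $h, $side, $side);
        imagejpeg($thumb, $cache, 85);
        imagedestroy($thumb);
        imagedestroy($img);
    }

    header('Content-Type: image/jpeg');
    readfile($cache);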
I'm a big fan of Yahoo's recommendations for speeding up websites. One of the recommendations is to combine images where possible to cut down on size and the number of requests. However, I've noticed that while it can be easy to use CSS sprites for layouts, other image uses aren't as easily combined. The primary example I'm thinking of is a blog or article list, where each blog or article also has an image associated with it. Those images can greatly affect load time and page size, especially if they aren't optimized. What I'm looking for, in concept or in practice, is a way to dynamically combine those images while running them through a loss-less compression using PHP.
A few added thoughts or concerns:
Combining the images and generating a dynamic CSS stylesheet to position the backgrounds of the images might be one way to go about it, but I also worry about accessibility and semantics. As far as I understand, CSS images should be used for layout elements and the img tag (with the alt attribute) should be used for images that are meant to convey information. I could set the image as a background to a div element and substitute a title attribute for the alt attribute, but I'm unsure about the accessibility and semantic implications of doing so.

Might the GD library be a good candidate for something like this? Can you recommend other options?
I wouldn't go down this route if I were you. Sure, you may save a few bytes of protocol overhead by reducing the number of requests, but this would more than likely end up being self-defeating.
Imagine this scenario:
A blog site whose front page shows 10 articles at a time. Each article has its own image associated with it. To save a byte or two of transfer time, you programmatically create a composite image of all 10 article images. You now have one of two problems:
1. You must update the composite image each time a new post is made, as the most recent 10 images will have a modified set of content.
2. You decide to create a new composite on each request, on the fly.
Obviously, #1 is preferable here, and would not be difficult to implement. However, what if a user searches for all posts tagged with the word "SQL"? You are unlikely to have a composite image of the first 10 results already created for this simple query, let alone a more complex one. Also, what happens if you want to update or delete an image? Once again you'd have to trigger the background creation of the composite.
How about an RSS aggregator, like Google Reader? It wouldn't have the required logic to figure out which portion of a composite image it would need to display, and would probably display the full image. (I mention Google Reader because I very rarely visit blog sites directly, tending to trust to an RSS aggregation service like Reader)
If it were me, I'd leave the individual images alone. With modern connection speeds, the tradeoff between additional bandwidth overhead and on-server processing time is unlikely to win you any great gains.
Having said that, if you decide to go down this route anyway, I'd say the GD library is an excellent place to start.
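For what it's worth, a minimal GD sketch of that starting point, stitching a few article images into one horizontal strip (the file names and the uniform 200x150 size are assumptions):

    <?php
    // Combine several article thumbnails into one horizontal sprite image.
    $files = array('article1.jpg', 'article2.jpg', 'article3.jpg');
    $w = 200;   // assumed uniform thumbnail width
    $h = 150;   // assumed uniform thumbnail height

    $sprite = imagecreatetruecolor($w * count($files), $h);
    foreach ($files as $i => $file) {
        $img = imagecreatefromjpeg($file);
        imagecopyresampled($sprite, $img, $i * $w, 0, 0, 0, $w, $h, imagesx($img), imagesy($img));
        imagedestroy($img);
    }
    imagejpeg($sprite, 'articles_sprite.jpg', 85);
    imagedestroy($sprite);
    // Each article would then show the sprite as a CSS background offset by -$i * $w pixels.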
You'd almost certainly be better off reducing the filesize of the images in articles than combining them. I'd agree that there might be accessibility issues with the method you suggest. Also, I suppose it depends on what you mean by "dynamic": if you're thinking of combining those images and generating CSS on each page load, you might well find that this results in slower page load times for users with average connection speeds.
As to your second point, GD could certainly handle that. A better use of GD for reducing page load times would be to reduce the image quality of your article images to shrink their filesizes at article creation time, not at page load.
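A tiny sketch of that creation-time step (the quality value of 75 and the path are just illustrative; tune the quality by eye):

    <?php
    // Re-save an uploaded article image at a lower JPEG quality when the article is created.
    $path = 'uploads/article-image.jpg';      // assumed upload location
    $img  = imagecreatefromjpeg($path);
    imagejpeg($img, $path, 75);               // overwrite the original with a smaller file
    imagedestroy($img);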