I'm working on a site that displays images from a variety of other sites by using elements that link directly to those sites. Some of the images are high resolution and some of them are not. When many high-res images are being displayed, the site's performance suffers. Is there any way to display the externally linked images at a consistent (reasonable) resolution so that less bandwidth is consumed and site performance does not falter?
No, not really. If these are images out of your control and you're simply linking to them, there is not much you can do.
One option is to create some kind of pre-loader with a progress bar. You'd have to download all the images, process them into smaller images, then display them. However, you're not really saving any time doing this, since you'll still be downloading the full-size images in the first place.
You just need a class name on the element that the images are going into. You probably have a bunch of divs with different-sized images in them. You need <div class="mySize">...</div> and then .mySize {width:100px; height:100px;}.
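For illustration, a minimal sketch of that idea (the class name and dimensions are made up). Note that targeting the img inside the div is what actually constrains the pictures, and that this only changes the rendered size; the full-size files are still downloaded:

<style>
.mySize img { width: 100px; height: 100px; }
</style>

<div class="mySize">
    <img src="http://example.com/remote-image.jpg" alt="remote image">
</div>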
I'm currently rewriting a website that needs a lot of different sizes for each image. In the past I handled this by creating thumbnail images for all sizes at upload time. But now I have doubts about its performance, because I've had to change my design and half of my images are no longer the right size. So I'm thinking of 2 solutions:
Keep doing this, and add a button to the backend to regenerate all the images. The problem is that I always need to know every size needed by every part of the site.
Only upload the full-size image and, when displaying it, put something like src="thumbs.php?img=my-image-path/image.jpg&width=120&height=120" in the src attribute. Then create the thumb and display it. My script would also check whether the thumb already exists; if it does, it doesn't need to be recreated, so it is just displayed. Every 5 days, launch a script via a cron task to delete all the thumbs (to be sure only the useful ones are kept).
I think the second solution is better, but I'm a little concerned by the fact that I need to call PHP every time an image is shown; even when the thumb has already been created, it's PHP that serves it...
Thanks for your advice
Based on the original question and subsequent comments, it sounds like on-demand generation would be suitable for you, as you don't seem to have a demanding environment in terms of absolutely minimizing download time for the end client.
It seems you already have a grasp of the option of giving your <img> tags a src value that points to a PHP script, with that script either serving a cached thumbnail if it exists, or generating it on the fly, caching it, and then serving it up. So let me give you another option.
Generally speaking, using PHP to serve up static resources is not a great idea as you begin to scale your site, because:

It requires the additional overhead of invoking PHP to serve these sorts of requests, something a basic web server like Apache or Nginx is much more optimized for. Your site will be able to handle less traffic per server, because it is using extra memory, CPU, etc. in order to serve up this static content.

It makes it hard to move those static resources into a single repository outside of the web servers (such as a CDN). It means you have to duplicate your files on each and every web server powering the site.
As such, my suggestion would be to still serve the images as static files via the web server, but generate thumbnails on the fly if they are missing. To achieve this, you can create a custom redirect rule or 404 handler on the web server, so that requests in your thumbnail directory which do not match an existing thumbnail image are redirected to a PHP script that generates the thumbnail and serves up the image (without the browser even knowing). Future requests for that thumbnail are then served as a static image.
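As a rough sketch of that rewrite approach on Apache (the directory names, 120x120 dimensions, and GD-based generator are illustrative assumptions, not a drop-in implementation):

# .htaccess - if a requested thumbnail doesn't exist yet, hand the request to PHP
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^thumbs/(.+)$ /thumb.php?img=$1 [L]

<?php
// thumb.php - generate the missing thumbnail, cache it, then serve it this once.
$img    = basename($_GET['img']);   // crude sanitizing, enough for the sketch
$source = 'images/' . $img;
$thumb  = 'thumbs/' . $img;

$src = imagecreatefromjpeg($source);           // assumes JPEG sources
$dst = imagecreatetruecolor(120, 120);
imagecopyresampled($dst, $src, 0, 0, 0, 0, 120, 120, imagesx($src), imagesy($src));
imagejpeg($dst, $thumb);                       // cached; future requests are static

header('Content-Type: image/jpeg');
readfile($thumb);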
This scales quite nicely because, if in the future you need to move your static images to a single server (or CDN), you can just use an origin-pull mechanism that tries to get the content from your main servers, which will auto-generate the thumbnails via the same mechanism I just mentioned.
Use the second option if you don't have much storage to spare, and the first if you don't have much CPU to spare.
Or you can combine them: generate and store the image the first time the PHP thumbnail generator is called, and from then on just serve back the cached image.
With this solution you'll have only the necessary images, and if you want you can delete the older ones from time to time.
I've been on a project for the past few days and hit a problem displaying large quantities of images (20+ GB total, ~1-2 GB per directory) in a gallery on one area of the site. The site is built on the Bootstrap framework. I've been trying to make massive carousels that ultimately do not function fluidly due to the combined size of /images. Question A: In this situation do I need I/O from a database and to store the images there; is this faster than an /images folder on the front end?
And B) in my PHP script I need to set directories to variables, iterate through them, and display the images in <li> elements, but how do I go about putting controls on memory usage so as to not overload the browser? Any additions, suggestions, or alternatives would be greatly appreciated. I'm looking for the most direct means to the end here.
Though the question is a little generic, here are some thoughts regarding your two questions:
A) No, performance pulling images from a database would most likely be worse than pulling them straight from the file system. In general, it is not a good idea to store images or other binary data in a database unless you absolutely have to, because the database can't do much with that information and you are just adding an extra layer on top of the file system that doesn't need to be there. You would, however, want to store paths to the images in your database, potentially along with other characteristics such as image dimensions, thumbnail paths, keywords, etc. Then your application reads those entries to return the correct paths to the images.
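For instance, a minimal sketch of such a table (the name and columns are illustrative):

CREATE TABLE images (
    id         INT AUTO_INCREMENT PRIMARY KEY,
    path       VARCHAR(255) NOT NULL,   -- location on the file system
    thumb_path VARCHAR(255) NOT NULL,
    width      INT,
    height     INT,
    keywords   VARCHAR(255)
);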
B) You will almost certainly want to implement some sort of paging if you are displaying many hundreds or thousands of photos. If the final display must be a carousel, you will want to investigate the JavaScript that drives it to determine how you could hook in a function that retrieves more results from your PHP application via an AJAX call when it reaches (or nears) the end of the current listing of images. If the browser is crashing due to too many images, you will also want to remove images from the first part of the list of <li>s as you load new ones, so that the DOM stays under control. A rough sketch of that idea follows.
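This sketch assumes a hypothetical /more-images.php endpoint that returns the next batch of <li> markup, and uses jQuery since Bootstrap already depends on it:

var offset = 50; // how many images are already on the page

function loadMore() {
    $.get('/more-images.php', { offset: offset, count: 50 }, function (html) {
        $('#gallery').append(html);              // append the new <li> items
        $('#gallery li').slice(0, 50).remove();  // drop the oldest to keep the DOM small
        offset += 50;
    });
}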
A) It's a bad idea to store that much binary data in a database. Even if the DB allows it, you shouldn't do it; it also consumes much more memory, since all your data gets stored in the database's memory space and then copied into PHP's memory space for you to handle, which eats up twice the memory, plus the overhead of running a database server, querying, and so on. So no, it's slower to use a database; accessing the filesystem directly is faster. If you also use Varnish or another front-end caching system, you'll be able to serve content much faster still.
What I would do is store the files on the filesystem. The best servers for static serving like that are G-WAN or NGINX, but do your own reading and decide for yourself what suits you best. The point is: stay away from Apache, and preferably host all those static files on a separate server running a lightweight HTTP server.
Pro tip: save multiple copies of the same image at scaled-down sizes, for example one version at 50% and another at 25% of the original image size. This way you'll be able to send the thumbnails first for quick browsing; then, when a user decides to view an image, you serve up the 50% or 100% size, depending on their screen size. This way you save yourself bandwidth and memory, and you also save mobile users a big 3G bill.
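A quick sketch of saving those scaled copies, assuming Imagick is available and the paths are illustrative:

<?php
// Save 50% and 25% versions alongside the original.
$img = new Imagick('photos/original.jpg');
$w = $img->getImageWidth();
$h = $img->getImageHeight();

foreach (array(50, 25) as $pct) {
    $copy = clone $img;
    $copy->scaleImage((int) ($w * $pct / 100), (int) ($h * $pct / 100));
    $copy->writeImage("photos/original_{$pct}.jpg");
    $copy->destroy();
}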
B) This is where it makes sense to use a database: you can index all the directories into a database and use it to store the location of each image in the FS, perhaps some tags, and maybe even the number of views, etc.
On the front end you'd implement a script that fetches, for example, 50 thumbnails per page; the user can scroll around using some fancy jQuery, and when you need to fetch more, simply get a new result set with 50 more thumbs, and so on.
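Server-side, that paging can be a simple LIMIT/OFFSET query; a sketch assuming an existing PDO connection in $pdo and a table like the one sketched in the other answer:

<?php
// more-images.php - return the next batch of thumbnails as <li> items.
$offset = (int) $_GET['offset'];
$stmt = $pdo->prepare('SELECT thumb_path FROM images ORDER BY id LIMIT 50 OFFSET :o');
$stmt->bindValue(':o', $offset, PDO::PARAM_INT);
$stmt->execute();

foreach ($stmt as $row) {
    echo '<li><img src="' . htmlspecialchars($row['thumb_path']) . '"></li>';
}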
This way you'll save yourself memory and bandwidth, and your users will even thank you for such a lightweight browsing experience!
Another tip:
If you want to be able to handle bigger traffic, you might want to consider using a CDN; there are many CDN services that aren't as expensive as Amazon S3, and a simple search will give you tons of resources!

Happy hacking!
I've got an application I'm building with PHP which pulls photos from a database according to however many images there are for a given plant species. I have the script resize the otherwise large photos to 100x100. This process REALLY takes a bite out of page load time, and my computer's CPU gets up to 100% and works quite hard.
I think it's because all the images are loading at once... Is there a way to have each one load only when the previous one has finished? Or is there a more efficient way of rendering images like this? Here is the snippet that loads 'em:
// $images is a comma-separated list that ends with a trailing ", ",
// so filter out empty entries instead of unsetting the last key by hand.
$imagesArray = array_filter(explode(', ', $images), 'strlen');

echo '<tr><td>Images:</td><td>';
foreach ($imagesArray as $imgloc) {
    // Note: width/height only scale the image in the browser; the full-size
    // file is still downloaded, which is what hurts load time and CPU.
    echo '<a target="_blank" href="plant_images/' . $imgloc . '">'
       . '<img src="plant_images/' . $imgloc . '" width="100" height="100"'
       . ' alt="' . $row[2] . '" title="' . $row[2] . '" /></a> ';
}
Here is a screenshot of a partially loaded image on the page (this is a lot better than what happens other times! Seriously, some species have 10-12 images, and my computer takes about 15 seconds to load the page, painfully):
http://www.captainscall.site11.com/temp_stuff/slow-img.png
I found this already, mildly helpful.
Thank you kindly,
Khanahk
The user's browser will usually cache the images, so the user will only experience slow loading the first time he/she visits the page.
However, you should consider having thumbnails of all the images being displayed. You don't need to make these thumbnails yourself; in 2013 we have computers to do that.
A quick Google search will probably give you some software that can make resized copies of all your pictures in no time, or if you know some coding you can write your own script to do this. For instance, you can use Imagick in PHP (see http://php.net/manual/en/imagick.scaleimage.php). Then just store two sizes of each image on your server: one for thumbnails and one at higher resolution.
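A minimal sketch of such a script (the paths and 100x100 size are illustrative):

<?php
$image = new Imagick('plant_images/original.jpg');
$image->scaleImage(100, 100, true);   // true = "bestfit", preserves aspect ratio
$image->writeImage('plant_images/thumbs/original.jpg');
$image->destroy();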
One advantage of doing this is that you will decrease the outgoing data from your server (if your server has a lot of unique visitors, there will be a lot of traffic due to the size of the images). Your users will also experience less loading time on your site (as you said, waiting for images to load is boring).
You could probably tell the browser with JavaScript in what order to load the images, but that wouldn't solve your problem, which mainly is that your images are too big.
I think you should:
Server side: you need to cache an image once it has been generated, so that next time you can use the cached version. You can write it out as a physical image file for subsequent use.
Client side: you can use a library to lazy-load your images:
http://www.appelsiini.net/projects/lazyload
You can also implement a JavaScript function that loads images after some specified delay.
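For instance, with the Lazy Load plugin linked above, the documented pattern looks roughly like this (the file paths are illustrative):

<img class="lazy" data-original="img/photo.jpg" width="100" height="100">

<script src="jquery.js"></script>
<script src="jquery.lazyload.js"></script>
<script>
$(function() {
    $("img.lazy").lazyload();  // each image loads as it scrolls into view
});
</script>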
Since multiple requests can slow down the speed at which a site loads, I was thinking: in the case of a gallery, would one large image containing all the thumbnails be better than loading individual thumbnails?
PHP would then be used to "chop up" the large image into thumbnails and place them in the relevant locations on the page.
My main concern is whether this would have a negative impact on SEO, since Google would only see one large image file instead of many smaller ones. Would a way around this be to set the src of all thumbnails to the script that handles the thumbnail generation, where the image file name refers to a particular set of coordinates within the large image?
As a rule of thumb: for buttons/icons/stuff like that, use image sprites (one large image combining all the images, with CSS used to show only the part between specific coordinates); for 'real' content images, just use separate images.
This has several reasons. Icons, buttons and so on are images that appear on nearly every page of your site, often multiple times on the same page, so it is really useful to combine them: it is really inefficient to start a new HTTP connection to download an icon of 1 KB (or less), and imagine what happens if you use hundreds. Furthermore, this type of image is not important at all for your SEO rank, only for the look of your site (but Google doesn't care whether your site is ugly as hell or beautiful as a princess). A minimal sprite sketch follows.
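The file name, icon size and coordinates here are made up, purely for illustration:

/* icons.png is one 64x16 strip holding four 16x16 icons side by side */
.icon {
    width: 16px;
    height: 16px;
    background-image: url(icons.png);
}
.icon-home   { background-position: 0 0; }       /* first icon in the strip */
.icon-search { background-position: -16px 0; }   /* second icon */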
But on the other hand, 'content' images such as thumbnails or photos of your holiday or baseball tournament are often big enough to rule out the efficiency argument. As you can see in the Chrome developer tools or Firebug, the browser starts downloading all the images simultaneously, so downloading one image is pretty much as fast as downloading a hundred. But if you combine a hundred images, the download will be slower, as you have to download a larger piece of data in one go. In comparison: pushing 2 gallons of water through one hose will take longer than pushing the same 2 gallons through 10 hoses. (Of course this metaphor has its holes, but it illustrates my point.)
But more importantly: Google reads the img tags and uses the filename (src), the title and (less importantly) the alt attributes to determine how your image should relate to your SEO rank. Images do have a relevant effect on your SEO rank! But Google also knows whether it is the same image showing or a different one, so a sprite wouldn't help you here. A script with parameters saying which part of the image has to be loaded wouldn't help you at all; I believe if you think it over you can figure out why ;)
So don't bother merging the thumbnails and such. If you want to improve speed, turn your attention to caching and speeding up transmission. Some really simple improvements can be made by using, for example, gzip compression (Google ".htaccess gzip" for that), proper caching headers, etc.
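For example, a small .htaccess sketch along those lines (assuming Apache with mod_deflate and mod_expires enabled; adjust the types and lifetimes to taste):

<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
</IfModule>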
You got it right: it is always better to download one large image and pull all the images from there. I guess you meant JavaScript for the chopping-up part, because you have to do that on the client side. In terms of performance this is a very good idea, and a lot of sites do it. Another idea is to use smaller images and resize them on the client side; just be careful that the resizing doesn't affect the resolution of the image.
I'm not so sure about this being negative for SEO, and as far as I know Google doesn't execute any JavaScript, so I don't think that workaround would work. But in all honesty I'm not so sure about this; at my last job we never considered images a big factor in SEO.
We are building a web app which will have a lot of images being uploaded. What is the best solution for optimizing these images and storing them on the website?
Also, is there a way I can auto-enhance the images that are being uploaded?
1. Do not store images in the DB; store them in the file system (as real files). You'll probably need to store information about them in the DB, though, e.g., filename, time of upload, size, owner, etc.

2. Filenames must be unique. You might use yyyymmddhhiissnnnn, where yyyymmdd is the year, month and date; hhiiss the hour, minutes and seconds; and nnnn the number of the image within that second, i.e., 0001 for the first image, 0002 for the second, etc. This gives you unique filenames with fine ordering.

3. Think about a logical directory structure. Storing millions of images in a single folder is not a good idea, so you will need something like images/<x>/<y>/<z>/<filename>. This could also be spanned across multiple servers.

4. Keep the original images. You never know what you will want to do with them in a year or two. You can convert them to some common format, though; i.e., if you allow uploading JPG, PNG and other formats, you might store all of them as JPG.

5. Create and store every resized version of the images that your website needs. For example, social networks often have 3 kinds of resized images: one for displaying with user comments in various places (very small), one for the profile page (quite small, but not icon-sized; perhaps around 240x320 pixels) and one for "full size" viewing (often smaller than the original). Filenames of these related images should be similar to those of the originals, e.g., with suffixes _icon, _profile and _full appended. Depending on your resources and the number of images being uploaded at the same time, you can do this either in real time (in the same HTTP request) or with background processing (a cron job that continuously checks for new images to convert). A sketch tying suggestions 2, 3 and 5 together follows this list.
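In this sketch the per-second counter, md5-based sharding and variant sizes are illustrative assumptions, not a prescription:

<?php
// #2: unique, time-ordered filename: yyyymmddhhiiss + 4-digit counter.
// $counter would come from wherever you track uploads within one second
// (e.g., a DB sequence); here it is just a placeholder.
function makeImageFilename($counter) {
    return date('YmdHis') . sprintf('%04d', $counter) . '.jpg';
}

// #3: shard into nested folders so no single directory holds millions of files.
function makeImagePath($filename) {
    $hash = md5($filename);
    return 'images/' . $hash[0] . '/' . $hash[1] . '/' . $hash[2] . '/' . $filename;
}

// #5: store the original plus resized variants with the suggested suffixes.
function storeVariants($tmpUpload, $filename) {
    $path = makeImagePath($filename);
    move_uploaded_file($tmpUpload, $path);
    foreach (array('_icon' => 32, '_profile' => 240, '_full' => 800) as $suffix => $width) {
        $img = new Imagick($path);
        $img->scaleImage($width, 0);   // 0 = derive height from aspect ratio
        $img->writeImage(preg_replace('/\.jpg$/', $suffix . '.jpg', $path));
        $img->destroy();
    }
}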
As for auto-enhancing images: it is possible, but only if you know exactly what must be done to them. I think analyzing every image and deciding what should be done with it might be too complex and take too many resources.
All good suggestions from binaryLV above. Adding to his suggestion #5, you probably also want to optimize the thumbnails you create. When images are uploaded, they are likely to carry metadata that the thumbnails don't need. You can losslessly remove that metadata to make the thumbnails smaller, as suggested here: http://code.google.com/speed/page-speed/docs/payload.html#CompressImages. I personally run jpegtran on the images for my website to automatically optimize the thumbnails whenever they are created. If you ever need the metadata, you can get it from the original image.
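For reference, a typical jpegtran invocation looks like this (the file names are illustrative):

jpegtran -copy none -optimize -outfile thumb_opt.jpg thumb.jpg

Here -copy none drops the metadata and -optimize rebuilds the Huffman tables, both losslessly.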
Something else to consider, if you plan to display these images to users, is hosting them on a cookie-free domain or subdomain, as mentioned here: http://developer.yahoo.com/performance/rules.html#cookie_free. If the images are hosted on a domain or subdomain that has cookies, every image request will send along an unnecessary cookie. Avoiding this can save a few KB per image requested, which adds up to a decent amount, especially on restricted-bandwidth connections such as mobile devices.