I have a products website with around 100 high-quality images. Each image is around 6-7 MB in size.
In my database I have stored the paths of all the images along with their names. The images are saved in a folder /images/product_name/. But when I display these images on a web page, the page takes forever to load. All I do is query the table by id, get the image paths, and display them on the products page.
It would be very helpful if I could get any sort of advice on how to optimize the process.
The images you send to the client are most likely way too big. 7 MB is very large for a product picture, so 100 * 7 MB = 700 MB of data transferred if you display all the product images.
If you only need small images, scale them down to a few KB (thumbnails) and display those in your table.
NOTE: you can just prepend a prefix like "tmb_" or "tmb_200x200_" to the original filename, so you won't have to touch the paths in the database.
From reading your comments I think you are looking for some automated process to optimize the files and serve them at a more decent filesize.
You should take a look at ImageMagick or the GD library, which allow you to resize and optimize images (among many other things - http://php.net/manual/en/book.imagick.php). This can be combined with something like the YUI image cropper to let you choose which part of the image to show in the thumbnail.
This is best done at the point of upload, so the server doesn't unnecessarily regenerate the images each time they are requested; the resulting path can be stored in a thumbnail column in the database.
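For example, here is a rough sketch of generating a thumbnail with the Imagick extension at upload time; the source path and the 200x200 box are illustrative, reusing the "tmb_200x200_" naming idea from above:

```php
<?php
// Rough sketch, not production code: the path and sizes are assumptions.
$source = '/images/product_name/photo.jpg'; // hypothetical original image

$img = new Imagick($source);
$img->thumbnailImage(200, 200, true);  // true = best fit inside a 200x200 box
$img->setImageCompressionQuality(80);  // smaller file, little visible loss

// Prepend the prefix so the paths in the database don't need to change.
$img->writeImage(dirname($source) . '/tmb_200x200_' . basename($source));
$img->clear();
```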
If you must show bigger images I suggest you use a lightbox (see an example here - http://leandrovieira.com/projects/jquery/lightbox/) or similar technique that only loads the larger image when the client requests it.
Use http://luci.criosweb.ro/riot/ to compress images.
It is one of the simple and free tools recommended by Google to compress images.
Also, as mentioned before, make sure the thumbs you send to the listing pages are around 100px, and only show the large images, one by one, when the user clicks to zoom in. Even in the zoom view you should display an image no larger than 1000px, which should be no more than 200-500 KB depending on format and quality.
Have you optimized your images (in terms of file size)? For example with Smush.it: http://www.smushit.com/ysmush.it/
You should really optimize your images. You can use a commercial product like Photoshop or a free one like GIMP to optimize the files.
Loading about 700 MB of images would take me a whole day!
Reduce the images to thumbnails, and load the original file only when the user clicks the thumbnail. This can improve performance.
Google PageSpeed Insights suggests that I optimize the images on a webpage I'm currently working on. The images are uploaded to the server. I want to display optimized images on the page but don't want the original images on the server to change. Is there any way to do this in PHP?
You should reduce the image size while uploading.
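A minimal sketch of that idea using the GD extension; the form field name, maximum dimensions, output path, and JPEG-only handling are all assumptions:

```php
<?php
// Minimal sketch: shrink a JPEG during upload, never scaling up.
$maxW = 800;
$maxH = 800;

$src   = imagecreatefromjpeg($_FILES['image']['tmp_name']);
$origW = imagesx($src);
$origH = imagesy($src);

$ratio = min($maxW / $origW, $maxH / $origH, 1); // keep the aspect ratio
$newW  = (int) round($origW * $ratio);
$newH  = (int) round($origH * $ratio);

$dst = imagecreatetruecolor($newW, $newH);
imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $origW, $origH);

// Quality 80 cuts the file size sharply with little visible loss.
imagejpeg($dst, 'uploads/' . basename($_FILES['image']['name']), 80);

imagedestroy($src);
imagedestroy($dst);
```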
If you don't want the original image to change, then any optimisations you perform to bring the file size down / serve a more appropriately sized image will be undermined by the overhead of "optimising" the image on the fly (unless your original image is just that big).
As I see it, you have 3 options:
Unless it has a massive impact (or you can envisage it having one), just ignore Google PageSpeed for now.
You can use lossless compression, which reduces the file size without reducing image quality. This is something you can do on your server with various apps (just google your server type followed by "lossless image compression").
Just create a copy of the original image upon upload (or whenever) and serve that copy to your users. The benefit is that you can keep differently sized images for different devices, e.g. a small image for mobile, and you can still apply lossless compression to all of them. The downside is that you'll obviously use more server space.
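A sketch of option 3 with the Imagick extension; the size labels, widths, and paths are placeholders, not recommendations:

```php
<?php
// Sketch: create several device-specific copies of an uploaded image.
$sizes = array('small' => 320, 'medium' => 768, 'large' => 1280); // assumed widths

$original = new Imagick('uploads/original/photo.jpg'); // hypothetical path

foreach ($sizes as $label => $width) {
    $copy = clone $original;
    $copy->thumbnailImage($width, 0); // height 0 keeps the aspect ratio
    $copy->writeImage('uploads/' . $label . '/photo.jpg');
    $copy->clear();
}
$original->clear();
```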
Hope this helps!
I am working on a project that has a feature where users can upload an image. That image will be used on different pages of the website, at different sizes (e.g. 200x300, 360x520, 700x1000).
I can create thumbnails in two ways:
While uploading the image, create thumbnails at the different sizes and store them.
Point the image src at a server-side script that resizes the image on the fly and outputs the resized image.
Which is the correct way to do this? If I use the first method, I think disk space will fill up very fast. Is there any issue with the second method?
The advantage of method 1 is that you won't risk resizing the same image twice simultaneously, and that you can quickly provide a version to display to your user.
But why would your disk fill up fast if you resize the images beforehand? That would only happen if you resized them to every possible size, which isn't a good idea anyway.
Method 2 is more flexible. You just ask for an image in a given size and your script will produce it on the fly. This is a bit slower, of course, but you could cache the resized image so visitors will get the images fast, unless they are the first one to request an image in a specific size.
My suggestion:
If you know which sizes you use on your website, you could resize the images to those sizes at an early stage. You could even define a set of predefined image dimensions in your site's configuration, use those dimensions throughout the site, and use the same configuration to scale the images when they are uploaded. This has some advantages:
Having a limited set of sizes increases the chance of hitting the cache when visitors browse your website. For instance, if you show a picture of X on the X detail page, on an overview page including X, in search results, etcetera, and each of those pages uses a slightly different size, that is a waste of disk space and bandwidth.
And, if disk size is an issue, having a limited number of sizes also limits the disk space that the cached versions of these images consume.
Moreover, if you change or add a dimension, you could pregenerate all images for that size immediately, so visitors would benefit right away from the cached versions.
And of course, it also makes it easier to purge the cache for images of a dimension that is no longer in the list, should you remove or change one.
If you choose this approach, it makes it very easy to implement method 1 and pre-cache everything; but even if you choose method 2 (if only as a fallback, should a cached version not exist), it still has these benefits.
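A sketch of what that method 2 fallback could look like: the script accepts only the configured sizes, caches each result, and serves the cached file on later hits. The query parameters, paths, and size list are assumptions for illustration:

```php
<?php
// Sketch: resize on request, restricted to a whitelist of configured sizes.
$sizes = array('200x300', '360x520', '700x1000'); // the site's configured sizes

$size = isset($_GET['size']) ? $_GET['size'] : '';
$file = basename(isset($_GET['file']) ? $_GET['file'] : ''); // no path traversal

if (!in_array($size, $sizes, true)) {
    http_response_code(400); // unknown size: refuse instead of resizing
    exit;
}

$original = 'uploads/original/' . $file;
$cached   = 'uploads/cache/' . $size . '_' . $file;

if (!is_file($cached)) {
    list($w, $h) = array_map('intval', explode('x', $size));
    $img = new Imagick($original);
    $img->thumbnailImage($w, $h, true); // best fit inside the requested box
    $img->writeImage($cached);
    $img->clear();
}

header('Content-Type: image/jpeg');
readfile($cached);
```

Restricting the script to a fixed size list is also what keeps an on-the-fly resizer from being abused to generate arbitrary sizes.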
I am trying to create an image gallery. My scroll bar gets rough and slows down when I try to add some 20 different images. Any suggestions?
Make sure you add the width and height attributes. On my webpage when I added them, for some reason it loaded a little bit faster on Safari and Chrome but made no difference on the other browsers.
You can also try lazy loading your images. It really makes a difference.
Make sure you resize your images to the size they are going to be displayed on your webpage.
Your question is vague: is it the images or the scroll bars that are slow?
If images:
Never resize your images with CSS width/height properties. This loads the entire image and is very slow. Instead, resize images before uploading them, or have a thumbnail generator resize them on upload.
If the image is small (less than 100 pixels in height or width), you can reduce its quality when generating the thumb, so the file size is smaller with no noticeable effect.
Use the correct file format.
- gif for animation
- png for small inconsequential graphics like logos and icons
- jpg for higher-quality graphics like photographs
Are you gzipping content before sending it from the server?
Use a CDN for your images, like AWS S3, but DO NOT USE Flickr/Picasa, as they will 'look up' your image before serving it, which is much slower.
Using just a CDN (AWS S3) will be better than your current hosting as they've been optimised for quick look ups, reducing the wait time by 50-90% compared to most web hosts.
Use only one CDN; once the DNS lookup for the domain has taken place, your browser will cache it.
If your site uses cookies or sessions, serve the images from a different URL (this can be a subdomain of your site), so that cookies and session data aren't sent with each image request (slowing it down).
If the graphics are reused, set an expiry header on them (a couple of months in the future), so that the browser caches them locally.
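If the images are served through a PHP script, a minimal sketch of such an expiry header (for static files you'd configure the equivalent in your web server instead; the path is hypothetical):

```php
<?php
// Sketch: tell the browser to cache this image for about two months.
$ttl = 60 * 60 * 24 * 60; // ~2 months in seconds

header('Cache-Control: public, max-age=' . $ttl);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $ttl) . ' GMT');
header('Content-Type: image/jpeg');
readfile('uploads/cache/tmb_photo.jpg'); // hypothetical cached thumbnail
```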
EDIT: I would also agree that loading images on demand / on visibility in the webpage is a very good practice.
If scroll bars:
Are you using custom scroll bars and or a custom scroll event?
If so, you may want to put a threshold (I recommend 100 ms) on how often your resize event can fire, or manually fire the resize event after loading the images, since in your scenario you know when the images have been loaded.
Try lazy loading your images using jQuery. Here's a plugin: http://www.appelsiini.net/projects/lazyload
Here's an example of when I've used lazy loading: https://www.lunatickets.co.uk (scroll down the page and watch the images).
Also make sure the browser isn't resizing your images! Make sure you've optimised them for the web!
Make sure you add width/height attributes to your img tags - otherwise your browser/website layout will "jump around" as it loads them.
If you're inserting images, try resizing them to the appropriate width/height before you load them. Your browser will run slower if it has to resize large images to display them smaller.
Try not to load all images at once.
Slideshows like this one: http://sandbox.scriptiny.com/slideshow/ only load the displayed image plus some of the next ones.
The gallery I last used loaded the current image and the next 10 images. I don't remember the name, but I think that is the way to do it.
Have you tried keeping your images on S3, Flickr, or Picasa? Let their servers take the hit for the HTTP requests.
I'm building an image gallery which presents a few images on the front page. The images are larger than the size actually displayed on the front page, which leads me to the following question:
If caching is not an option, which would be better:
Using PHP to shrink the image and send it to the client.
Sending the original full-size image and letting the client shrink it (with simple width and height attributes).
I tend to think that the second is a better solution, but I'd like to hear more opinions.
Thanks!
Edit:
When people upload the images, I create thumbnails for them to be displayed when browsing the site.
The "cache is not an option" reason:
The images in question are 5 "featured" images on the front page which will not stay the same for more than an hour at most. So isn't it a waste to create another copy of every uploaded image just for that?
Essentially, it depends on:
What's the original-to-desired width/height ratio? It's not a big deal serving a 500x500 image and showing it as 250x250, but wasting bandwidth on 1920x1080 images is. Also, mobile devices might not have enough resources available to actually display the webpage if you serve too many big images.
What do you have more of: bandwidth or CPU power? Can you make sure nobody uses your on-the-fly resizer as a DoS target?
Generally, though, solutions with a cache, even a very temporary one, are much better.
[AD edit]
The images in question are 5 "featured" images on the front page which will not stay the same for more than an hour at most. So isn't it a waste to create another copy of every uploaded image just for that?
It is. But you could simply create a separate folder for these thumbnails, and set up a cron job to wipe files older than an hour. Here's an example I use on my site (set to 30 minutes):
*/15 * * * * find /var/www/directory/ -type f -mmin +30 -exec rm -f {} \; >/dev/null 2>&1
Given 'enough' CPU resources, I would prefer to shrink images before sending them, to go easy on people with bad connections and mobile devices.
Another option, and my preferred strategy, would be to keep smaller versions of the images and use those. If the images are uploaded at some point, create a smaller version of each image on upload.
It kind of depends on your flow, but I would resize them on the fly and save the thumb. If the thumb exists, serve it; if not, resize on the fly and serve that (while saving the thumb).
Then in a cronjob you can remove old images.
How about option 3: don't resize the images in the process that serves them to the client - make a background process do the resizing, and send the thumbnails if they've been resized, full images if not yet. This has the advantage that you can throttle the resizing process independently of user requests.
Imagine you have a classifieds website...
When searching ads, you want thumbnails of the "real" image, which is displayed at its full size after clicking the ad...
Would it be faster to create the thumbnails per search, or to create them once on upload and then just display them?
Storage is not a problem on my server...
Thanks
I would create the thumbnail once the original image has been uploaded; that way there's no slowdown when the page is first hit.
However, lately I've been using an image resize script from Shifting Pixel (http://shiftingpixel.com/2008/03/03/smart-image-resizer/). It creates a resized version on the fly, and caches it, so subsequent hits to the page will use the cached version. This could be useful if you don't want to create the thumbnail yourself.
Assuming that the thumbnails are of a fixed size, it is better to create them upfront. Otherwise you have, at minimum, a per-request existence check per image. If they are created ahead of time, you can assume the images exist.
Creating the thumbnail dynamically is your best bet, especially considering you'll probably reformat the search results at some point and will likely end up choosing a different thumbnail size. If you only have a thumbnail that's 150x150 pixels, but after a redesign of the results page you want thumbnails a little larger, say 300x300, you'll have to regenerate all the thumbnails. If you create them dynamically on request, you can just alter your resize script to produce 300x300 thumbnails.