php image resize on-the-fly vs full size image

I'm building an image gallery which presents a few images on the front page. The images are larger than the size at which they are displayed on the front page, which leads me to the following question:
If caching is not an option, which would be better:
Using PHP to shrink the image and send it to the client, or
Sending the original full-size image and letting the client shrink it (with simple width and height attributes)?
I tend to think that the second is the better solution, but I'd like to hear more opinions.
Thanks!
Edit:
When people upload the images, I create thumbnails for them to be displayed when browsing the site.
The "cache is not an option" reason:
The images in question are 5 "featured" images on the front page which will not stay the same for more than an hour at most, so isn't it a waste to create another copy of every uploaded image just for that?

Essentially, it depends on:
What's the original-to-desired width/height ratio? It's not a big deal serving a 500x500 image and showing it as 250x250, but wasting bandwidth on 1920x1080 images is. Also, mobile devices might not have enough resources available to actually display the webpage if you serve too many big images.
What do you have more of: bandwidth or CPU power? Can you make sure nobody uses your on-the-fly resizer as a DoS target?
Generally, though, solutions with a cache, even a very temporary one, are much better.
[AD edit]
The images in question are 5 "featured" images on the front page which will not stay the same for more than an hour at most, so isn't it a waste to create another copy of every uploaded image just for that?
It is. But you could simply create a separate folder for these thumbnails and set up a cron job to wipe files older than an hour. Here's an example I use on my site (set to 30 minutes):
*/15 * * * * find /var/www/directory/ -mmin +30 -exec rm -f {} \; >/dev/null 2>&1

Given 'enough' CPU resources I would prefer to shrink images before sending them, to go easy on people with bad connections and mobile devices.
Another option, and my preferred strategy, would be to keep smaller versions of the images and use those. If the images are uploaded at some point, then create a smaller version of each image on upload.

It kind of depends on your flow, but I would resize them on the fly and save the thumb. So if the thumb exists, serve it; if not, resize on the fly and serve that (while saving the thumb).
Then in a cron job you can remove old images.
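A minimal sketch of that flow with PHP's GD extension (the images/ and thumbs/ directories, the 250px width and the JPEG-only handling are assumptions for illustration, not part of the answer above):

```php
<?php
// thumb.php?file=foobar.jpg - serve a cached thumbnail, generating it on first request.
// Assumes JPEG originals in images/ and a writable thumbs/ directory.
$file   = basename($_GET['file'] ?? '');   // strip any path components
$source = __DIR__ . '/images/' . $file;
$thumb  = __DIR__ . '/thumbs/' . $file;
$width  = 250;                             // target width; height follows the aspect ratio

if (!is_file($source)) {
    http_response_code(404);
    exit;
}

if (!is_file($thumb)) {                    // not cached yet: resize and save
    [$w, $h] = getimagesize($source);
    $height  = (int) round($h * $width / $w);
    $src     = imagecreatefromjpeg($source);
    $dst     = imagecreatetruecolor($width, $height);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $width, $height, $w, $h);
    imagejpeg($dst, $thumb, 80);           // save the thumb at quality 80
    imagedestroy($src);
    imagedestroy($dst);
}

header('Content-Type: image/jpeg');
readfile($thumb);                          // serve the cached (or freshly created) thumbnail
```

The cron job from the earlier answer can then be pointed at the thumbs/ directory to expire old copies.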

How about option 3: don't resize the images in the process that's supposed to serve them to the client - make a background process do the resizing, and send the thumbnails if they've been resized, the full images if not yet. This has the advantage that you can throttle the resizing process independently of the user requests.
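A rough sketch of such a background worker, run from cron or as a long-lived CLI process (the directory layout, 250px width and 200ms throttle are placeholders for illustration):

```php
<?php
// resize_worker.php - runs separately from the web server, so heavy resizing
// never blocks a user request. The frontend serves thumbs/<name> if it exists
// and falls back to the original otherwise.
foreach (glob(__DIR__ . '/images/*.jpg') as $source) {
    $thumb = __DIR__ . '/thumbs/' . basename($source);
    if (is_file($thumb)) {
        continue;                          // already processed
    }
    $img    = imagecreatefromjpeg($source);
    $scaled = imagescale($img, 250);       // height follows the aspect ratio
    imagejpeg($scaled, $thumb, 80);
    imagedestroy($img);
    imagedestroy($scaled);
    usleep(200000);                        // crude throttle: pause 200ms between images
}
```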

Related

When should I create a thumbnail of an image?

I am working on a project with a feature that lets users upload images. Each image will be used on different pages of the website, at different sizes (e.g. 200x300, 360x520, 700x1000).
I can create thumbnails in two ways:
While uploading the image, create thumbnails at the different sizes and store them.
Point the image src at a server-side script that resizes the image on request and outputs it.
Which is the correct way to do it? If I use the first method, I think disk space will fill up very fast. Is there any issue with the second method?
The advantage of method 1 is that you won't risk resizing the same image twice simultaneously, and that you can quickly provide a version to display to your user.
But why would your disk fill up fast if you resize the images beforehand? That would only happen if you resized them to every possible size, and that's not a good idea anyway.
Method 2 is more flexible. You just ask for an image in a given size and your script will produce it on the fly. This is a bit slower, of course, but you could cache the resized image so visitors will get the images fast, unless they are the first one to request an image in a specific size.
My suggestion:
If you know which sizes you use on your website, you could use those sizes to resize the images at an early stage. You could even make a configuration on your website with a bunch of predefined image dimensions, use those throughout the site, and use those same configurations to scale the images when you upload them. This has some advantages:
Having a limited set of sizes increases the chances of hitting the cache when visitors browse through your website. For instance, if you show a picture of X on the X detail page, on an overview page including X, in search results, etcetera, and each of those pages uses a slightly different size, that is a waste of disk space and bandwidth.
If disk size is an issue, having a limited number of sizes also limits the disk space that the cached versions of these images consume.
Moreover, if you change or add a dimension, you can pre-generate all images for that size immediately, so visitors benefit right away from the cached version.
And of course, it also makes it easier to purge the cache for images of a dimension that is no longer in the list, should you remove or change one.
If you choose this approach, it makes it very easy to implement method 1 and pre-cache everything, but even if you choose method 2 (if only as a fallback, should a cached version not exist), it would still have benefits.
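A sketch of what such a predefined-size configuration might look like with GD (the size names, dimensions, upload path and the generateVariants helper are illustrative, not part of the original answer):

```php
<?php
// Hypothetical upload hook: generate every configured size once, at upload time.
$sizes = [
    'thumb'  => [200, 300],
    'card'   => [360, 520],
    'detail' => [700, 1000],
];

function generateVariants(string $original, array $sizes): void
{
    $img = imagecreatefromjpeg($original);
    foreach ($sizes as $name => [$w, $h]) {
        // Scales to the exact configured dimensions; crop first if you need to
        // preserve the aspect ratio instead of stretching.
        $scaled = imagescale($img, $w, $h);
        $target = dirname($original) . "/{$name}_" . basename($original);
        imagejpeg($scaled, $target, 85);
        imagedestroy($scaled);
    }
    imagedestroy($img);
}

generateVariants(__DIR__ . '/uploads/photo.jpg', $sizes);
```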

Should I generate each thumbnail dynamically every time it is requested, or store them on image upload? [duplicate]

Problem - I wanted to set up an image-uploading feature on my website, but I wanted to show both the original image and a small thumbnail of it.
Choices - Which way is better: to create a separate image (thumbnail) in a directory when the image is uploaded, or to show a smaller version by reducing its height and width in a fixed ratio every time the image is requested?
How I am doing it currently - The latter sounds better to me because it won't take up as much disk space, but it has to resize the image again and again. Which one do you think is better?
This is a general question for web applications, no specific language.
Any idea how Facebook or Google do it?
Question - My question is how to generate thumbnails and show them on a website: by creating a copy of the original image with smaller dimensions, or by generating the thumbnail dynamically every time it is requested?
Creating the thumbnail on upload is almost always the better option. If storage is a concern you could convert them on request and then cache the result in a memory store. If it's requested again before the cache expires, no conversion will be needed.
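A minimal sketch of that convert-on-request idea using APCu as the memory store (APCu itself, the key scheme, the 250px width and the one-hour TTL are assumptions; Memcached or Redis would work the same way, and error handling is omitted):

```php
<?php
// thumb_cached.php?file=foobar.jpg - resize on request, keep the bytes in memory.
$file = basename($_GET['file'] ?? '');
$key  = 'thumb_250_' . $file;

$jpeg = apcu_fetch($key, $hit);
if (!$hit) {
    $img    = imagecreatefromjpeg(__DIR__ . '/images/' . $file);
    $scaled = imagescale($img, 250);
    ob_start();                        // capture the JPEG bytes instead of writing a file
    imagejpeg($scaled, null, 80);
    $jpeg = ob_get_clean();
    imagedestroy($img);
    imagedestroy($scaled);
    apcu_store($key, $jpeg, 3600);     // keep the resized bytes in memory for an hour
}

header('Content-Type: image/jpeg');
echo $jpeg;
```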
Storage is often pretty cheap, so I would probably not go for that extra complexity.
Just create a thumbnail version and save to disk. Hard disk space is very cheap. £70 for a couple of TB.
"Better" depends on the criteria you set.
For most applications, disk space is not an issue - and if storing a thumbnail is a problem, storing the original must be a huge concern - a decent digital camera photo will run to many megabytes, whereas the thumbnail should not exceed 50K.
Bandwidth and performance (as perceived by the client) are usually bigger concerns. If you have lots of people browsing a gallery of image thumbnails, serving 50KB thumbnails will be significantly faster (and cheaper in bandwidth) than serving multi-megabyte high-resolution images.
In addition, by serving thumbnails on a URL like <img src="images/thumbnail/foobar.jpg"> and setting appropriate cache headers, you should get a lot of downstream caching - this is less likely if you serve the image as <img src="thumbnail.php?image=image/foobar.jpg"> because caches treat query strings rather conservatively.
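If you do end up serving thumbnails through a script, you can still help downstream caches along by sending explicit caching headers. A minimal sketch (the path, the one-day max-age and the ETag scheme are placeholders):

```php
<?php
// Serve a thumbnail with cache headers and answer conditional requests with 304,
// so repeat visitors don't re-download the image body.
$thumb = __DIR__ . '/images/thumbnail/foobar.jpg';
$mtime = filemtime($thumb);
$etag  = '"' . md5($thumb . $mtime) . '"';

header('Cache-Control: public, max-age=86400');                       // cache for a day
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('ETag: ' . $etag);

if (($_SERVER['HTTP_IF_NONE_MATCH'] ?? '') === $etag) {
    http_response_code(304);
    exit;
}

header('Content-Type: image/jpeg');
readfile($thumb);
```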
I used to work on a web site that managed hundreds of thousands of product images; we set up ImageMagick to create thumbnails automatically. Depending on your setup, it may make sense to do this when the thumbnail is first requested, rather than when the file is uploaded, because the conversion can be fairly resource hungry, and doing it at upload time would take longer than we wanted to wait. Modern hardware may make that a non-issue.
There's also a question about keeping the thumbnails in sync with the originals - if the user uploads a new image, you have to ensure you get the thumbnail updated; if the original is deleted, you must also delete the thumbnail.
Creating a thumbnail is the better option, and it doesn't cost much disk space. Your client will also load a smaller file when opening your pages. Converting the image upon request will cost even more time to load your page ;)
If you take a look at most CMSes with this functionality built in, they nearly always create a thumbnail of the image on upload and store it on the server.
This goes back to the age-old saying of "do what Google does", but with CMSes.

php image loading time consumption

I have a products website on which I have around 100 high-quality images. Each image is around 6-7MB in size.
In my database I have stored the paths of all the images along with their names. The images are saved in a folder /images/product_name/, but when I go to display these images on a web page, the page takes forever to load. All I do is send the id to the table, get the image paths and display them on the products page.
It would be very helpful if I could get any advice on how to optimize this process.
The images you send to the client are most likely way too big. 7MB sounds very large for a product picture, so that's 100 x 7MB = 700MB of data transferred if you display all the product images.
If you only need small images, scale them down to a few KB (thumbnails) and use those to display in your table.
NOTE: you can just prepend a prefix like "tmb_" or "tmb_200x200_" to the original filename, and you won't have to touch the paths in the database.
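For example, a tiny helper (hypothetical name, using the prefix format from the note above) that maps the path stored in the database to the thumbnail path:

```php
<?php
// Derive the thumbnail path from the stored original path without touching the database.
function thumbPath(string $original, int $w = 200, int $h = 200): string
{
    return dirname($original) . "/tmb_{$w}x{$h}_" . basename($original);
}

echo thumbPath('/images/product_name/widget.jpg');
// /images/product_name/tmb_200x200_widget.jpg
```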
From reading your comments, I think you are looking for an automated process to optimize the files and serve them at a more reasonable file size.
You should take a look at ImageMagick or the GD library, which allow you to resize images (among many other things - http://php.net/manual/en/book.imagick.php) and optimize them. This can be combined with something like the YUI Image Cropper to let you choose which part of the image to show in the thumbnail.
This is best done at the point of upload, so the server doesn't unnecessarily regenerate the images each time they are requested, with the result stored under a thumbnail column in the database.
If you must show bigger images I suggest you use a lightbox (see an example here - http://leandrovieira.com/projects/jquery/lightbox/) or similar technique that only loads the larger image when the client requests it.
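A rough sketch of that resize-and-optimize step at upload time with the Imagick extension (the paths, the 200px bound and the quality setting are placeholders):

```php
<?php
// Resize, strip metadata and recompress a freshly uploaded product image.
$img = new Imagick(__DIR__ . '/uploads/product.jpg');
$img->thumbnailImage(200, 0);               // width 200px, height keeps the aspect ratio
$img->stripImage();                         // drop EXIF/ICC data to shrink the file further
$img->setImageCompressionQuality(82);
$img->writeImage(__DIR__ . '/uploads/tmb_product.jpg');
$img->clear();
```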
Use http://luci.criosweb.ro/riot/ to compress images.
It is one of the simple and free tools recommended by Google to compress images.
Also, as was mentioned before, make sure the thumbs you send to the listing pages are around 100px, and only show the large images, one by one, if the user clicks to zoom in. Even for the zoom you should only display an image of at most 1000px, which should not be more than 200-500KB depending on format and quality.
Have you optimized your images (in terms of file size)? With Smush.it, for example: http://www.smushit.com/ysmush.it/
You should really optimize your images. You can use a commercial product like Photoshop or use a free one like the GIMP to optimize the files.
Loading about 700MB of images would take me an entire day!
Reduce the images to thumbnails, and when the user clicks on a thumbnail, load the original file. This will improve performance.

understanding how timthumb works

Coming from a Rails background, I used to work with the Paperclip plugin, which created thumbnails with predefined sizes from image attachments.
In WordPress, I am a little confused. Here are my questions, or points that need more clarification:
Does timthumb create thumbs and save them to disk when images are uploaded for the first time, or does it just resize images on the fly and cache them?
If it resizes on the fly, why do I see different sizes of each image in the uploads directory, such as filename.jpg, filename-150x150.jpg and so on?
Isn't it better for performance to just create the thumbs once on upload and serve them directly without calling a script? And if so, how do I implement this?
Typically it resizes and caches the images upon their first request.
You're probably seeing the image resizing that WP does itself. These sizes are controlled in the settings.
Debatable. Yes, in that it could all be done at your command; no, in that someone could upload 1000 images, and resizing them all at once could cause problems for the site, so spreading out the resizing helps balance the load. Also, parameters can be passed to the script when a page loads to create a custom thumbnail, so if you decide your thumbnails were 10px too narrow, you can run it again and it will resize from the original. Plus, the filename of the base image remains untouched - if your code says image.jpg, it will always be image.jpg, no matter the size. So if you've got 10,000 instances of thumbnails that all reference image-150x150 and now you want them to be 160x160, you either have to change the image names being referenced or live with a misleading filename. TimThumb provides a pretty good workaround for that.
Here are some basic TimThumb/WordPress performance tips: http://www.dollarshower.com/timthumb-and-wordpress-blog-performance/

Loading too many images slows down my website

I am trying to create an image gallery. The scroll bar gets rough and slows down when I try to add some 20 different images. Any suggestions?
Make sure you add the width and height attributes. On my webpage, when I added them, for some reason it loaded a little bit faster in Safari and Chrome but made no difference in the other browsers.
You can also try lazy loading your images. It really makes a difference.
Make sure you resize your images to the size they are going to be displayed on your webpage.
Your question is vague - is it the images or the scroll bars that are slow?
If images:
Never resize your images with CSS width/height properties. This will load the entire image and be very slow. Instead, resize images before uploading them, or have a thumb generator resize them on upload.
If the image is small (less than 100 pixels in height or width) you can reduce its quality when generating the thumb, so the file size is smaller with no noticeable effect.
Use the correct file format.
- gif for animation
- png for small inconsequential graphics like logos and icons
- jpg for higher quality graphics like images
Are you gzipping content on the server before sending it?
Use a CDN for your images, like AWS S3, but DO NOT USE Flickr/Picasa, as they will 'look up' your image before serving it, which will be much slower.
Using just a CDN (AWS S3) will be better than your current hosting, as they've been optimised for quick lookups, reducing the wait time by 50-90% compared to most web hosts.
Use only one CDN; once the DNS lookup has taken place for the domain, your browser will cache it.
If your site uses cookies or sessions, store the images on a different URL (this can be a subdomain of your site), so that the cookie and session data aren't sent with each image request (slowing it down).
If the graphics are reused, set a timeout parameter on them (a couple of months in the future), so that the browser caches them locally.
EDIT: I would also agree that loading images on demand / as they become visible in the page is a very good practice.
If scroll bars:
Are you using custom scroll bars and/or a custom scroll event?
If so, you may want to put a threshold (I recommend 100ms) on how often your scroll/resize handler can be fired, or manually fire it after loading the images, since in your scenario you know when the images have been loaded.
Try lazy loading your images using jQuery. Here's a plugin: http://www.appelsiini.net/projects/lazyload
Here's an example of when I've used lazy loading: https://www.lunatickets.co.uk (scroll down the page and watch the images).
Also make sure the browser isn't resizing your images! Make sure you've optimised them for the web!
Make sure you add width/height attributes to your img tags - otherwise your browser/website layout will "jump around" as it loads them.
If you're inserting images, try resizing them to the appropriate width/height before you load them. Your browser will run slower if it has to resize large images to display them smaller.
Try not to load all images at once.
Slideshows like this one: http://sandbox.scriptiny.com/slideshow/ only load the image currently shown plus some of the next ones.
The gallery I last used loaded the current image and the next 10 images. I don't remember its name, but I think that is the way to do it.
Have you tried keeping your images on S3, Flickr or Picasa? Let their servers take the hit for the HTTP requests.
