When should I create a thumbnail of an image? - php

I am working on a project that has a feature where users can upload an image. That image will be used on different pages of the website, at different sizes (e.g. 200*300, 360*520, 700*1000).
I can create thumbnails in two ways:
While uploading the image, create thumbnails at the different sizes and store them.
Point the image src at a server-side script that resizes the image on the fly and outputs it, instead of serving a stored file.
Which is the correct way to do this? If I use the 1st method, I think disk space will fill up very fast. Is there any issue with the 2nd method?

The advantage of method 1 is that you won't risk resizing the same image twice simultaneously, and that you can quickly provide a version to display to your user.
But why would your disk fill up fast if you resize the images beforehand? That would only happen if you resized them to every possible size, which is indeed not a good idea.
Method 2 is more flexible. You just ask for an image in a given size and your script will produce it on the fly. This is a bit slower, of course, but you could cache the resized image so visitors will get the images fast, unless they are the first one to request an image in a specific size.
My suggestion:
If you know which sizes you use on your website, you could resize the images to those sizes at an early stage. You could even maintain a configuration with a set of predefined image dimensions, use those dimensions throughout your website, and use that same configuration to scale the images when they are uploaded. This has some advantages:
Having a limited set of sizes increases the chance of hitting the cache when visitors browse through your website. For instance, if you show a picture of X on the X detail page, on an overview page including X, in search results, etcetera, and each of those pages uses a slightly different size, that is a waste of disk space and bandwidth.
And, if disk size is an issue, having a limited number of sizes also limits the disk space that the cached versions of these images consume.
Moreover, if you change or add a dimension, you could pregenerate all images for that size immediately, so visitors would benefit right away from the cached version.
And of course, it also makes it easier to purge the cache for images of a dimension that is no longer in the list, should you remove or change one.
If you choose this approach, it is very easy to implement method 1 and pre-cache everything, but even if you choose method 2 (if only as a fallback, should a cached version not exist), it still has these benefits.
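A minimal sketch of that idea, assuming GD, JPEG uploads, a whitelist of predefined sizes, and hypothetical uploads/ and cache/ directories (none of which come from the question):

```php
<?php
// resize.php?file=foobar.jpg&size=360x520  (hypothetical script and parameters)
$allowedSizes = ['200x300', '360x520', '700x1000'];   // the predefined dimensions

$file = basename($_GET['file'] ?? '');
$size = $_GET['size'] ?? '';

if ($file === '' || !in_array($size, $allowedSizes, true)) {
    http_response_code(404);
    exit;
}

$source = __DIR__ . '/uploads/' . $file;
$cached = __DIR__ . '/cache/' . $size . '_' . $file;

// Regenerate only if no cached copy exists or the original is newer.
if (!is_file($cached) || filemtime($cached) < filemtime($source)) {
    [$maxW, $maxH] = array_map('intval', explode('x', $size));
    [$w, $h] = getimagesize($source);
    $ratio = min($maxW / $w, $maxH / $h, 1);          // never upscale
    $newW = (int) round($w * $ratio);
    $newH = (int) round($h * $ratio);

    $src = imagecreatefromjpeg($source);
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dst, $cached, 85);
    imagedestroy($src);
    imagedestroy($dst);
}

header('Content-Type: image/jpeg');
readfile($cached);
```

Pre-generating the same files at upload time (method 1) would simply mean running the resize part of this script for each entry in $allowedSizes right after the upload.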

Related

converting image to all sizes on upload vs resizing on request via php

So I have a platform for users which allows them to upload a fair amount of pictures. At the moment, I have my server resizing and saving all images individually to my CDN (so I can pick the best option in order to reduce load time when a user requests to view it), but it seems very wasteful in terms of server storage.
The images are being converted into resolutions of 1200px, 500px, 140px, 40px and 24px.
What I'm wondering is, would it be more efficient to just save the file at 1200px, then serve it via PHP at the requested size using something like ImageMagick? Would there be any major trade-offs and if so, is it worth it?
What I'm doing right now:
https://v1x-3.hbcdn.net/user/filename-500x500.jpg
An example of what I could do:
https://v1x-3.hbcdn.net/image.php?type=user&file=filename&resolution=500
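A bare-bones sketch of what such an image.php endpoint might look like, using Imagick and no caching; the parameter names come from the example URL above, everything else (paths, the size whitelist) is an assumption:

```php
<?php
// image.php?type=user&file=filename&resolution=500  (sketch only, no caching)
$allowed    = [1200, 500, 140, 40, 24];               // sizes from the question
$resolution = (int) ($_GET['resolution'] ?? 0);
$file       = basename($_GET['file'] ?? '');

if ($file === '' || !in_array($resolution, $allowed, true)) {
    http_response_code(404);
    exit;
}

$image = new Imagick('/path/to/originals/' . $file . '.jpg');
$image->thumbnailImage($resolution, 0);               // 0 height keeps the aspect ratio
header('Content-Type: image/jpeg');
echo $image->getImageBlob();
```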
Cheers.
No it's not, because:
you have a small number of sizes
if you do not use caching (i.e. generating the image only on the first request), you can DDoS yourself (image processing is a CPU-intensive operation)
you have to do extra work if you use a CDN like Cloudflare for HTTP caching
It makes sense if you have a lot of image sizes, for example an API that supports many Android/iOS devices: say the iPhone 3 only supports a 320x320 image, and if you don't have users with such a device, your server never creates that size.
Advice:
During image generation, apply optimization: it reduces the file size with an imperceptible loss of quality.
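A short sketch of what that optimization step might look like with Imagick; the quality value of 82 and the file names are assumptions:

```php
<?php
// Strip metadata and recompress while generating a resized version.
$im = new Imagick('original.jpg');
$im->thumbnailImage(500, 0);                 // 500px wide, keep aspect ratio
$im->stripImage();                           // drop EXIF/ICC metadata
$im->setImageCompressionQuality(82);         // hypothetical quality setting
$im->writeImage('thumb_500.jpg');
```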

Should I generate each thumbnail dynamically every time it is requested, or store them on image upload? [duplicate]

Problem - I wanted to set up an image-uploading feature on my website, but I wanted to show both the original image and a small thumbnail of it.
Choices - Which way is better: to create a separate image (thumbnail) in a directory when the image is uploaded, or to show a smaller version by reducing its height and width in a fixed ratio every time the image is requested?
How I am doing it currently - The latter sounds better to me because it won't take up much disk space, but it has to resize the image again and again. Which one do you think is better?
This is a general question for web application, no language in specific.
Any idea how facebook or google do it?
Question - My question is how to generate thumbnails and show them on a website: by creating a copy of the original image with smaller dimensions, or by generating the thumbnail dynamically every time it is requested.
Creating the thumbnail on upload is almost always the better option. If storage is a concern, you could convert them on request and then cache the result in a memory store. If it's requested again before the cache expires, no conversion is needed.
Storage is often pretty cheap, so I would probably not go for that extra complexity.
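A minimal sketch of that convert-on-request, cache-in-memory idea, assuming the APCu extension and Imagick; the cache key scheme and one-hour TTL are assumptions:

```php
<?php
// Return a resized JPEG blob, cached in APCu keyed by path, width and mtime.
function thumbnail(string $path, int $width): string
{
    $key  = 'thumb:' . $width . ':' . md5($path) . ':' . filemtime($path);
    $blob = apcu_fetch($key, $hit);
    if (!$hit) {
        $im = new Imagick($path);
        $im->thumbnailImage($width, 0);      // keep aspect ratio
        $blob = $im->getImageBlob();
        apcu_store($key, $blob, 3600);       // cache for one hour
    }
    return $blob;
}

header('Content-Type: image/jpeg');
echo thumbnail('/path/to/original.jpg', 200);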
Just create a thumbnail version and save to disk. Hard disk space is very cheap. £70 for a couple of TB.
"Better" depends on the criteria you set.
For most applications, disk space is not an issue - and if storing a thumbnail is a problem, storing the original must be a huge concern - a decent digital camera photo will run to many megabytes, whereas the thumbnail should not exceed 50K.
Bandwidth and performance (as perceived by the client) are usually bigger concerns. If you have lots of people browsing a gallery of image thumbnails, serving 50Kb thumbnails will be significantly faster (and cheaper in bandwidth) than serving multi-megabyte high resolution images.
In addition, by serving thumbnails on a URL like <img src="images/thumbnail/foobar.jpg"> and setting appropriate cache headers, you should get a lot of downstream caching - this is less likely if you serve the image as <img src="thumbnail.php?image=image/foobar.jpg"> because caches treat querystrings rather conservatively.
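If the thumbnail does go through a PHP script, this is a sketch of the kind of cache headers meant here (the one-year max-age is an assumption; static files on a plain URL remain the safer bet):

```php
<?php
// Serve a pre-generated thumbnail with long-lived caching headers.
$thumb = __DIR__ . '/images/thumbnail/foobar.jpg';

header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=31536000');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($thumb)) . ' GMT');
readfile($thumb);
```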
I used to work on a web site that managed hundreds of thousands of product images; we set up ImageMagick to create thumbnails automatically. Depending on your setup, it may make sense to do this when the thumbnail is first requested, rather than when the file is uploaded, because the conversion can be fairly resource hungry, and doing it at upload time would take longer than we wanted to wait. Modern hardware may make that a non-issue.
There's also a question about keeping the thumbnails in sync with the originals - if the user uploads a new image, you have to ensure you get the thumbnail updated; if the original is deleted, you must also delete the thumbnail.
Creating a thumbnail is the better option, and it doesn't cost much disk space. Your client will also load a smaller file when opening your pages. Converting the image upon request will cost even more time to load your page ;)
If you take a look at most CMS with this built in functionality they nearly always create a thumbnail image of the image on upload and store it on the server.
This goes back to the age old saying of "do what google does" but with CMS.

php image loading time consumption

I have a products website where in I have around 100 images of high quality. Each image is around 6-7MB in size.
In my database I have stored the path of each image along with its name. The images are saved in a folder /images/product_name/, but when I display these images on a web page, the page takes forever to load. All I do is query the table by id, get the image paths and display them on the products page.
It would be very helpful if I could get any sort of advice on how to optimize the process.
The images you send to the client are most likely way too big. 7MB sounds very large for a product picture, so that is 100*7MB = 700MB of data transferred if you display all the product images.
If you only need small images, scale them down to a few KB (thumbnails) and use those in your table.
NOTE: you can just prepend a prefix like "tmb_" or "tmb_200x200_" to the original filename, and you won't have to touch the paths in the database.
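To illustrate the prefix idea (the path below is hypothetical), the thumbnail lives next to the original and only the original path is stored in the database:

```php
<?php
// Derive the thumbnail path from the stored original path.
$original  = '/images/product_name/camera.jpg';               // hypothetical path
$thumbnail = dirname($original) . '/tmb_200x200_' . basename($original);
// -> /images/product_name/tmb_200x200_camera.jpg
```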
From reading your comments I think you are looking for some automated process to optimize the files and serve them at a more decent filesize.
You should take a look at ImageMagick or the GD library, which allow you to resize images (among many other things - http://php.net/manual/en/book.imagick.php) and optimize them. This can be combined with something like the YUI Image cropper to allow you to choose certain parts of the image to show in the thumbnail.
This is best done at the point of upload, so the server doesn't unnecessarily regenerate the images each time they are requested, with the result stored under a thumbnail column in the database.
If you must show bigger images I suggest you use a lightbox (see an example here - http://leandrovieira.com/projects/jquery/lightbox/) or similar technique that only loads the larger image when the client requests it.
Use http://luci.criosweb.ro/riot/ to compress images.
It is one of the simple and free tools recommended by Google to compress images.
Also, as mentioned before, make sure the thumbnails you send to the listing pages are around 100px, and only load the large images one by one when the user clicks to zoom in. Even in the zoom view you should display an image of at most 1000px, which should not be more than 200-500KB depending on format and quality.
Have you optimized your images (in terms of file size)? For example with Smush.it: http://www.smushit.com/ysmush.it/
You should really optimize your images. You can use a commercial product like Photoshop or use a free one like the GIMP to optimize the files.
Loading about 700 MBs of images will take exactly 1 day for me!!
Reduce the images to thumbnail size and, when the user clicks on a thumbnail, load the original file. This can improve performance.

image saving, resizing

I have one basic question. I have a project where I need several sizes of one picture.
Yes... during upload you make thumbnails... and so on... I know this story... performance vs. storage.
So I save the original image and make 2 thumbnail copies, for example with a max width of 100px and a max width of 200px, keeping the aspect ratio.
Now I need to show the image at a max width of 150px, so I take the saved 200px image and.....
I use getimagesize() to calculate the display width and height with respect to the ratio,
or I set max-width and max-height and leave it to the browser (the browser does it for me),
or I set the width and keep height: auto (but I also want to limit the max height).
So actually I use PHP and getimagesize(), but this function reads the file every time and I am a little worried. Processing 1 image is OK, but what about 20 or 100?
And... another idea: while uploading, I could also save the size information to the DB, but then I would have to save data for 3 images (now only the original one), which complicates everything.
So ... any ideas? What is your practice? THX.
Two images at most, a thumbnail and the original, are sufficient. Make sure that your upload page is well secured, because I've seen a website taken down through DoS (abusing an unprotected image-resizing page). Also limit the maximum upload size, to prevent abuse.
You can use the max-width and max-height CSS properties to limit the size of your images.
My approach
I wrote a pretty simple gallery application in php a while ago and this is how it works:
The images are stored in a folder with subfolders representing albums (and subalbums). They are uploaded via FTP and the webserver only has read-permissions on them.
For each image there are three versions:
a full one (the original)
a "mid" one (1024x768px max)
a "thumb" one (250x250px max)
All requests for images by the browser are served by php, and not-yet-existing versions are generated on the fly. The actual data is served through X-Sendfile, but that's an implementation detail.
I store the smaller versions in separate directories. When given a path to an original image, it is trivial to find the corresponding downscaled files (and check for existence and modification times).
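A rough sketch of that lookup, under the assumption that the smaller versions live in parallel cache directories and are rebuilt when missing or older than the original (directory names and sizes are assumptions):

```php
<?php
// Versions and their bounding boxes, mirroring the "mid" and "thumb" sizes above.
$sizes = ['mid' => [1024, 768], 'thumb' => [250, 250]];

// Map an original path to the path of a downscaled version.
function versionPath(string $originalPath, string $version): string
{
    return str_replace('/albums/', "/cache/$version/", $originalPath);
}

// Ensure the downscaled version exists and is up to date, then return its path.
function ensureVersion(string $originalPath, string $version, array $sizes): string
{
    $target = versionPath($originalPath, $version);
    if (!is_file($target) || filemtime($target) < filemtime($originalPath)) {
        [$maxW, $maxH] = $sizes[$version];
        $im = new Imagick($originalPath);
        $im->thumbnailImage($maxW, $maxH, true);   // true = fit inside the box
        if (!is_dir(dirname($target))) {
            mkdir(dirname($target), 0755, true);
        }
        $im->writeImage($target);
    }
    return $target;
}
```

The returned path can then be handed to X-Sendfile, as described above.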
Thoughts on your problem
Scaling images using HTML / CSS is considered bad practice for two simple reasons: if you are scaling up, you have a blurred image. If you are scaling down, you waste bandwidth and make your page slower for no good reason. So don't do it.
It should be possible to determine a pretty small set of required versions of each file (for example those used in a layout as in my case). Depending on the size and requirements of your project there are a few possibilities for creating those versions:
on the fly: generate / update them, when they are requested
during upload: have the routine that is called during the upload-process do the work
in the background: have the upload-routine add a job to a queue that is worked on in the background (probably most scalable but also fairly complex to implement and deploy)
Scaling down large images is a pretty slow operation (taking a few seconds usually). You might want to throttle it somehow to prevent abuse / DoS. Also limit dimensions and file size. A 100 MP (or even bigger) plain white (or any color) JPG might be very small when compressed, but will use an awful lot of RAM during scaling. Also big PNGs take really long to decompress (and even more to compress).
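A small sketch of the kind of sanity check meant here, rejecting oversized uploads before decoding them (the 20 MB and 40 megapixel limits are arbitrary assumptions):

```php
<?php
// Check file size and pixel dimensions before doing any scaling work.
$path = $_FILES['image']['tmp_name'];

if (filesize($path) > 20 * 1024 * 1024) {
    exit('File too large.');
}

$info = getimagesize($path);        // reads only the header, cheap
if ($info === false || $info[0] * $info[1] > 40000000) {
    exit('Unsupported or oversized image.');
}
```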
For a small website it doesn't matter, which approach you choose. Something that works (even if it doesn't scale) will do. If you plan on getting a good amount of traffic and a steady stream of uploads, then choose wisely and benchmark carefully.

PHP Image Optimizer

We are building a web app which will have a lot of images being uploaded. What is the best solution for optimizing these images and storing them on the website?
And is there a way I can also auto-enhance the images that are being uploaded?
Do not store images in DB, store them in file system (as real files). You'll probably need to store information about them in DB though, e.g., filename, time of upload, size, owner etc.
Filenames must be unique. You might use yyyymmddhhiissnnnn, where yyyymmdd is the year, month and day, hhiiss the hour, minutes and seconds, and nnnn the number of the image within that second, i.e., 0001 for the first image, 0002 for the second image, etc. This will give you unique filenames with fine ordering.
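A tiny sketch of that naming scheme; where the per-second counter comes from (a database sequence, for instance) is an assumption:

```php
<?php
// Build a yyyymmddhhiissnnnn filename from the current time and a counter.
function makeFilename(int $counterThisSecond): string
{
    return date('YmdHis') . str_pad((string) $counterThisSecond, 4, '0', STR_PAD_LEFT);
}

echo makeFilename(1) . '.jpg';   // e.g. 20240115123045 followed by 0001
```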
Think about making a logical directory structure. Storing millions of images in a single folder is not a good idea, so you will need something like images/<x>/<y>/<z>/<filename>. This could also be spanned across multiple servers.
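One common way to fill in <x>/<y>/<z> is to take the first characters of a hash of the filename; this is an assumption, not something prescribed above:

```php
<?php
// Shard images into directories based on a hash of the filename.
function imagePath(string $filename): string
{
    $hash = md5($filename);
    return sprintf('images/%s/%s/%s/%s', $hash[0], $hash[1], $hash[2], $filename);
}

echo imagePath('201401151230450001.jpg');
// -> images/<h1>/<h2>/<h3>/201401151230450001.jpg, where h1..h3 are the first hash characters
```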
Keep original images. You never know what you will want to do with images after year or two. You can convert them to some common format though, i.e., if you allow to upload JPG, PNG and other formats, you might store all of them as JPG.
Create and store all kinds of resized images that are necessary in your website. For example, social networks often have 3 kinds of resized images - one for displaying with user comments in various places (very small), one for displaying in profile page (quite small, but not like icon; might be some 240x320 pixels) and one for "full size" viewing (often smaller than original). Filenames of these related images should be similar to filenames of original images, e.g., suffixes _icon, _profile and _full might be added to filenames of original images. Depending on your resources and the amount of images that are being uploaded at the same time, you can do this either in realtime (in the same HTTP request) or use some background processing (cron job that continuously checks if there are new images to be converted).
As for auto enhancing images - it is possible, but only if you know exactly what must be done with images. I think that analyzing every image and deciding what should be done with it might be too complex and take too much resources.
All good suggestions from binaryLV above. Adding to his suggestion #5, you also probably want to optimize the thumbnails you create. When images are uploaded, they are likely to have metadata that is unnecessary for the thumbnails to have. You can losslessly remove the metadata to make the thumbnail sizes smaller, as suggested here: http://code.google.com/speed/page-speed/docs/payload.html#CompressImages. I personally use jpegtran on the images for my website to automatically optimize my thumbnails whenever they are created. If you ever need the metadata, you can get it from the original image.
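A sketch of that jpegtran step, run right after a thumbnail has been created; it assumes the jpegtran binary is installed on the server, and the paths are placeholders:

```php
<?php
// Losslessly strip metadata and optimize Huffman tables on a thumbnail.
$thumb = '/images/thumbs/foobar.jpg';
$cmd = sprintf(
    'jpegtran -copy none -optimize -outfile %s %s',
    escapeshellarg($thumb . '.tmp'),
    escapeshellarg($thumb)
);
exec($cmd, $output, $status);
if ($status === 0) {
    rename($thumb . '.tmp', $thumb);   // replace only if jpegtran succeeded
}
```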
Something else to consider if you plan to display these images for users is to host your images on a cookie-free domain or sub-domain as mentioned here: http://developer.yahoo.com/performance/rules.html#cookie_free. If the images are hosted on a domain or sub-domain that has cookies, then every image will send along an unnecessary cookie. It can save a few KB per image requested, which can add up to a decent amount, especially on restricted bandwidth connection such as on a mobile device.
