I'm developing with IPS 4, and I have a profile popup.
The profile cover takes a long time to load because the cover's dimensions and file size are large. So I've decided to make a PHP API which resizes images to the needed size and then serves the resized image.
Is this a good idea to make the cover load faster?
You need to populate a 436x85 box with user-provided pictures.
My own digital camera has an 18 MP sensor that produces 4896x3672 pictures weighing around 7 MB when compressed as JPEG. Imagine you display, say, a dozen profiles per page. That's 84 MB worth of network transfer (more than a typical MP3-encoded music album) for a single page. JPEG compression roughly achieves a 1/10 ratio, so you can assume something like 840 MB of RAM just to hold the decompressed pictures. And then you have the overhead of the browser resampling the pictures in real time.
On the other hand, a 436x85 JPEG takes 8 to 22 KB on average (depending on quality settings).
So if you use the raw pictures uploaded by users, of course it's not going to be fast.
Conclusion: always resize pictures yourself. And please do it only once; it's a heavy process even for your server.
Yes, it is a good idea to store not only the original image but the resized versions too. Otherwise every user who requests the page downloads the full-size image, which is basically a waste of transfer and makes the user wait, leading to a poor user experience.
You should make a script which resizes newly uploaded images, saves them on your server, and serves those instead of the big originals. Also keep in mind that resizing is really CPU-heavy, so it's a good idea to queue this work rather than doing it instantly during the user's request.
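A minimal sketch of such an upload-time resize using PHP's GD extension; the 436x85 cover box comes from the question, while the file paths, JPEG-only input and quality setting are assumptions:

<?php
// Minimal upload-time resize using GD. The 436x85 target comes from the
// question; paths, JPEG-only input and the quality setting are assumptions.
function createCover(string $sourcePath, string $targetPath, int $targetW = 436, int $targetH = 85): void
{
    [$srcW, $srcH] = getimagesize($sourcePath);
    $src = imagecreatefromjpeg($sourcePath);

    // Crop-to-fill: scale so the target box is covered, then crop the overflow.
    $scale = max($targetW / $srcW, $targetH / $srcH);
    $cropW = (int) round($targetW / $scale);
    $cropH = (int) round($targetH / $scale);
    $srcX  = (int) (($srcW - $cropW) / 2);
    $srcY  = (int) (($srcH - $cropH) / 2);

    $dst = imagecreatetruecolor($targetW, $targetH);
    imagecopyresampled($dst, $src, 0, 0, $srcX, $srcY, $targetW, $targetH, $cropW, $cropH);
    imagejpeg($dst, $targetPath, 82);   // quality 82 keeps a cover in the tens of KB

    imagedestroy($src);
    imagedestroy($dst);
}

// Run once, right after the uploaded file has been moved into place
// (or from a queued job, as suggested above).
createCover('/uploads/covers/original_123.jpg', '/uploads/covers/cover_123.jpg');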
Related
So I have a platform which allows users to upload a fair number of pictures. At the moment, my server resizes and saves all images individually to my CDN (so I can pick the best option to reduce load time when a user requests to view one), but it seems very wasteful in terms of server storage.
The images are being converted into resolutions of 1200px, 500px, 140px, 40px and 24px.
What I'm wondering is, would it be more efficient to just save the file at 1200px, then serve it via PHP at the requested size using something like ImageMagick? Would there be any major trade-offs and if so, is it worth it?
What I'm doing right now:
https://v1x-3.hbcdn.net/user/filename-500x500.jpg
An example of what I could do:
https://v1x-3.hbcdn.net/image.php?type=user&file=filename&resolution=500
Cheers.
No, it's not, because:
you have a small number of sizes
if you don't use caching (i.e. generate each image on the first request only), you can DDoS yourself, since image processing is a CPU-intensive task
you'll have to do extra work if you use a CDN like Cloudflare for HTTP caching
It makes sense if you have a lot of image sizes, for example an API that supports many Android/iOS devices: say the iPhone 3 only supports a 320x320 image, and if you don't have any users with that device, your server never has to create that size.
Advice:
During image generation, apply optimization; it reduces image size with imperceptible loss of quality.
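To illustrate the caveats above, here is a rough sketch of the image.php idea from the question, with a size whitelist and a disk cache added so each size is generated only once and HTTP/CDN caching still works; the directory layout and parameter names are assumptions:

<?php
// image.php - sketch only: whitelisted sizes, disk cache, long cache headers.
$allowed = [1200, 500, 140, 40, 24];

$file = basename($_GET['file'] ?? '');          // strip any path components
$size = (int) ($_GET['resolution'] ?? 0);
if ($file === '' || !in_array($size, $allowed, true)) {
    http_response_code(400);
    exit;
}

$original = "/srv/images/originals/$file.jpg";
$cached   = "/srv/images/cache/{$file}-{$size}.jpg";

if (!is_file($cached)) {
    if (!is_file($original)) { http_response_code(404); exit; }
    $img = new Imagick($original);
    $img->thumbnailImage($size, 0);             // height 0 keeps the aspect ratio
    $img->writeImage($cached);
    $img->clear();
}

header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=31536000');  // let the CDN/browser keep it
readfile($cached);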
I am working on a project that has a feature where a user can upload an image. That image will be used on different pages of the website, at different sizes (e.g. 200x300, 360x520, 700x1000).
I can create thumbnails in two ways:
while uploading the image, create thumbnails at the different sizes and store them;
point the image src at a server-side script that resizes the image on the fly and outputs it, instead of serving a stored file.
Which is the correct way to do this? If I use the 1st method, I think disk space will fill up very fast. Is there any issue with the 2nd method?
The advantage of method 1 is that you won't risk resizing the same image twice simultaneously, and that you can quickly provide a version to display to your user.
But why would your disk fill up fast if you resize the images beforehand? That would only happen if you resized them to every possible size, and that's not a good idea anyway.
Method 2 is more flexible. You just ask for an image in a given size and your script will produce it on the fly. This is a bit slower, of course, but you could cache the resized image so visitors will get the images fast, unless they are the first one to request an image in a specific size.
My suggestion:
If you know which sizes you use on your website, you could resize the images to those sizes at an early stage. You could even make a configuration on your website with a set of predefined image dimensions, use those dimensions throughout the site, and use the same configuration to scale the images when they are uploaded. This has some advantages:
Having a limited set of sizes increases the chance of hitting the cache when visitors browse through your website. For instance, if you show a picture of X on the X detail page, on an overview page including X, in search results, etcetera, and each of those pages uses a slightly different size, that is a waste of disk space and bandwidth.
And, if disk size is an issue, having a limited number of sizes also limits the disk space that the cached versions of these images consume.
Moreover, if you change or add a dimension, you could pregenerate all images for that size immediately, so visitors would benefit right away from the cached version.
And of course, it also makes it easier to purge the cache for images of a dimension that is no longer in the list, should you remove or change one.
If you choose this approach, it becomes very easy to implement method 1 and pre-cache everything, but even if you choose method 2 (if only as a fallback, should a cached version not exist), it would still have benefits.
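As a rough illustration of the predefined-dimensions idea (using Imagick; the labels, sizes and paths below are assumptions based on the question's examples), one configuration array can drive both the upload-time resize and the pregeneration of a newly added size:

<?php
// One configuration drives everything; labels, sizes and paths are assumptions.
$dimensions = [
    'detail'   => [700, 1000],
    'overview' => [360, 520],
    'thumb'    => [200, 300],
];

function resizeTo(string $source, string $target, int $w, int $h): void
{
    $img = new Imagick($source);
    $img->thumbnailImage($w, $h, true);   // "bestfit": fit inside the box, keep ratio
    $img->writeImage($target);
    $img->clear();
}

// On upload: generate every configured size once.
function generateAll(string $source, string $name, array $dimensions): void
{
    foreach ($dimensions as $label => [$w, $h]) {
        resizeTo($source, "/srv/images/$label/$name.jpg", $w, $h);
    }
}

// When a new dimension is added to the config: pregenerate it for existing originals.
function pregenerate(string $label, int $w, int $h): void
{
    foreach (glob('/srv/images/originals/*.jpg') as $source) {
        $name = basename($source, '.jpg');
        resizeTo($source, "/srv/images/$label/$name.jpg", $w, $h);
    }
}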
Problem - I wanted to set up an image-uploading feature on my website, but I wanted to show both the original image and a small thumbnail of it.
Choices - Which way is better: to create a separate image (thumbnail) in a directory when the image is uploaded, or to show a smaller version by reducing its height and width in a fixed ratio every time the image is requested?
How I am doing it currently - The latter sounds better to me because it doesn't take up extra disk space, but it has to resize the image again and again. Which one do you think is better?
This is a general question for web application, no language in specific.
Any idea how facebook or google do it?
Question - My question is how to generate thumbnails and show them on a website - by creating a copy of the original image with smaller dimension or by generating the thumbnail dynamically every time it is requested.
Creating the thumbnail on upload is almost always the better option. If storage is a concern, you could convert on request and then cache the result in a memory store. If it's requested again before the cache expires, no conversion will be needed.
Storage is often pretty cheap, so I would probably not go for that extra complexity.
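If you do go the convert-on-request route with a memory store, it could look roughly like this with APCu; the key naming, TTL and resize step below are assumptions, not a fixed recipe:

<?php
// Sketch: resize on request, cache the resulting bytes in APCu for an hour.
function thumbnailBytes(string $path, int $width): string
{
    $key = "thumb:$path:$width";
    $bytes = apcu_fetch($key, $hit);
    if ($hit) {
        return $bytes;                   // served from memory, no conversion
    }

    $img = new Imagick($path);
    $img->thumbnailImage($width, 0);     // keep aspect ratio
    $img->setImageFormat('jpeg');
    $bytes = $img->getImageBlob();
    $img->clear();

    apcu_store($key, $bytes, 3600);      // expire after an hour
    return $bytes;
}

header('Content-Type: image/jpeg');
echo thumbnailBytes('/srv/uploads/photo123.jpg', 200);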
Just create a thumbnail version and save to disk. Hard disk space is very cheap. £70 for a couple of TB.
"Better" depends on the criteria you set.
For most applications, disk space is not an issue - and if storing a thumbnail is a problem, storing the original must be a huge concern - a decent digital camera photo will run to many megabytes, whereas the thumbnail should not exceed 50 KB.
Bandwidth and performance (as perceived by the client) are usually bigger concerns. If you have lots of people browsing a gallery of image thumbnails, serving 50 KB thumbnails will be significantly faster (and cheaper in bandwidth) than serving multi-megabyte high-resolution images.
In addition, by serving thumbnails on a URL like <img src="images/thumbnail/foobar.jpg"> and setting appropriate cache headers, you should get a lot of downstream caching - this is less likely if you serve the image as <img src="thumbnail.php?image=image/foobar.jpg"> because caches treat query strings rather conservatively.
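If you end up serving thumbnails through a script anyway, you can at least send explicit cache headers so downstream caches are allowed to keep them; a rough sketch, where the one-year lifetime and the ETag scheme are assumptions:

<?php
// thumbnail.php - sketch: send long-lived cache headers and honour conditional requests.
$path = '/srv/images/thumbnails/foobar.jpg';   // resolve from the request in practice
$etag = '"' . md5_file($path) . '"';

header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=31536000');
header('ETag: ' . $etag);

if (($_SERVER['HTTP_IF_NONE_MATCH'] ?? '') === $etag) {
    http_response_code(304);     // the client already has it, send nothing
    exit;
}
readfile($path);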
I used to work on a web site that managed hundreds of thousands of product images; we set up ImageMagick to create thumbnails automatically. Depending on your setup, it may make sense to do this when the thumbnail is first requested, rather than when the file is uploaded, because the conversion can be fairly resource hungry, and doing it at upload time would take longer than we wanted to wait. Modern hardware may make that a non-issue.
There's also a question about keeping the thumbnails in sync with the originals - if the user uploads a new image, you have to ensure you get the thumbnail updated; if the original is deleted, you must also delete the thumbnail.
Creating a thumbnail is the better option, and it doesn't cost much disk space. Your client will also download less when opening your pages. Converting the image upon request will cost even more time to load your page ;)
If you take a look at most CMSs with this functionality built in, they nearly always create a thumbnail of the image on upload and store it on the server.
This goes back to the age-old saying of "do what Google does", but with CMSs.
I'm in the process of making a website for a school group, and the website will have a lot of pictures of its activities. I'm using Lightbox so the pictures display as a slideshow, and I changed the dimensions of the pictures so the "current" picture in the slideshow isn't as big as the original. I still see about a 5-second delay when opening a picture or going to the next one. I'm wondering if there's a way to achieve a faster load time, or even another method I didn't consider.
I'm using XHTML, CSS, and PHP for my site.
Sorry for answering instead of commenting, but I can't write comments yet...
I do it another way: after upload, I resample the picture into a big version and a thumbnail.
Of course you can resample the picture every time, but that is probably the reason you have to wait so long, and it is a lot of work for the server if you have many visitors at the same moment.
So for me the best way is to decide that the big image is 640x480 max, and save the picture at that size after upload, resampled at the same aspect ratio of course.
Edit: From your post, I can't tell whether you are resizing/resampling the image, where you do it - in HTML by setting height and width, or in PHP - and how often.
Let's say you have a picture on the server, and its dimensions are 8000x6000. Its file size could be something like 10 MB.
Now let's say you want to display this image in a web page and do it like this:
<img src="largeImage.jpg" width="800" height="600"/>
The browser will download the large image (10 MB), which takes a whole lot of time, and will then resize it to 800x600 in memory to display it on the web page (this consumes memory and CPU time). Total time: 25 seconds.
Now suppose you resize this image to 800x600 and put this resized image on the server. You then display the image with
<img src="smallImage.jpg" width="800" height="600"/>
The small image will look identical to the user visiting your web page, but the browser will only have to download a small image, which will be something like 100 KB large (100 times less than the large image). The time taken to download the image will be divided by 100 as well (0.25 seconds), and the browser won't have to load and resize a huge image in memory (less memory, less CPU time). Your image will be visible almost instantly.
There are many tools (I use Irfanview myself) which are able to take a large collection of images and resize them all at once. You should do that.
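If you'd rather script it than use a desktop tool, a one-off batch resize of the whole collection could look roughly like this in PHP with Imagick (directory names and the 800x600 target are assumptions matching the example above):

<?php
// One-off batch resize of an existing collection; safe to re-run.
foreach (glob('/srv/photos/originals/*.jpg') as $source) {
    $target = '/srv/photos/small/' . basename($source);
    if (is_file($target)) {
        continue;                         // already done
    }
    $img = new Imagick($source);
    $img->thumbnailImage(800, 600, true); // fit inside 800x600, keep the aspect ratio
    $img->writeImage($target);
    $img->clear();
}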
I'm building an image gallery which presents a few images on the front page. The images are larger than the actual size displayed on the front page, which leads me to the following question:
If cache is not an option, what would be better:
Using php to shrink the image and send it to the client.
Send the original full size image and let the client shrink it (with simple width and height attributes)
I tend to think that the second is a better solution, but I'd like to hear more opinions.
Thanks!
Edit:
When people upload the images, I create thumbnails for them to be displayed when browsing the site.
The "cache is not an option" reason:
The discussed images are 5 "featured" images on the front page which will not stay the same for more than an hour at most, so isn't it a waste to create another copy of every uploaded image just for that?
Essentially, it depends on
What's the original-to-desired width/height ratio? It's not a big deal serving a 500x500 image and showing it as 250x250, but wasting bandwidth on 1920x1080 images is. Also, mobile devices might not have enough resources available to actually display the webpage if you serve too many big images.
What do you have more of: bandwidth or CPU power? Can you make sure nobody uses your on-the-fly resizer as DOS target?
Generally solutions with a cache, even a very temporary one, are much better though.
[AD edit]
The discussed images are 5 "featured" images on the front page which will not stay the same for more than an hour at most, so isn't it a waste to create another copy of every uploaded image just for that?
It is. But you could simply create a separate folder for these thumbnails, and set up a cron job to wipe old files. Here's an example I use on my site (it runs every 15 minutes and removes files older than 30 minutes):
*/15 * * * * find /var/www/directory/ -mmin +30 -exec rm -f {} \; >/dev/null 2>&1
Given 'enough' CPU resources, I would prefer to shrink images before sending them, to go easy on people with bad connections and mobile devices.
Another option and my preferred strategy would be to keep smaller versions of the images and then use them. If the images are uploaded at some point, then create a smaller version of the image on upload.
It kind of depends on your flow, but I would resize them on the fly and save the thumb. So if the thumb exists, serve it; if not, resize on the fly and serve that (while saving the thumb).
Then in a cronjob you can remove old images.
How about 3: don't resize the images in the process that's supposed to serve them to the client - make a background process do the resizing, and send the thumbnails if they have been resized, full images if not yet. This has the advantage that you can throttle the resizing process independently of user requests.
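A rough sketch of that third option: a background job (run from cron, for instance) creates the missing thumbnails, and the page simply links to whichever version exists at that moment; the paths and the 300px width are assumptions:

<?php
// worker.php - run periodically (e.g. from cron), independent of user requests.
// It creates any thumbnails that don't exist yet.
foreach (glob('/srv/gallery/originals/*.jpg') as $source) {
    $thumb = '/srv/gallery/thumbs/' . basename($source);
    if (is_file($thumb)) {
        continue;                         // already generated, safe to re-run
    }
    $img = new Imagick($source);
    $img->thumbnailImage(300, 0);         // 300px wide, height follows the aspect ratio
    $img->writeImage($thumb);
    $img->clear();
}

// ---- and in the page template (separate file): prefer the thumb, fall back to the full image ----
$name = 'photo123.jpg';
$src  = is_file("/srv/gallery/thumbs/$name") ? "/gallery/thumbs/$name" : "/gallery/originals/$name";
echo '<img src="' . htmlspecialchars($src) . '" width="300" alt="">';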