I have a function in PHP that resizes images into thumbnails. My image upload script takes an uploaded image and runs this function to resize the image if it is wider than 700px; it then also runs the function two more times to create two differently sized thumbnail images, so a total of three images are saved each time a user uploads an image. In short, my resize/thumbnail function is called twice for the thumbnails, and an occasional third time if the file's dimensions are too wide.
Now this resizing function uses getimagesize() to get the dimensions, so my upload script calls it, and then the resizing function calls getimagesize() 2-3 more times to make the other sized images.
I am thinking that I should just pass the dimensions on to the resize function, since I already get them during the upload process?
My real question is: is getimagesize() a resource-hungry function? Would it be best to call it as little as possible, or is calling it a few times when one image is uploaded fine?
Just a tip/precaution: I assume you're using the GD functions. When creating more than one thumbnail, the usual bottleneck is a wrongly implemented resizing function that reads the original image from disk and then saves the resized copy for every single thumbnail. A better way is to load the image once and reuse the resulting image resource to make all the thumbnails with imagecopyresampled() - don't only pass the dimensions of the image to the function, pass the GD resource as well. That way your original file gets loaded only once.
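For illustration, here is a minimal sketch of that idea, assuming a JPEG upload and the GD extension; the target widths, quality setting and output filenames are just placeholders:

// Resize an already-loaded GD image to a given width, keeping the aspect ratio
function makeThumb($src, int $srcW, int $srcH, int $dstW, string $outFile): void
{
    $dstH = (int) round($srcH * ($dstW / $srcW));
    $dst  = imagecreatetruecolor($dstW, $dstH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $dstW, $dstH, $srcW, $srcH);
    imagejpeg($dst, $outFile, 85);
    imagedestroy($dst);
}

list($w, $h) = getimagesize($uploadedFile);      // dimensions read once
$original = imagecreatefromjpeg($uploadedFile);  // original decoded once

if ($w > 700) {
    makeThumb($original, $w, $h, 700, 'resized.jpg');
}
makeThumb($original, $w, $h, 250, 'thumb-medium.jpg');
makeThumb($original, $w, $h, 100, 'thumb-small.jpg');

imagedestroy($original);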
For something that runs only on upload, it shouldn't matter that much. Uploading is not an action where the user expects a super fast response. Premature optimization is the root of all evil.
That being said, getimagesize() is not particularly costly, but if you can call it just once, do so. I wouldn't expect much of a speed increase from it, though. The costly part of your script is the image resizing itself.
It's not particularly resource hungry - it has to open a file and read the image header.
Don't go out of your way to optimize it away - if it's easy, do it. Otherwise, wait and see where the real bottlenecks are in your application before optimizing.
The best thing to do is to profile your scripts.
Instead of theoretical answers which may not apply to a specific situation, you get a real answer and it is really instructive.
Also, with this habit, you will be able to:
discover bottlenecks
tell the difference between a micro-optimization and a major optimization.
I personally develop on Windows and deploy on *nix.
On my dev box, I use Xdebug + WinCacheGrind to read the results.
I could not live without them. :)
http://elrems.wordpress.com/2008/02/12/profiling-php-with-xdebug-and-wincachegrind/
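For reference, the profiler is switched on through php.ini; this snippet uses the Xdebug 2 directive names from the era of that article, so treat it as a sketch and check your Xdebug version (Xdebug 3 uses xdebug.mode=profile instead):

; php.ini - enable the Xdebug profiler (Xdebug 2 directive names)
zend_extension=xdebug.so
xdebug.profiler_enable=1
xdebug.profiler_output_dir=/tmp/xdebug
; open the generated cachegrind.out.* files in WinCacheGrind or KCachegrind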
I don't think the uploading step should be where you resize the image. You should resize the image at a later time as a cron job. You can use a third-party application like ImageMagick, or some other resizing tool, to resize the images. That way you save time on the front end. You can run the resize job every 5 minutes or so, as sketched below.
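As a rough sketch of that approach (everything here - the directory layout, sizes and schedule - is an assumption), a cron entry such as */5 * * * * php /path/to/resize-pending.php could run a small Imagick script over a queue of freshly uploaded files:

// resize-pending.php - hypothetical cron-run resizer using Imagick
$pending = glob('/var/www/uploads/pending/*.jpg');
foreach ($pending as $file) {
    $img = new Imagick($file);
    $img->thumbnailImage(700, 0);   // 700px wide, 0 = keep aspect ratio
    $img->writeImage('/var/www/uploads/resized/' . basename($file));
    $img->destroy();
    // move the original out of the queue so it is not processed twice
    rename($file, '/var/www/uploads/originals/' . basename($file));
}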
Related
I need an image resize system for a project, so I decided to upload the original images into the DB, encoded with base64; I would like to keep only the original images stored, since they are easy to manage (add/edit/delete).
The output is generated dynamically with PHP and Imagick at the moment.
The main problem I've run into is the slow output time; specifically, the processing time is too long because I resize and compress on every request.
I need this compression because my visitors have slow internet connections and sometimes the images are really big for just a preview.
An alternative solution I thought of is to store some resized images in the DB, but that won't be efficient because it will take more space and the image sizes will change over time.
So, my question is: is there a method to deliver images dynamically faster? How?
You won't get around caching the resized images in some way. The thumbnails don't need to be stored in the database; you could also write them to disk. Only recompute the resized versions if the original image has changed.
This question already has answers here: PHP image resize on the fly vs storing resized images (4 answers). Closed 9 years ago.
Problem - I wanted to set up an image-uploading feature on my website, but I wanted to show both the original image and a small thumbnail of it.
Choices - Which way is better: to create a separate image (thumbnail) in a directory when the image is uploaded, or to show a smaller version by reducing its height and width in a fixed ratio every time the image is requested?
How I am doing it currently - The latter one sounds better to me because it won't take up as much disk space, but it has to resize the image again and again. Which one do you think is better?
This is a general question for web application, no language in specific.
Any idea how Facebook or Google do it?
Question - My question is how to generate thumbnails and show them on a website: by creating a copy of the original image with smaller dimensions, or by generating the thumbnail dynamically every time it is requested?
Creating the thumbnail on upload is almost always the better option. If storage is a concern, you could convert them on request and then cache the result in a memory store. If it's requested again before the cache expires, no conversion will be needed.
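A rough sketch of that "convert on request, cache in a memory store" idea using APCu; the key scheme, the 150px width and the one-hour TTL are assumptions:

// thumb.php?f=foobar.jpg - hypothetical on-demand thumbnail cached in APCu
$name = basename($_GET['f']);          // strip any path components
$key  = 'thumb:' . $name;

$data = apcu_fetch($key, $hit);
if (!$hit) {
    $img = new Imagick('/var/www/images/' . $name);
    $img->thumbnailImage(150, 0);      // 150px wide, keep aspect ratio
    $data = $img->getImageBlob();
    $img->destroy();
    apcu_store($key, $data, 3600);     // keep for an hour
}

header('Content-Type: image/jpeg');
echo $data;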
Storage is often pretty cheap, so I would probably not go for that extra complexity.
Just create a thumbnail version and save it to disk. Hard disk space is very cheap - £70 for a couple of TB.
"Better" depends on the criteria you set.
For most applications, disk space is not an issue - and if storing a thumbnail is a problem, storing the original must be a huge concern - a decent digital camera photo will run to many megabytes, whereas the thumbnail should not exceed 50K.
Bandwidth and performance (as perceived by the client) are usually bigger concerns. If you have lots of people browsing a gallery of image thumbnails, serving 50Kb thumbnails will be significantly faster (and cheaper in bandwidth) than serving multi-megabyte high resolution images.
In addition, by serving thumbnails on a URL like <img src="images/thumbnail/foobar.jpg"> and setting appropriate cache headers, you should get a lot of downstream caching - this is less likely if you serve the image as <img src="thumbnail.php?image=image/foobar.jpg">, because caches treat querystrings rather conservatively.
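If you do end up serving thumbnails through a script, you can still encourage downstream caching by sending explicit headers; a minimal sketch, where $thumbFile and the one-week max-age are assumptions:

// inside thumbnail.php, once $thumbFile points at the cached thumbnail on disk
header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=604800');   // one week
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($thumbFile)) . ' GMT');
readfile($thumbFile);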
I used to work on a web site that managed hundreds of thousands of product images; we set up ImageMagick to create thumbnails automatically. Depending on your setup, it may make sense to do this when the thumbnail is first requested, rather than when the file is uploaded, because the conversion can be fairly resource hungry, and doing it at upload time would take longer than we wanted to wait. Modern hardware may make that a non-issue.
There's also a question about keeping the thumbnails in sync with the originals - if the user uploads a new image, you have to ensure the thumbnail gets regenerated; if the original is deleted, you must also delete the thumbnail.
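A minimal sketch of the deletion side, assuming a hypothetical "-thumb" naming convention:

// when an original is removed, remove its thumbnail as well
function deleteImage(string $original): void
{
    $thumb = preg_replace('/\.jpg$/', '-thumb.jpg', $original);  // assumed naming scheme
    if (is_file($original)) {
        unlink($original);
    }
    if (is_file($thumb)) {
        unlink($thumb);
    }
}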
Creating a thumbnail is the better option, and it doesn't cost much disk space. Your client will also load a smaller file when opening your pages. Converting the image upon request will cost even more time to load your page ;)
If you take a look at most CMSs with this functionality built in, they nearly always create a thumbnail of the image on upload and store it on the server.
This goes back to the age old saying of "do what google does" but with CMS.
Coming from a Rails background, I used to work with the paperclip plugin, which created thumbnails in predefined sizes from image attachments.
In WordPress, I am a little confused. Here is my question, or the points that need more clarification:
Does timthumb create thumbs and save them to disk when images are first uploaded, or does it just resize images on the fly and cache them?
If it resizes on the fly, why do I see different sizes of each image in the uploads directory, such as filename.jpg, filename-150x150.jpg and so on?
Isn't it better for performance to just create the thumbs once upon upload and serve them directly without calling a script? If so, how do I implement this?
Typically it resizes and caches the images upon their first request.
You're probably seeing the image resizing that WP does itself. These sizes are controlled in the settings.
Debatable. Yes, in that it could all be done at your command; no, in that someone could upload 1000 images, and resizing them all at once could cause problems for the site, so spreading out the resizing helps balance the load. Also, parameters can be passed to the script when a page loads to create a custom thumbnail. So if you decide your thumbnails were 10px too narrow, you can run it again and it will resize from the original. Plus the filename of the base image remains untouched - if your code says image.jpg, it will always be image.jpg, no matter the size. So if you've got 10,000 instances of thumbnails, and they all reference image-150x150, and now you want them to be 160x160, you either have to change the image names being referenced or live with a nonsensical filename. TimThumb provides a pretty good workaround for that.
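For illustration, a typical TimThumb-style call keeps the markup pointing at the original file and puts the size in the query string (treat the exact parameter names as an assumption for your version):

<img src="/timthumb.php?src=/wp-content/uploads/image.jpg&w=160&h=160" alt="">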
Here are some basic timthumb/WordPress performance tips: http://www.dollarshower.com/timthumb-and-wordpress-blog-performance/
I'm currently working on a portfolio site wherein I would have to load a large quantity of photos on the same page.
These images are loaded dynamically through PHP, and at the moment I have opted to save thumbnail versions of these images beforehand and load these.
My concern, however, is that this might not be an optimal solution should the site have a large number of users: I would basically have duplicates of each image multiplied by the number of images users have uploaded.
My question for you is whether or not there are better solutions to this problem? A way to load a page as fast as possible without compromising too much space?
Thanks a ton.
Loading a webpage full of lots of images will be slow. The reason for this is because of the bandwidth needed to transfer these images.
You have a couple of options:
Load full images in tiled mode. These images will be full images, just resized to fit in a "thumbnail" view. The advantage of this is that you are only saving one image, but that image is full sized and will take a long time to load.
Load the thumbnails as you said you are doing. The advantage to this is performance, but you need to store two copies of each image. Also, depending on how you tackle the thumbnail creation, you may require users to upload two copies of the image to provide their own thumbnail... which could stink.
Load thumbnails, but generate them automatically on upload. You are essentially keeping two copies of the image on disk, but you are creating the thumbnail through some PHP image manipulation API. This will load faster but still eats up disk space. It also minimizes the user/administrative requirement to provide a thumbnail.
Load thumbnails on-demand, as the page is requested. This approach would take some testing, as I've never tried it. Basically, you would invoke the PHP image manipulation API (or better yet, out-source to a native solution!) to create a one-time-use (or cached) thumbnail to be used. You might say "OMG, that'll take so long!". I think this approach might actually be usable if you apply an appropriate caching mechanism so you aren't constantly recreating the same thumbnails. It will keep down on bandwidth, and since the limiting factor here is the network connection, it might be faster than just sending the full images (since the limiting factor of creating the thumbnails would now be CPU/memory/hard disk).
I think #4 is an interesting concept, and might be worth exploring.
What I think is:
A. Take advantage of caching (system cache, cache headers, HTTP cache... all of it)
B. Don't generate thumbs all the time
C. Use a job queuing system such as Gearman or beanstalkd to generate thumbs so that you don't have to do it instantly
D. Use Imagick, which is more efficient
E. Paginate
F. For example, only generate a thumb when the original file has been modified:
$file = "a.jpg";
$thumbFile = "a.thumb.jpg";
$createThumb = true;

// Skip regeneration if the thumbnail is newer than the original
// (the 10 second margin allows for small timestamp differences)
if (is_file($thumbFile)) {
    if ((filemtime($file) - 10) < filemtime($thumbFile)) {
        $createThumb = false;
    }
}

if ($createThumb === true) {
    $thumb = new Imagick();
    $thumb->readImage($file);
    $thumb->thumbnailImage(50, null);   // 50px wide, height keeps aspect ratio
    $thumb->writeImage($thumbFile);
    $thumb->destroy();
}
Consider using a sprite or a collage of all the images, so that only one larger image is loaded, saving bandwidth and decreasing page load time.
Also, as suggested already, pagination and async loading can improve it somewhat.
References:
http://css-tricks.com/css-sprites/
http://en.wikipedia.org/wiki/Sprite_(computer_graphics)#Sprites_by_CSS
Yes, caching the thumbnails is a good idea, and will work out fine when done right. This kind of thing is a great place for horizontal scaling.
You should look into using a CDN (or possibly implementing something similar). The caching system will generate images of commonly-requested sizes, and groom those off if they become infrequently requested at a later time. You could also scale horizontally and transplant this service to another server or vserver in your network or cloud.
It is a disk-intensive and bandwidth-intensive thing, image delivery. Not as bad as video!
Generate thumbnail of large images on-the-fly. (Some hint)
Load images asynchronously (Try JAIL)
Paginate
Caching is also an option, works from the second time the site is loaded though.
I'm building an image gallery which presents a few images on the front page. The images are larger than the actual size displayed on the front page, which leads me to the following question:
If caching is not an option, which would be better:
Using PHP to shrink the image and sending it to the client.
Sending the original full-size image and letting the client shrink it (with simple width and height attributes).
I tend to think that the second is a better solution, but I'd like to hear more opinions.
Thanks!
Edit:
When people upload the images, I create thumbnails for them to be displayed when browsing the site.
The "cache is not an option" reason:
The discussed images are 5 "featured" images on the front page which will not stay the same for more than an hour max - so isn't it a waste to create another image copy of every uploaded image just for that?
Essentially, it depends on:
What's the original-to-desired width/height ratio? It's not a big deal serving a 500x500 image and showing it as 250x250, but wasting bandwidth on 1920x1080 images is. Also, mobile devices might not have enough resources available to actually display the webpage if you serve too many big images.
What do you have more of: bandwidth or CPU power? Can you make sure nobody uses your on-the-fly resizer as a DoS target? (One common guard is to whitelist the sizes you accept, as sketched below.)
Generally solutions with a cache, even a very temporary one, are much better though.
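As a sketch of that whitelist idea, reject any requested width that is not explicitly allowed; the size list and parameter name are assumptions:

// thumb.php - refuse arbitrary dimensions so the resizer can't be abused
$allowed = array(100, 250, 500);
$w = isset($_GET['w']) ? (int) $_GET['w'] : 250;
if (!in_array($w, $allowed, true)) {
    http_response_code(400);
    exit('unsupported size');
}
// ...resize to $w and serve/cache the result here...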
Regarding the edit ("The discussed images are 5 'featured' images on the front page which will not stay the same for more than an hour max - so isn't it a waste to create another image copy of every uploaded image just for that?"):
It is. But you could simply create a separate folder for these thumbnails and set up a cron job to wipe files older than an hour. Here's an example I use on my site (it runs every 15 minutes and deletes files older than 30 minutes):
*/15 * * * * find /var/www/directory/ -mmin +30 -exec rm -f {} \; >/dev/null 2>&1
Given 'enough' CPU resources, I would prefer to shrink images before sending them, to go easy on people with bad connections and mobile devices.
Another option and my preferred strategy would be to keep smaller versions of the images and then use them. If the images are uploaded at some point, then create a smaller version of the image on upload.
It kind of depends on your flow, but I would resize them on the fly and save the thumb. So if the thumb exists, serve it; if not, resize on the fly and serve that (while saving the thumb).
Then a cron job can remove old images.
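A minimal sketch of that flow with Imagick; the paths, the 150px width and the query parameter are assumptions:

// thumb.php?f=photo.jpg - serve the cached thumb if it is fresh, otherwise (re)create it
$name  = basename($_GET['f']);
$src   = '/var/www/images/' . $name;
$thumb = '/var/www/cache/thumbs/' . $name;

if (!is_file($thumb) || filemtime($thumb) < filemtime($src)) {
    $img = new Imagick($src);
    $img->thumbnailImage(150, 0);   // 150px wide, keep aspect ratio
    $img->writeImage($thumb);
    $img->destroy();
}

header('Content-Type: image/jpeg');
readfile($thumb);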
How about 3: don't resize the images in the process that's supposed to serve them to the client - have a background process do the resizing, and send the thumbnails if they have been resized, full images if not yet. This has the advantage that you can throttle the resizing process independently of the user requests.