Server-side image resizing performance - php

I have been noticing more and more that the big sites (Google, Facebook, YouTube and co) resize images "close" to their desired dimensions client side, and I am wondering whether this is a shift in the way people are thinking or just laziness.
Take the scenario of adding a new image size to a standard set of sizes for a set of e-commerce products numbering into the hundreds of thousands, maybe millions.
Imagine I have a copy of my original image at 300x350 or whatever and, client side, resize it to 200x250. I do this for each of 20 products on a page.
Is the server-side work and trouble of accommodating this new size really worth the benefit client side?
If not, then what is a good way to judge when you should pre-process a certain size?
If yes, is there ever a time where server-side processing and caching might become overkill (i.e. housing 8 images of 110x220, 120x230, 150x190 etc)?

Consider the following:
Image resizing is a heavy process for the server. First, it is costly in itself; second, it involves hard-drive I/O operations, which ARE quite slow. So it all depends on how loaded your server is.
For the client it matters in several ways:
1) Thumbnails are smaller in file size and hence much faster to download, so they will appear sooner. That all depends on the speed of the Internet connection, which is increasing day by day. Have you seen how large images load? They are not displayed whole at once, but rather line by line.
2) If you display a large image at a small size, the quality will be much, much lower. That is because of how browsers process it: they do not have the capabilities of Photoshop and cannot do proper high-quality resizing.
3) Many large images on a single page will increase that page's memory usage. On less powerful computers that can cause terrible lag while scrolling or opening it.
As a solution to this question I tend toward what I have seen in one of the Drupal modules (ImageCache, if I remember right).
It does not create thumbnails at image-upload time. Instead it creates them at request time using .htaccess and mod_rewrite capabilities. If the requested file does not exist, the request is redirected to a small, lightweight PHP script which creates the thumbnail, writes it to the filesystem and then outputs it to the client.
So the next visitor will already get the previously created thumbnail.
Thus you implement postponed/lazy image resizing, which makes the load smoother (stretches it over time).
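A minimal sketch of that lazy scheme (not the actual Drupal imagecache code; the paths, the rewrite rule and the 85% quality are my own illustrative assumptions): a rewrite rule routes requests for missing thumbnails to a script that generates the file once and caches it.

```php
// Sketch of generate-on-first-request thumbnails (JPEG only, GD assumed).
// Assumed .htaccess rule routing missing thumbnails to this script:
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^thumbs/(\d+)x(\d+)/(.+\.jpg)$ thumb.php?w=$1&h=$2&src=$3 [L]

function make_thumbnail($srcPath, $cachePath, $maxW, $maxH)
{
    list($ow, $oh) = getimagesize($srcPath);
    $scale = min($maxW / $ow, $maxH / $oh);   // fit the box, keep aspect ratio
    $nw = (int) round($ow * $scale);
    $nh = (int) round($oh * $scale);

    $src = imagecreatefromjpeg($srcPath);
    $dst = imagecreatetruecolor($nw, $nh);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $nw, $nh, $ow, $oh);

    @mkdir(dirname($cachePath), 0755, true);
    imagejpeg($dst, $cachePath, 85);          // cached for the next visitor
    imagedestroy($src);
    imagedestroy($dst);
}

// Web entry point (skipped when this file is included from the CLI).
if (PHP_SAPI !== 'cli') {
    $w   = max(1, (int) ($_GET['w'] ?? 150));
    $h   = max(1, (int) ($_GET['h'] ?? 150));
    $src = basename($_GET['src'] ?? '');      // drop path components
    $srcPath   = __DIR__ . '/images/' . $src;
    $cachePath = __DIR__ . "/thumbs/{$w}x{$h}/" . $src;

    if (!is_file($srcPath)) { http_response_code(404); exit; }
    if (!is_file($cachePath)) { make_thumbnail($srcPath, $cachePath, $w, $h); }
    header('Content-Type: image/jpeg');
    readfile($cachePath);
}
```

Only the first visitor to request a given size pays the resize cost; everyone after hits the static file.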

I decided to test this first on a single image on a page and did some multiplication to anticipate multiple images. I should note:
I was using PNGs for this test, however PNGs seem pretty standard.
I was using a particular function which adaptively resizes the image so that the width and height always fit the original aspect ratio of the image I resized.
I took an image which was roughly 450x450 and resized it down, client side, to 200x200 (all measurements in this answer are in pixels). I found very little CPU jump from it, despite the resize being more than half of the image's total size.
The quality was also good across all modern browsers; Chrome, Opera, Firefox and IE all showed the image as clearly as if it had been done in Photoshop or GD.
Even on IE7 I didn't notice much of a CPU jump, and the quality was good provided I used a percentage resize based on the image's size constraints.
Overall, the additional storage, computation and coding required to maintain even this one size cache server side seemed to be at a disadvantage compared to the processing power available on the user's end.
That being said, if I were to start doing multiples of this type of resizing (say 20, as in my question) I would probably start to hit problems.
After some tweaking I found that anything under 1/3 of the original image's size, provided the image was under 1000px in width or height, seemed negligible in terms of CPU and general computer performance.
With the added function I was using, I got just as good quality resizing client side as I did server side. The specific function I used (for interested parties) was:
function pseudoResize($maxWidth = 0, $maxHeight = 0)
{
    $width  = $this->org_width;
    $height = $this->org_height;

    $maxWidth  = intval($maxWidth);
    $maxHeight = intval($maxHeight);

    $newWidth  = $width;
    $newHeight = $height;

    // Ripped from the phpthumb library in GdThumb.php under the resize() function
    if ($maxWidth > 0)
    {
        $newWidthPercentage = (100 * $maxWidth) / $width;
        $newHeight          = ($height * $newWidthPercentage) / 100;

        $newWidth  = intval($maxWidth);
        $newHeight = intval($newHeight);

        if ($maxHeight > 0 && $newHeight > $maxHeight)
        {
            $newHeightPercentage = (100 * $maxHeight) / $newHeight;

            $newWidth  = intval(($newWidth * $newHeightPercentage) / 100);
            $newHeight = ceil($maxHeight);
        }
    }

    if ($maxHeight > 0)
    {
        $newHeightPercentage = (100 * $maxHeight) / $height;

        $newWidth  = ($width * $newHeightPercentage) / 100;
        $newWidth  = ceil($newWidth);
        $newHeight = ceil($maxHeight);

        if ($maxWidth > 0 && $newWidth > $maxWidth)
        {
            $newWidthPercentage = (100 * $maxWidth) / $newWidth;

            $newHeight = intval(($newHeight * $newWidthPercentage) / 100);
            $newWidth  = intval($maxWidth);
        }
    }

    return array(
        'width'  => $newWidth,
        'height' => $newHeight
    );
}
So it does seem from my own testing that housing every size of every image you're going to use, i.e. as I asked in my question:
If yes, is there ever a time where server-side processing and caching might become overkill (i.e. housing 8 images of 110x220, 120x230, 150x190 etc)?
does seem to be overkill in modern computing; rather, you should go for close measurements if you intend to use many different sizes of many images.
I have found, however, that if you have a standard set of sizes and they are small, then the advantage actually lies with server-side resizing and storage of all sizes, since forcing the client to resize will always slow their computer down a little, while scaling to under 1/3 of the original size seems to make little difference.
So I believe the reason sites such as FB, Google and YouTube don't worry too much about storing exact measurements of all their images is that "close-to-measurement" scaling can be more performant overall.

Related

PHP - Compress image to pre-defined size

I'm working on an application that allows users to upload images. We are setting a maximum file size of 1MB. I know there are a few options out there for compressing JPEGs (and we are dealing with JPEGs here).
From what I've seen, all of these functions just allow you to define a compression ratio and scale it down. I'm wondering if there is a function that allows you to define a maximum file size and have it calculate the compression ratio necessary to meet that size.
If not, I was thinking my best approach would be a while loop that looks at the size and just keeps hitting the image with imagejpeg() in 10% increments until the file is below the pre-defined maximum size.
Am I on the right track here?
It depends on the data, but with images you can take small samples; downsampling would change the result. Here is an example: PHP - Compress Image to Meet File Size Limit.
I completed this task using the following code:
$quality = 90;
while (filesize($full_path) > 1048576 && $quality > 20) {
    $img = imagecreatefromjpeg($full_path);
    imagejpeg($img, $full_path, $quality);
    imagedestroy($img); // free the decoded image each pass
    $quality = $quality - 10;
    clearstatcache();
}
if (filesize($full_path) > 1048576) {
    echo "<p>File too large</p>";
    unlink($full_path);
    exit;
}
The $quality > 20 part of the statement keeps the script from reducing the quality to a point I would consider unreasonable. I'm sure there is more to be done here; I could add a resolution-resizing portion as well, but this works for my current needs, with a maximum file size of 1MB. If the file is still too large after the maximum quality reduction, the script returns a "file too large" error and deletes the image from the server.
Take note that clearstatcache() is very important here. Without it, the server caches the file size and will not notice the change.
This script only applies to JPEGs, but there are other PHP functions for GIFs, PNGs, etc.
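A variant worth sketching (my own addition, not from the answer above): decode the JPEG once and measure candidate sizes in memory with output buffering, which avoids both the repeated disk writes and the generation loss of re-reading the freshly recompressed file on every pass.

```php
// Return the highest quality (steps of 10, floor 20) at which the GD image
// encodes under $maxBytes, or null if even quality 20 is too large.
// The function name is illustrative; GD is assumed to be available.
function fit_jpeg_quality($img, $maxBytes)
{
    for ($quality = 90; $quality >= 20; $quality -= 10) {
        ob_start();
        imagejpeg($img, null, $quality);   // encode into the output buffer
        $bytes = strlen(ob_get_clean());   // measure without touching disk
        if ($bytes <= $maxBytes) {
            return $quality;
        }
    }
    return null;                           // caller decides: reject or resize
}
```

Usage would be: decode once with imagecreatefromjpeg(), call fit_jpeg_quality($img, 1048576), and only then write the file at the quality that was found (or reject the upload when null comes back).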

optimized way to scale image

I have the following function that runs through about 25 times and delays the site's load time by 10 seconds or more. What the code is essentially doing is working out the height when the image's width is scaled down or up to 310px. Any suggestions on how I could improve this code, or another option? Maybe jQuery might be better for this?
function img_height($image) {
    $inputwidth = 310;
    list($width, $height) = getimagesize($image);
    if ($width !== $inputwidth) {
        $outputheight = ($inputwidth * $height) / $width;
    } else {
        $outputheight = $height;
    }
    return 'style="height:' . $outputheight . 'px;" ';
}
@Enigmo - I have worked a lot with image loading and dynamically changing sizes. You really cannot make a big difference to the loading time using PHP. I would suggest using AJAX to preload the images, or doing lazy loading. That way your site loads first, and the images keep showing up as and when they are loaded.
I would suggest storing image sizes alongside the image names in some DB structure (a cache). Then you would already know all the sizes, and your site would work blazingly fast.
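A minimal sketch of that caching idea (the JSON-file store and function name are my own assumptions; a real DB table works the same way): look the dimensions up in the cache and only pay for getimagesize() on a miss.

```php
// Look up an image's dimensions in a JSON file cache, falling back to
// getimagesize() on a miss and persisting the result for next time.
function cached_image_size($image, $cacheFile = 'image_sizes.json')
{
    static $cache = null;
    if ($cache === null) {
        $cache = is_file($cacheFile)
            ? json_decode(file_get_contents($cacheFile), true)
            : array();
    }
    if (!isset($cache[$image])) {
        list($w, $h) = getimagesize($image);   // slow path, done once per image
        $cache[$image] = array($w, $h);
        file_put_contents($cacheFile, json_encode($cache));
    }
    return $cache[$image];
}
```

With 25 images per page this turns 25 file reads into 25 array lookups after the first render.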
You can simply use jQuery to do this, as it fires in the client's browser and puts no load or processing on the server side. jQuery is also faster here than PHP processing.

Resizing lots of images via PHP, keeping aspect and not using all my RAM?

I have a problem that I thought was easily solved, but is turning out a little more difficult than anticipated.
I am working on a site where someone can upload images for a product, and it resizes them twice (500x500 and 150x150) on the fly. I am trying to keep the aspect ratio; for example, a 600x500 image would resize to 500x417 and 150x125.
I have found lots of code that does this, such as this class, however I always run into memory issues. I think the class isn't unloading/destroying the temporary images after resizing.
I had the host upgrade my memory limit to 64M from 32, however I still run out of memory later. I'm stuck with PHP 5.2.13 as well, so no garbage collector for me.
My PHP is fairly new, so it's very alien to me how classes work; otherwise I would try adding the imagedestroy() call where required. I fiddled with the linked class for hours without success.
Can anyone either:
-Point me in the direction of a magic class that keeps aspect as well as unloading/destroying temporary images after resizing is complete, to keep RAM usage under control?
-Give me some (much needed!) guidance on where and how I could modify a class (such as the one posted) to destroy temporary images.
-Tell me if I'm going about this extremely wrong?
Oh, and I had the WideImage class working at one point, and it worked great, then suddenly stopped. I spent hours trying to fix it with no success.
My PHP install has GD, but not ImageMagick.
Thanks!
Well, I managed to solve my own problem.
I sat down, wiped all my resize code, and started from scratch, and wrote this wonderful little function to do what I needed.
function resizeimage($targetw, $targeth, $input, $savedest) {
    list($oldw, $oldh) = getimagesize($input);
    $imgratio = $oldw / $oldh;
    // Note: this fit logic assumes a square target box (e.g. 500x500, 150x150).
    if ($imgratio > 1) {            // landscape: constrain by width
        $new_width  = $targetw;
        $new_height = $targeth / $imgratio;
    } else {                        // portrait or square: constrain by height
        $new_height = $targeth;
        $new_width  = $targetw * $imgratio;
    }
    $imagetemp = imagecreatetruecolor($new_width, $new_height);
    $imageorig = imagecreatefromjpeg($input);
    imagecopyresampled($imagetemp, $imageorig, 0, 0, 0, 0, $new_width, $new_height, $oldw, $oldh);
    imagejpeg($imagetemp, $savedest, 95);
    // Destroy both GD resources so memory does not pile up across calls.
    imagedestroy($imageorig);
    imagedestroy($imagetemp);
}
Feed in target width and height, as well as the location of the input image (eg "uploads/images/testimage.jpg"), and where you want it saved ("uploads/images/resizedtestimage.jpg").
Hope this snippet helps someone in the future!
Here are some GD image-processing examples:
http://fdcl.svn.sourceforge.net/viewvc/fdcl/trunk/modules/imageprocessing/gd/module.inc?revision=212&view=markup
One approach is this technique:
1) Open the image and scale it to 500
2) imagedestroy($this->original_image)
3) Use the image scaled to 500 and scale it down to 150
4) imagedestroy($this->image500)
5) imagedestroy($this->image150)
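The steps above can be sketched as a standalone function (my own illustration; the function name, quality setting and round() choice are assumptions): scale once to 500, free the original immediately, then derive the 150 from the 500 so all three images are never in memory at once.

```php
// Two-pass downscale: 500px box first, then a 150px box from the 500px copy.
// Destroying each GD resource as soon as it is no longer needed keeps peak
// memory at roughly "original + one thumbnail" instead of all three at once.
function make_two_sizes($input, $dest500, $dest150)
{
    list($ow, $oh) = getimagesize($input);
    $orig = imagecreatefromjpeg($input);

    $s = min(500 / $ow, 500 / $oh);           // fit inside 500x500, keep aspect
    $w500 = (int) round($ow * $s);
    $h500 = (int) round($oh * $s);
    $img500 = imagecreatetruecolor($w500, $h500);
    imagecopyresampled($img500, $orig, 0, 0, 0, 0, $w500, $h500, $ow, $oh);
    imagedestroy($orig);                      // original no longer needed

    imagejpeg($img500, $dest500, 90);

    $s = min(150 / $w500, 150 / $h500);       // now derive the small size
    $w150 = (int) round($w500 * $s);
    $h150 = (int) round($h500 * $s);
    $img150 = imagecreatetruecolor($w150, $h150);
    imagecopyresampled($img150, $img500, 0, 0, 0, 0, $w150, $h150, $w500, $h500);
    imagedestroy($img500);

    imagejpeg($img150, $dest150, 90);
    imagedestroy($img150);
}
```

For the 600x500 example in the question this yields exactly the 500x417 and 150x125 outputs the asker wanted.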
The last idea is to use any available command-line tools to do the resizing for you.
I have done something similar but used 2 apps that can be run from command line on a Linux server.
ImageMagick Mogrify: http://www.imagemagick.org/www/mogrify.html
mogrify -resize 750 "image.jpg"
will resize to 750 pixels wide
jpegoptim: http://freecode.com/projects/jpegoptim
jpegoptim "image.jpg" --max=70 --strip-all
will optimise the JPEG to 70% quality and strip all unnecessary bloat from the image.
Both run from the CLI and offer a lot of command options; I'm sure you could use either or both without much difficulty.
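If you drive those tools from PHP, one hedged way to sketch it (assuming both binaries are on $PATH and the host permits exec()) is to build the command strings with escapeshellarg() and run them one after the other:

```php
// Build the mogrify and jpegoptim commands for one image.
// escapeshellarg() protects against shell metacharacters in the filename.
function cli_resize_commands($path, $width = 750, $maxQuality = 70)
{
    $arg = escapeshellarg($path);
    return array(
        "mogrify -resize " . (int) $width . " $arg",
        "jpegoptim --max=" . (int) $maxQuality . " --strip-all $arg",
    );
}

// Sketch of running them (exec() must be allowed by the host):
// foreach (cli_resize_commands('image.jpg') as $cmd) {
//     exec($cmd, $output, $rc);
//     if ($rc !== 0) { /* handle the failure */ }
// }
```

Keeping command construction separate from execution also makes the escaping easy to unit-test.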

Is there a way to retrieve or derive raw SoundCloud API waveform data?

I’m creating a web application that uses the SoundCloud API to stream tracks of artists. I know how I can get the waveform PNG image (http://w1.sndcdn.com/fxguEjG4ax6B_m.png for example), but I actually need some sort of wave-data (when in the song is it high and when is it low?).
I don’t have access to an audio library like LAME or something like that, because my web hosting doesn’t allow it. Is it possible to
Get this data directly from the SoundCloud API in some way.
Process the waveform PNG image in PHP or JavaScript to retrieve the needed data? (And is there maybe some sort of library available for this kind of processing?)
SoundCloud has started to provide the floating-point data, but it's not official yet. Just a little trick, when you have your PNG:
https://w1.sndcdn.com/XwA2iPEIVF8z_m.png
Change "w1" to "wis" and "png" to "json":
https://wis.sndcdn.com/XwA2iPEIVF8z_m.json
And you get it!
It's possible to parse the waveform PNG image to translate it to an array of points. The images are vertically symmetrical and to find the peaks you only need to inspect the alpha values to count how many opaque pixels it is from the top of the image. This is how the waveforms are rendered for the widget and on the Next SoundCloud.
In PHP, you could use ImageMagick or the GD Graphics Library to read these values, and in Javascript, it's possible by putting the image onto a canvas object and then inspecting the image data from there. I won't go too much into the details of these, but you could certainly ask another question if you get stuck.
While there is no official way to get the raw waveform data directly from a SoundCloud API request, there is a way to derive the exact same data SoundCloud reveals at the unofficial endpoint (e.g. https://wis.sndcdn.com/XwA2iPEIVF8z_m.json) in PHP, using code like this. Simply change the value of $image_file to match whatever 1800-wide-by-280-high SoundCloud PNG image you have and you are good to go:
$source_width  = 1800;
$source_height = 140;   // top half only; the bottom half mirrors it
$image_file = 'https://w1.sndcdn.com/XwA2iPEIVF8z_m.png';

$image_processed = imagecreatefrompng($image_file);
imagealphablending($image_processed, true);
imagesavealpha($image_processed, true);

$waveform_data = array();
for ($width = 0; $width < $source_width; $width++) {
    for ($height = 0; $height < $source_height; $height++) {
        $color_index = @imagecolorat($image_processed, $width, $height);
        // Determine the colors (and alpha) of the pixel.
        $rgb_array = imagecolorsforindex($image_processed, $color_index);
        // Peak detection is based on matching a fully transparent PNG value.
        $match_color_index = array(0, 0, 0, 127);
        $diff_value = array_diff($match_color_index, array_values($rgb_array));
        if (empty($diff_value)) {
            break;
        }
    } // $height loop.
    // Value is the delta between the full height and the detected height.
    $waveform_data[] = $source_height - $height;
} // $width loop.

// Dump the waveform data array to check the values.
echo '<pre>';
print_r($waveform_data);
echo '</pre>';
The benefit of this method is that while the https://wis.sndcdn.com/ URL is useful, there is no telling if or when SoundCloud might change the structure of the data coming from it. Deriving the data from the official waveform PNG offers some long-term stability, since they are not just going to change that PNG image without fair warning to SoundCloud API users.
Also note that while $source_width is 1800, $source_height is 140: the SoundCloud PNG file is 280 pixels high, but the bottom half is basically a flipped/mirrored copy of the top half, so measuring the values from 0 to 140 gives you all the necessary waveform data.
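If 1800 samples are more than your display needs, a small helper (my own addition, not part of the answer above) can average the array down to a fixed number of buckets:

```php
// Average an array of waveform samples down to $buckets values, e.g. to
// render 1800 PNG columns as a 100-bar widget.
function downsample_waveform(array $samples, $buckets)
{
    $out  = array();
    $size = count($samples) / $buckets;      // samples per bucket (may be fractional)
    for ($i = 0; $i < $buckets; $i++) {
        $slice = array_slice($samples, (int) ($i * $size), max(1, (int) ceil($size)));
        $out[] = array_sum($slice) / count($slice);
    }
    return $out;
}
```

Averaging rather than picking every Nth sample keeps short peaks from disappearing entirely at coarse resolutions.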
Sorry to bump an old thread, but just in case you are looking for something similar and stumble across this post: this is now possible as per this link: Waveforms, Let's Talk About Them.
It was published shortly after this thread - so again apologies for bumping an old one.

How to optimize images in PHP?

I have a website where users can save their profile image (avatar).
I'd like to apply some optimization to the image when they upload it.
I mean, it's an avatar image; it doesn't need full resolution or great size.
What can I do? I've been thinking:
Resize it
Lower the quality
Possibly:
Convert it to GIF
Color-fill transparent PNGs
Is there a library that is better (simpler) than GD for this?
Thanks a lot!
GD is how this is done. It sounds like a simple operation, but there are a number of factors that you really want to get right if you're going to do this. All in all, this winds up being several hundred lines of code to take care of everything.
My recommendation is that although you may wish to resize an image (which requires a lossy recompression if using JPEG), converting it to a GIF is a bad idea. You don't know what the source type is, so doing that is problematic.
Here's my recommended flow:
1) Resize the image to your output format. You can force a cropping aspect ratio here as well, if you want.
2) Determine original source mode:
8 bit indexed (GIF/PNG8): Save as PNG8 (format tends to be smaller than GIF).
16-24 bit: Save as JPG. Quality is up to you, but 70% is a good baseline.
32 bit (PNG24): Save as PNG24, taking care to maintain the transparency.
Note, this solution pretty much destroys any 'animated' gifs, but... that's what happens when you try to resize an animated gif.
Although... I also highly recommend NOT doing this as a single-stage process that removes the original files. That is the kind of thing that will only come back to bite you later.
Disk space is cheap these days... far better to store the original in a high quality format (even at 2K x 2K resolution), then create an image service which will serve the resolution/quality you need and cache the result.
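The format-decision step in the flow above can be sketched like this (a rough illustration, not the several-hundred-line production version the answer mentions; the function name and the pixel scan are my assumptions):

```php
// Pick an output format following the flow above: indexed sources -> PNG8,
// truecolor with real transparency -> PNG24, fully opaque truecolor -> JPEG.
function choose_output_format($img)
{
    if (!imageistruecolor($img)) {
        return 'png8';          // 8-bit indexed source (GIF/PNG8)
    }
    // Scan for any non-opaque pixel to detect a real alpha channel.
    $w = imagesx($img);
    $h = imagesy($img);
    for ($x = 0; $x < $w; $x++) {
        for ($y = 0; $y < $h; $y++) {
            $alpha = (imagecolorat($img, $x, $y) >> 24) & 0x7F;
            if ($alpha > 0) {
                return 'png24'; // keep the transparency
            }
        }
    }
    return 'jpeg';              // opaque truecolor: JPEG at ~70% quality
}
```

The caller then dispatches to imagepng()/imagejpeg() accordingly; on large images you would sample pixels rather than scan all of them.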
You could use the Asido imaging library for PHP to resize your images. This library makes use of GD though. Here is some example usage code.
The resizing and other imaging operations are preferably done after the uploading of new images (except if you want to save the higher resolution for some other purpose).
// This function will proportionally resize an image
function resizeImage($CurWidth, $CurHeight, $MaxSize, $DestFolder, $SrcImage, $Quality, $ImageType)
{
    // Check image size is not 0
    if ($CurWidth <= 0 || $CurHeight <= 0)
    {
        return false;
    }
    // Construct a proportional size for the new image
    $ImageScale = min($MaxSize / $CurWidth, $MaxSize / $CurHeight);
    $NewWidth  = ceil($ImageScale * $CurWidth);
    $NewHeight = ceil($ImageScale * $CurHeight);
    $NewCanvas = imagecreatetruecolor($NewWidth, $NewHeight);
    // Resize image
    if (imagecopyresampled($NewCanvas, $SrcImage, 0, 0, 0, 0, $NewWidth, $NewHeight, $CurWidth, $CurHeight))
    {
        switch (strtolower($ImageType))
        {
            case 'image/png':
                imagepng($NewCanvas, $DestFolder);
                break;
            case 'image/gif':
                imagegif($NewCanvas, $DestFolder);
                break;
            case 'image/jpeg':
            case 'image/pjpeg':
                imagejpeg($NewCanvas, $DestFolder, $Quality);
                break;
            default:
                return false;
        }
        // Destroy image, frees memory
        if (is_resource($NewCanvas)) { imagedestroy($NewCanvas); }
        return true;
    }
}
I'd pick some standard avatar image sizes you'll need for your page, like:
a medium size for a profile page, if you have one, and
a small size which appears next to the user's posts
(you get the idea; just what you need).
And when the user uploads a new avatar, you convert it to the formats you'll need with a reasonable quality setting. I'm assuming you're going for JPEGs, because this is a good catch-all format for this use case. PNGs do poorly with photographic content and JPEGs are not so great for drawings, but most avatars you see are photos. I wouldn't use GIFs any more these days; they are limited to 256 colors and have only a 1-bit alpha channel.
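That upload-time flow might look roughly like this (the size names, pixel boxes and the 85% quality are my assumptions, and JPEG input is assumed for brevity):

```php
// Generate the standard avatar sizes once, at upload time.
// $sizes maps a name to a square bounding box in pixels.
function make_avatar_sizes($srcPath, $destDir, array $sizes = array('medium' => 200, 'small' => 48))
{
    list($ow, $oh) = getimagesize($srcPath);
    $src = imagecreatefromjpeg($srcPath);
    foreach ($sizes as $name => $box) {
        $scale = min($box / $ow, $box / $oh);  // fit inside the box, keep aspect
        $nw = max(1, (int) round($ow * $scale));
        $nh = max(1, (int) round($oh * $scale));
        $dst = imagecreatetruecolor($nw, $nh);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $nw, $nh, $ow, $oh);
        imagejpeg($dst, "$destDir/avatar_$name.jpg", 85);
        imagedestroy($dst);
    }
    imagedestroy($src);
}
```

Each page then serves a pre-built file at exactly the size it needs; no per-request resizing.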
