I'm wondering how to figure out the best compression rate (small file size + no visible quality loss) automatically.
At the moment I'm using imagejpeg() with $quality = 85 for each .jpg.
PageSpeed (the Chrome plugin) suggests lowering the quality of a few images to save some kilobytes; the suggested percentage of reduction differs from image to image.
I'd like to write a cronjob that crawls a specific directory and optimizes every image.
How do PageSpeed or TinyPNG figure out the optimal quality, and is this possible with PHP or another server-side language?
TinyPNG uses pngquant.
pngquant has an option to set the desired quality, similar to JPEG (its --quality flag takes a min-max range). You can run something like:
<?php system('pngquant --quality=65-85 image.png'); ?>
The pngquant website has example code showing how to use pngquant from PHP.
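A slightly fuller sketch of that shell-out, with shell escaping and an exit-status check added (the file name is a placeholder; per the pngquant documentation, exit status 99 means the quality target could not be met, in which case no output is written):
<?php
$file = 'image.png'; // placeholder path
// --force --ext .png overwrites the original file in place.
exec('pngquant --quality=65-85 --force --ext .png ' . escapeshellarg($file), $out, $status);
if ($status === 99) {
    echo "Minimum quality not reachable; original kept.\n";
} elseif ($status !== 0) {
    echo "pngquant failed with status $status\n";
}
?>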
For JPEG you can apply the lossless jpegcrush.
JpegMini (commercial) and jpeg-archive (free) are lossy and can automatically find a minimal good quality for a JPEG.
In PHP you can roughly estimate how much a JPEG has already been compressed by observing how much the file size changes after re-compression. The file size of a JPEG recompressed at the same or a higher quality will not change much (but it will still lose visual quality).
If you recompress a JPEG and the file size halves, keep the recompressed version. If you see only a 10-20% drop in file size, keep the original.
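A minimal GD sketch of that rule; the quality of 85 and the 30% threshold are arbitrary choices, and the file names are placeholders:
<?php
$src = 'photo.jpg';               // placeholder input
$tmp = $src . '.tmp.jpg';
$img = imagecreatefromjpeg($src);
imagejpeg($img, $tmp, 85);        // re-encode at quality 85
imagedestroy($img);
// Keep the recompressed file only if the size drop is substantial.
if (filesize($tmp) < filesize($src) * 0.7) {
    rename($tmp, $src);           // big drop: original was poorly compressed
} else {
    unlink($tmp);                 // small drop: original was already tight
}
?>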
If you're compressing yourself, use MozJPEG (here's an online version).
Related
I want to convert every uploaded image to PNG. I think GD is a good approach for that, because it drops the images' metadata and does not try to parse it. I read that ImageMagick may have security vulnerabilities on some Linux servers...
I have 2 questions:
Does GD drop the PNG metadata too if the original file was PNG, or should I use pngcrush after converting?
Do I get the same quality loss as when saving JPEG files, or is the PNG format much better?
If you open the PNG in PHP, copy it to a new resource, and then save it at full quality (as PNG; saving as JPEG will lose quality), it will do what you want.
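A minimal GD sketch of that copy-and-resave, assuming a PNG with an alpha channel and placeholder file names; note that the third argument to imagepng() is a lossless zlib compression level (0-9), not a quality setting:
<?php
$src = imagecreatefrompng('upload.png');   // placeholder file name
$w = imagesx($src);
$h = imagesy($src);
$dst = imagecreatetruecolor($w, $h);
imagealphablending($dst, false);  // don't blend onto a background,
imagesavealpha($dst, true);       // keep the alpha channel instead
imagecopy($dst, $src, 0, 0, 0, 0, $w, $h);
imagepng($dst, 'clean.png', 9);   // 9 = maximum compression, still lossless
imagedestroy($src);
imagedestroy($dst);
?>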
I use the Imagine library and I would like to know whether it is possible to compress images with this library.
Another question:
Do you think it's a good solution to compress a lot of images (around 30) with the library rather than with a command-line tool?
You can save the images at a lower quality to compress them:
To save at 50% quality:
$imagine->open('/path/to/image.jpg')->save('/path/to/image.jpg', array('quality' => 50));
As for whether or not to do the compression from the command line: you can, but I don't recommend it. Image manipulation takes a lot of CPU and RAM, so I suggest you download the images and manipulate them on your own computer (not on the production server), or do it in a PHP script but limit the number of images compressed per run.
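A sketch of that batched approach using Imagine; the GD backend, the glob pattern, and the batch size of 10 are all assumptions:
<?php
use Imagine\Gd\Imagine;

require 'vendor/autoload.php';    // assumes Imagine was installed via Composer

$imagine = new Imagine();
$paths = glob('/path/to/images/*.jpg');    // placeholder directory
// Only process a handful of images per invocation to bound CPU/RAM use.
foreach (array_slice($paths, 0, 10) as $path) {
    $imagine->open($path)->save($path, array('quality' => 50));
}
?>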
I process uploaded images in PHP and save them (after resizing) with imagejpeg(). As far as I have explored, imagejpeg() is the best PHP function for compressing JPEG images to reduce their file size. However, when I check my website with Google PageSpeed, it says all of my images can be compressed by 4-10%.
What is the common method to compress images to meet the Google standard?
Googles "standard" is the possible maximum to be expected by google. You need to use highly optimized image compressors that does nothing else than image compressing and therefore get a possible maximum best value here.
You can for example, open you jpeg file in a image editor like adobe photoshop and create the possible maximum best compression to be expected by you while having visual control. Highly recommended.
The GD library provides a standard conform jpeg compression which should match a library users expectation, but which might not satisfy a graphic designer (and/or google by 4-10%).
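Much of the 4-10% PageSpeed reports can often be recovered losslessly by stripping metadata and optimizing the Huffman tables. A sketch shelling out to jpegtran, assuming it is installed (the file names are placeholders):
<?php
$in  = 'photo.jpg';       // placeholder input
$out = 'photo.opt.jpg';   // placeholder output
// -optimize recomputes the Huffman tables, -progressive usually shrinks
// large photos further, -copy none drops EXIF and other metadata.
// All of these steps are lossless.
system('jpegtran -optimize -progressive -copy none -outfile '
     . escapeshellarg($out) . ' ' . escapeshellarg($in));
?>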
I have a resize script I made in PHP that uses GD (my VPS doesn't have ImageMagick installed) to resize an image. I have recently started getting memory errors, so I increased memory_limit to 50MB, but I still get memory errors.
The image I am trying to resize is only 2MB. Is this normal for PHP image work? Something sounds a bit wrong to me.
To resize the image, GD has to work on the uncompressed image, which is significantly larger than 2MB. It needs to store the entire image data (pixels × bit depth) in memory, plus quite a bit more for the actual work.
50 megabytes is not much for working with images. For example, Drupal warns you if you have less than a 96MB memory limit when image resizing and similar features are enabled. For reasonably sized images 64MB is enough in my experience, but if you feed in full-size images from a digital camera you'll run into problems with that limit.
On my shared/cloud hosting (£2.70/month) I have not seen any warnings or errors when resizing images since I set the limit to 200MB (sometimes users need to upload very large images). As Fabian said, I guess 50MB is too low.
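To make that concrete: a 2MB JPEG can easily hold more than 10 megapixels, and uncompressed that is width * height * channels bytes, i.e. tens of megabytes, before GD's own overhead. A sketch that estimates the requirement before loading the image; the 1.7 overhead factor is a common rule of thumb, not an exact figure:
<?php
$file = 'upload.jpg';     // placeholder
$info = getimagesize($file);
$channels = isset($info['channels']) ? $info['channels'] : 4;
$bits     = isset($info['bits']) ? $info['bits'] : 8;
// Uncompressed pixel data plus ~70% overhead for GD's internal structures.
$needed = $info[0] * $info[1] * $channels * ($bits / 8) * 1.7;
echo 'Roughly ' . round($needed / 1048576) . "MB needed to open this image\n";
?>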
I'm wondering how to compress a PNG image correctly.
The situation is this:
I have a PNG image compressed and color-reduced with IrfanView on Windows. It's about 20KB.
When my portal software resizes it (using MagickWand 1.0.7) with default values, the result is about 63KB (!).
My next try was to call MagickSetImageDepth($this->_imageHandler, 8), resulting in a file size of 34KB, which is better, but still bigger than the original file (which has larger dimensions).
None of the documented functions seems suited to compressing the image further.
Any hint would be greatly appreciated!
Greetz,
Sosa
PNG compression programs and routines use different techniques. I've found that, many times, an image that has already been compressed (or saved efficiently) cannot be compressed further, or even ends up with a higher file size, as you are experiencing.
In your case I'd say your images cannot be compressed further, at least using MagickWand. You might just want to leave out that step.
Perhaps optimizing your PNGs before runtime would be a solution. There are many options available in this case. I've had luck with PNGGauntlet. You can run a batch job on PNGGauntlet and it will skip over the files that it would've made larger, if any.
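If you would rather do the same server-side (for example from a cron job), here is a sketch using optipng, assuming it is installed; by default optipng keeps the original file when it cannot produce a smaller one:
<?php
// Optimize every PNG in a directory in place; optipng skips files
// it cannot make smaller.
foreach (glob('/path/to/pngs/*.png') as $png) {  // placeholder directory
    system('optipng -o5 ' . escapeshellarg($png));
}
?>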
Try this tool by Yahoo - it's great!
http://developer.yahoo.com/yslow/smushit/