How to get quality of image in PHP

I've been doing some speed optimization on my site using Page Speed and it gives recommendations like so:
Optimizing the following images could reduce their size by 35.3KiB (21% reduction).
Losslessly compressing http://example.com/.../some_image.jpg could save 25.3KiB (55% reduction).
How are the calculations done to get the file size reduction numbers? How can I determine if an image is optimized in PHP?
As I understand it, they seem to base this on the image quality (so saving at 60% in Photoshop or so is considered optimized?).
What I would like to do is once an image is uploaded, check if the image is fully optimized, if not, then optimize it using a PHP image library such as GD or ImageMagick. If I'm right about the number being based on quality, then I will just reduce the quality as needed.
How can I determine if an image is fully optimized in the standards that Page Speed uses?

Chances are they are simply using a standard compression level or some very simple rules to calculate image compression/quality. It isn't exactly what you were after, but what I often use on uploaded images and other dynamic content is a class called SimpleImage:
http://www.white-hat-web-design.co.uk/blog/resizing-images-with-php/
This will give you options to resize and adjust compression, and I think even change the image type (by which I mean .jpg to .png or .gif, whatever you like). I worked in SEO, and page optimization was a huge part of my job. I generally tried to make images exactly the size they needed to be, no smaller or bigger, compress JS & CSS, and that's really all most people need to worry about.
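If you'd rather not pull in the class, here's a bare-bones sketch of the quality part with plain GD; the paths and the quality value of 75 are placeholders, and it assumes a JPEG upload:

$src = imagecreatefromjpeg('/path/to/uploaded.jpg');      // illustrative path; assumes a JPEG
imagejpeg($src, '/path/to/uploaded-optimized.jpg', 75);   // 75 = quality; lower means smaller files
imagedestroy($src);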

It looks like you could use the PageSpeed Insights API to get the compression scores: https://developers.google.com/speed/docs/insights/v1/reference, though I'm guessing you'd want to run a quality check/compression locally rather than sending everything through this API.
It also looks like they've got a standalone image optimizer available at http://code.google.com/p/page-speed/wiki/DownloadPageSpeed?tm=2, though this appears to be a Windows binary. They've got an SDK there as well, though I haven't looked into what it entails, or how easy it would be to integrate into an existing site.
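For what it's worth, here is a minimal sketch of calling the newer (v5) PageSpeed Insights REST endpoint from PHP. The endpoint, the key parameter, and the "uses-optimized-images" audit id come from Google's current docs rather than the v1 reference linked above, so double-check against whichever version you actually use:

$endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed'
          . '?url=' . urlencode('http://example.com/') . '&key=YOUR_API_KEY';
$report = json_decode(file_get_contents($endpoint), true);
// image-related opportunities live under lighthouseResult.audits in the v5 response,
// e.g. the "uses-optimized-images" audit (audit ids can change between versions)
print_r($report['lighthouseResult']['audits']['uses-optimized-images'] ?? null);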

Related

converting image to all sizes on upload vs resizing on request via php

So I have a platform for users which allows them to upload a fair amount of pictures. At the moment, I have my server resizing and saving all images individually to my CDN (so I can pick the best option in order to reduce load time when a user requests to view it), but it seems very wasteful in regards to server storage.
The images are being converted into resolutions of 1200px, 500px, 140px, 40px and 24px.
What I'm wondering is, would it be more efficient to just save the file at 1200px, then serve it via PHP at the requested size using something like ImageMagick? Would there be any major trade-offs and if so, is it worth it?
What I'm doing right now:
https://v1x-3.hbcdn.net/user/filename-500x500.jpg
An example of what I could do:
https://v1x-3.hbcdn.net/image.php?type=user&file=filename&resolution=500
Cheers.
No it's not, because:
you have a small number of sizes
if you don't use caching (generating the image only on the first request), you can DDoS yourself (image processing is a CPU-heavy operation)
you'll have to do extra work if you use a CDN like Cloudflare for HTTP caching
It makes sense if you have a lot of image sizes, for example an API that supports many Android/iOS devices, where an iPhone 3 only needs a 320x320 image; if you have no users with such a device, your server never creates that size.
Advice:
During image generation, apply optimization: it reduces image size with an imperceptible loss of quality.
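For completeness, here is a minimal sketch of the pre-generation approach being discussed, using GD; the output path pattern and the quality value of 82 are just illustrative assumptions, and it assumes a JPEG source:

$sizes = [1200, 500, 140, 40, 24];                               // the resolutions mentioned in the question
$src   = imagecreatefromjpeg($uploadedPath);                     // $uploadedPath = the uploaded original
foreach ($sizes as $width) {
    $resized = imagescale($src, $width);                         // height is scaled proportionally
    imagejpeg($resized, "/cdn/user/filename-{$width}.jpg", 82);  // 82 = quality, tune as needed
    imagedestroy($resized);
}
imagedestroy($src);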

Optimize image on page load

Google Page Speed Insights suggests me to optimize webpage images on a webpage I'm currently working on. Images are uploaded from a server. I want to display optimized images on the page but don't want the original image on the server to change. Is there any way to do this in PHP?
You should reduce the image size while uploading, for example with something like the following when handling the upload.
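The original snippet did not make it into the post, so here is a hedged stand-in showing the idea with GD; the 'image' form field, the uploads path, the 1200px cap and the quality value of 70 are assumptions, and real code should validate the upload's type and name:

if (!empty($_FILES['image']['tmp_name'])) {                       // 'image' = assumed form field name
    $img = imagecreatefromjpeg($_FILES['image']['tmp_name']);     // assumes a JPEG; validate type/name in real code
    if (imagesx($img) > 1200) {
        $img = imagescale($img, 1200);                            // cap the width at 1200px (arbitrary choice)
    }
    imagejpeg($img, __DIR__ . '/uploads/' . basename($_FILES['image']['name']), 70); // 70 = quality
    imagedestroy($img);
}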
If you don't want the original image to change, then you'll find any optimisations you perform to bring the file size down / serve a more appropriately sized image will be redundant due to the overhead of "optimising" the image on the fly (unless your original image really is that big).
As I see it, you have 3 options:
Unless it has a massive impact (or you can envisage it having a massive impact), just ignore Google PageSpeed for now.
You can use lossless compression which will reduce the file size without reducing the quality of the image. This is something you can do on your server with various different apps (just google what type of server you have followed by lossless image compression)
Just create a copy of the original image upon upload (or whenever) and serve that copy to your users. The benefit is that you can have differently sized images rendered for different devices, e.g. a small image for mobile, and again you can use lossless compression on all of them. The downside is that you'll obviously be using more server space. (A sketch combining this with option 2 follows below.)
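As a rough sketch of options 2 and 3 combined (copy the original, then losslessly compress the copy), assuming the jpegoptim CLI is installed on the server and with illustrative paths:

$original = '/uploads/photo.jpg';                                 // untouched original (illustrative path)
$copy     = '/public/img/photo.jpg';                              // the copy that actually gets served
copy($original, $copy);
// jpegoptim must be installed on the server; --strip-all drops metadata,
// otherwise the recompression is lossless
exec('jpegoptim --strip-all ' . escapeshellarg($copy));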
Hope this helps!

How do I compress only images that haven't been compressed already, using just PHP?

I'm currently trying to speed up the websites we develop. The part I'm working on now is to optimise the images so that they are as small (filesize, not dimensions) as possible without losing quality.
Our customers can upload their own images to the website through our custom CMS, but images aren't being compressed or optimised at all. My superior explained this is because the customers can upload their own images, and these images could be optimised beforehand through Photoshop or tools like it. If you optimise already optimised images, the quality would get worse. ...right?
We're trying to find a solution that won't require us to install a module or anything. We already use imagejpeg(), imagepng() and imagegif(), but we don't use the $quality parameter because of the reasons previously explained. I've seen some snippets, but they all use imagejpeg() and the like.
That being said, is there a sure-fire way of optimising images without the risk of optimising previously optimised images? Or would it be no problem at all to use imagejpeg(), imagepng() and imagegif(), even if it would mean optimising already optimised images?
Thank you!
"If you optimise already optimised images, the quality would get worse. "
Not if you use a lossless method.
I don't know of a method directly in PHP, but if you are on a Linux server you can use jpegtran or jpegoptim (with --strip-all) for JPEG, and OptiPNG or PNGOUT for PNG.
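From PHP you can shell out to those tools; a hedged sketch with illustrative paths (jpegtran and OptiPNG must be installed):

$in  = '/path/to/photo.jpg';
$out = '/path/to/photo-optimized.jpg';
$png = '/path/to/image.png';
// lossless JPEG recompression: -copy none drops metadata, -optimize rebuilds the Huffman tables
exec('jpegtran -copy none -optimize -outfile ' . escapeshellarg($out) . ' ' . escapeshellarg($in));
// lossless PNG recompression
exec('optipng -o2 ' . escapeshellarg($png));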
Going from your title, I am going to assume you mean compression.
So, let's say a normal JPG of 800x600 is uploaded by one of your customers.
The customer's JPG is 272kb because it has full details and everything.
You need to set thresholds for what file size is acceptable at given dimensions.
Like:
if ($dimensions->equals(800, 600) && file_type($image) == 'jpg' && file_size($image) > 68 * 1024) {
    schedule_for_compression($image);
}
That way you set up parameters for what is acceptable as an upper limit of file size. If the dimensions match and the file size is bigger, then it's not optimised.
But without knowing more details what exactly is understood about optimising, this is the only thing I can think of.
If you only have a low number of images to compress, you might find an external service such as https://tinypng.com/developers of assistance.
I've used their online tools to reduce the file size of both JPG and PNG files manually, and they appear to offer a free API for the first 500 images per month.
Apologies if this would be better as a comment than an answer, I'm fairly new to stackoverflow and haven't got enough points yet, but felt this may be a handy alternative solution.
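For reference, a rough sketch of calling their HTTP API from PHP with curl; the endpoint and the basic-auth scheme are as described in their developer docs, but treat the details as something to verify there rather than as gospel:

$ch = curl_init('https://api.tinify.com/shrink');
curl_setopt_array($ch, [
    CURLOPT_USERPWD        => 'api:YOUR_API_KEY',                      // HTTP basic auth: user "api" + your key
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => file_get_contents('/path/to/image.png'), // raw image bytes as the request body
    CURLOPT_RETURNTRANSFER => true,
]);
$result = json_decode(curl_exec($ch), true);
curl_close($ch);
// the compressed file is served from a temporary URL in the response
file_put_contents('/path/to/image-compressed.png', file_get_contents($result['output']['url']));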
Saving a JPEG with the same or higher quality setting will not result in a noticeable loss in quality. Just re-save with your desired quality setting. If the file ends up larger, just discard it and keep the original. Remove metadata using jpegtran or jpegoptim before you optimize so it doesn't affect the file size when you compare to the original.
PNG and GIF won't lose any quality unless you reduce the number of colors. Just use one of the optimizers Gyncoca mentioned.
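A small sketch of the "keep whichever file is smaller" idea with GD; the quality value of 85 and the paths are just examples:

$original  = '/path/to/photo.jpg';
$candidate = $original . '.tmp';
$img = imagecreatefromjpeg($original);
imagejpeg($img, $candidate, 85);              // re-save at the desired quality (85 is just an example)
imagedestroy($img);
clearstatcache();
if (filesize($candidate) < filesize($original)) {
    rename($candidate, $original);            // the re-encode is smaller: keep it
} else {
    unlink($candidate);                       // it got bigger: discard and keep the original
}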

How to avoid Optimizing images that are already optimized with PHP?

I am currently working on a PHP application which is run from the command line to optimize a folder of images.
The PHP application is more of a wrapper for other Image Optimizer's and it simply iterates the directory and grabs all the images, it then runs the Image through the appropriate program to get the best result.
Below are the Programs that I will be using and what each will be used for...
imagemagick to determine file type and convert non-animated gif's to png
gifsicle to optimize Animated Gif images
jpegtran to optimize jpg images
pngcrush to optimize png images
pngquant to optimize png images to png8 format
pngout to optimize png images to png8 format
My problem: with 1-10 images, everything runs smoothly and fairly fast; however, once I run it on a larger folder with 10 or more images, it becomes really slow. I do not really see a good solution around this, but one thing that would help is to avoid re-processing images that have already been optimized. So if I have a folder with 100 images, I optimize that folder, then add 5 new images and re-run the optimizer, it then has to optimize 105 images. My goal is to have it only optimize the 5 newer images, since the previous 100 would have already been optimized. This alone would greatly improve performance when new images are added to the image folder.
I realize the simple solution would be to simply copy or move the images to a new folder after processing them, my problem with that simple solution is that these images are used for the web and websites, so the images are generally hard-linked into a websites source code and changing the path to the images would complicate that and possibly break it sometimes.
Some ideas I have had: write some kind of text-file database in the image folders that lists all the images that have already been processed, so when the application is run it only touches images not already in that file. Another idea was to change the file name to carry some kind of identifier showing it has been optimized; a third idea is to move each optimized file to a final destination folder once it is optimized. Ideas 2 and 3 are not good, though, because they would break all image path links in the websites' source code.
So if you can think of a decent/good solution to this problem, please share.
Meta data
You could put a flag in the meta info of each image after it is optimized. First check for that flag and only proceed if it's not there. You can use exif_read_data() to read the data; writing it could look something like the sketch below.
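A hedged sketch of that flag idea: reading uses the exif_read_data() built-in mentioned above, while writing is delegated here to the exiftool CLI (an assumption on my part; it must be installed), and the marker string is arbitrary:

function isFlaggedAsOptimized(string $path): bool
{
    $exif = @exif_read_data($path);            // JPEG comment (COM) segments appear under 'COMMENT'
    return $exif !== false
        && !empty($exif['COMMENT'])
        && in_array('optimized-by-our-tool', (array) $exif['COMMENT'], true);
}

function flagAsOptimized(string $path): void
{
    // exiftool must be installed; -overwrite_original avoids leaving a backup copy behind
    exec('exiftool -overwrite_original -Comment=optimized-by-our-tool ' . escapeshellarg($path));
}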
The above is for JPGs. Metadata for PNGs is also possible; take a look at this question, and this one.
I'm not sure about GIFs, but you could definitely convert them to PNGs and then add metadata... although I'm pretty sure they have their own meta info, since metadata extraction tools handle GIFs.
Database Support
Another solution would be to store information about the images in a MySQL database. This way, as you tweak your optimizations you could keep track of when and which optimization was tried on which image. You could pick which images to optimize according to any parameters of your choosing. You could build an admin panel for this. This method would allow easy experimentation.
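A minimal sketch of such a tracking table; the optimized_images table, its columns, and the PDO connection are assumptions for illustration:

// CREATE TABLE optimized_images (path VARCHAR(255) PRIMARY KEY, optimized_at DATETIME, tool VARCHAR(50));
function alreadyOptimized(PDO $db, string $path): bool
{
    $stmt = $db->prepare('SELECT 1 FROM optimized_images WHERE path = ?');
    $stmt->execute([$path]);
    return (bool) $stmt->fetchColumn();
}

function markOptimized(PDO $db, string $path, string $tool): void
{
    $stmt = $db->prepare('REPLACE INTO optimized_images (path, optimized_at, tool) VALUES (?, NOW(), ?)');
    $stmt->execute([$path, $tool]);
}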
You could also combine the above two methods.
Maximum File Size
Since this is for saving space, you could have the program only work on images that are larger than a certain file size. Ideally, after running the compressor once, all the images would be below this file size, and after that only newly added images that are too big would be touched. I don't know how practical this is in terms of implementation, since it would require that the compressor gets every image below some arbitrary file size. You could make the maximum file size dependent on image dimensions.
The easiest way would most likely be to look at the time of the last change for each image. If an image was changed after the last run of your script, you have to run it on this particular image.
The timestamp of when the script was last run could easily be saved in a short text file.
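A small sketch of that approach; the marker-file name, the image path pattern, and the optimize() wrapper are placeholders:

$stampFile = __DIR__ . '/.last-optimizer-run';                     // hypothetical marker file
$lastRun   = is_file($stampFile) ? (int) file_get_contents($stampFile) : 0;
foreach (glob('/path/to/images/*.{jpg,png,gif}', GLOB_BRACE) as $image) {
    if (filemtime($image) > $lastRun) {
        optimize($image);                                          // your optimizer wrapper
    }
}
file_put_contents($stampFile, (string) time());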
A thought that comes to my head is to mix the simple solution with a more complicated one. When you optimize an image, move it to a separate folder. When an access is made to the original image folder, have your .htaccess file capture those requests and route them to a script which checks whether the same image exists in the optimized folder; if not, optimize, move, then proceed.
I know I said simple solution, and this is a slightly complicated one, but the nice part is that it provides a scalable approach to your issue.
Edit: One more thing
I like the idea of a MySQL database because you can add a level of security (not all images can be viewed by everyone), if that's a need of course. It also makes your links problem (the hard-coded one) less of a problem, since all links point at a single script which retrieves the images from the DB, and the only things that change are the generated GET variables. This way your project becomes significantly more scalable and easier to redesign.
Sorry this is late, but since there is a way to address this issue without creating any files, storing any data of any kind, or keeping track of anything, I thought I'd share my solution for how I address things like this.
Goal
Set up an idempotent solution that efficiently optimizes images without dependencies that require keeping track of its current status.
Why
This allows for a truly portable solution that can work in a new environment, an environment that somehow lost its tracker, or an environment that is sensitive as to what files you can actually save in there.
Diagnose
Although metadata might be the first source you'd think to check for this information, it's true that in some cases it will not be available and the nature of metadata itself is arbitrary, like comments, they can come and go and not affect the image in any way. We want something more concrete, something that is a definite descriptor of the asset at hand. Ideally you would want to "identify" if one has been optimized or not, and the way to do that is to review the image to see if it has been based on its characteristics.
Strategy
When you optimize an image, you are providing different options of all sorts in order to reach the final state of optimization. These are the very traits you will also check to come to the conclusion of whether or not it had been in fact optimized.
Example
Let's say we have a function in our script called optimize($path = ''), and let's assume that part of our optimization does the following:
$ convert /path/to/image.jpg -depth 8 -quality 87 -colors 255 -colorspace sRGB ...
Note that these options are ones that you choose to specify, they will be applied to the image and are properties that can be reviewed later...
$ identify -verbose /path/to/image.jpg
Image: /path/to/image.jpg
Format: JPEG (Joint Photographic Experts Group JFIF format)
Mime type: image/jpeg
Geometry: 1250x703+0+0
Colorspace: sRGB <<<<<<
Depth: 8-bit <<<<<<
Channel depth:
Red: 8-bit
Green: 8-bit
Blue: 8-bit
Channel statistics:
Pixels: 878750
Red:
...
Green:
...
Blue:
...
Image statistics:
Overall:
...
Rendering intent: Perceptual
Gamma: 0.454545
Transparent color: none
Interlace: JPEG
Compose: Over
Page geometry: 1250x703+0+0
Dispose: Undefined
Iterations: 0
Compression: JPEG
Quality: 87 <<<<<<
Properties:
...
Artifacts:
...
Number pixels: 878750
As you can see here, the output quite literally has everything I would want to know to determine whether or not I should optimize this image or not, and it costs nothing in terms of a performance hit.
Conclusion
When you are iterating through a list of files in a folder, you can do so as many times as you like without worrying about over-optimizing the images or keeping track of anything. You would simply filter the list down to the extensions you want to optimize (e.g. .jpg, .png, .gif), then check each file's stats to see if it already possesses the attributes your function would apply to it in the first place. If it has the same values, skip; if not, optimize.
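As a concrete (but hedged) sketch of that check for JPEGs: ImageMagick's identify can report the quality an image was saved at via the %Q format escape, so a loop can skip files that already match the target; optimize() is again the hypothetical wrapper from the example above:

$target = 87;                                                      // the quality your optimize() pass applies
foreach (glob('/path/to/images/*.jpg') as $image) {
    $quality = (int) trim((string) shell_exec('identify -format "%Q" ' . escapeshellarg($image)));
    if ($quality > 0 && $quality <= $target) {
        continue;                                                  // already at (or below) the target quality: skip
    }
    optimize($image);                                              // only re-process what still needs it
}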
Advanced
If you want to get extremely efficient, you would check each attribute of the image that you plan on optimizing and in your optimization execution you would only apply the options that have not been applied to the command.
Note
This technique is obviously meant to show an example of how you can accurately determine whether or not an image needs to be optimized. The actual options I have listed above are not the complete scope of elements that can be chosen. There are a variety of available options to choose from, and you can apply and check for as many as you want.

Website image compression

I am currently working on a website which needs images to be uploaded. I have seen some standard image uploading tools which work well, but I am looking for one that compresses on the client side before the upload has been done.
I have looked at a few, but none seem to suit. Can anyone recommend a web-based client-side image compression tool?
Thanks
~ Kyle
Take a look at http://www.plupload.com/. It will resize images and chunk them so your server can handle them. I don't think it will actually compress them but this might solve your problem another way.
I am assuming you are talking mainly about JPG images.
SWFUpload supports client side image resizing. (i.e. reducing the number of Pixels, see Demo here.)
Other than that, compressing the image file (as in, compressing the file's binary data using a zip algorithm) is not going to do you much good: JPG is extremely tough to compress further and usually yields only 2-3 percent in savings.
One image compression tool I will recommend is https://optimizejpeg.com/. It is a very powerful online tool that I am currently using for my website. You can compress 50 images in just a few seconds, and the results are saved in a zip folder. You don't need a plugin for it at all, which helps speed up your website without the extra load that plugins create. With this tool you can upload JPG/PNG/GIF files or even a PPT file. It can reduce an image drastically, by 70-80%, with what it claims is no loss: colors may be adjusted slightly here and there within the image's existing palette, but no one can see or identify the changes, so the look of the image stays the same.
So do enjoy using this tool and share your experiences.
