I am currently working on a website that needs images to be uploaded. I have seen some standard image-uploading tools that work well, but I am looking for one that compresses on the client side before the upload happens.
I have looked at a few, but none seem to suit. Can anyone recommend a web-based client-side image compression tool?
Thanks
~ Kyle
Take a look at http://www.plupload.com/. It will resize images and chunk them so your server can handle them. I don't think it will actually compress them, but this might solve your problem another way.
I am assuming you are talking mainly about JPG images.
SWFUpload supports client-side image resizing (i.e. reducing the number of pixels; see the demo here).
Other than that, compressing the image file (as in, compressing the file's binary data using a zip algorithm) is not going to do you much good: JPG is extremely tough to compress and usually yields only 2-3 percent savings.
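You can verify that yourself with a quick toy check (a sketch; the exact savings depend on the file):
$data = file_get_contents('photo.jpg');   // any typical JPEG
$zipped = gzcompress($data, 9);           // maximum zlib effort
printf("original: %d bytes, zipped: %d bytes (%.1f%% saved)\n",
    strlen($data), strlen($zipped),
    100 * (1 - strlen($zipped) / strlen($data)));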
One image compression tool I will recommend is https://optimizejpeg.com/. It is a very powerful online tool, and I am currently using it for my website. You can compress 50 images in just a few seconds, and the results get saved in a zip folder. You don't need a plugin at all, which helps speed up your website without any hassle, since as we all know plugins create excess load on a website. With the help of this tool you can upload any JPG/PNG/GIF or even a PPT file. It can compress an image drastically, reducing it by 70-80% with no visible losses: colors are adjusted here and there using the existing colors of the image, and to your surprise no one can view or identify the changes, so the look of the image stays the same.
So do enjoy using this tool and share your experiences.
Google Page Speed Insights suggests I optimize the images on a webpage I'm currently working on. The images are uploaded from a server. I want to display optimized images on the page but don't want the original images on the server to change. Is there any way to do this in PHP?
You should reduce the image size while uploading: recompress the upload into a copy and serve that, leaving the original untouched.
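A minimal sketch of that idea (hypothetical paths, and an assumed JPEG quality of 75; the original upload is kept untouched alongside the optimized copy):
// Keep the original; write a recompressed copy to serve on the page
move_uploaded_file($_FILES['image']['tmp_name'], 'uploads/originals/photo.jpg');
$img = imagecreatefromjpeg('uploads/originals/photo.jpg');
imagejpeg($img, 'uploads/optimized/photo.jpg', 75); // 75 = JPEG quality (0-100)
imagedestroy($img);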
If you don't want the original image to change, then you'll find any optimisations you perform to bring the file size down / serve a more appropriately sized image will be redundant, due to the overhead of "optimising" the image on the fly (unless your original image is just that big).
As I see it, you have 3 options:
Unless it has a massive impact (or you can envisage it will have a massive impact), just ignore Google Page Speed for now.
You can use lossless compression, which will reduce the file size without reducing the quality of the image. This is something you can do on your server with various apps (just google your server type followed by "lossless image compression").
Just create a copy of the original image upon upload (or whenever) and serve that copy to your users. The benefit is that you can have different-sized images rendered for different devices, e.g. a small image for mobile, and again you can use lossless compression on all of them (see the sketch below). The downside is you'll obviously be using more server space.
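A sketch of that last option (hypothetical sizes and paths; imagescale() keeps the aspect ratio when you pass only a width):
// Create device-specific copies of the uploaded image, keeping the original
foreach ([320 => 'mobile', 1024 => 'desktop'] as $width => $label) {
    $src = imagecreatefromjpeg('uploads/original.jpg');
    $scaled = imagescale($src, $width); // height is computed automatically
    imagejpeg($scaled, "uploads/{$label}.jpg", 80);
    imagedestroy($scaled);
    imagedestroy($src);
}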
Hope this helps!
I'm currently trying to speed up the websites we develop. The part I'm working on now is to optimise the images so that they are as small (filesize, not dimensions) as possible without losing quality.
Our customers can upload their own images to the website through our custom CMS, but the images aren't being compressed or optimised at all. My superior explained this is because the customers upload their own images, and these images could already have been optimised beforehand in Photoshop or similar tools. If you optimise already-optimised images, the quality gets worse. ...right?
We're trying to find a solution that won't require us to install a module or anything. We already use imagejpeg(), imagepng() and imagegif(), but we don't use the $quality parameter, for the reasons explained above. I've seen some snippets, but they all use imagejpeg() and the like.
That being said, is there a sure-fire way of optimising images without the risk of optimising previously optimised images? Or would it be no problem at all to use imagejpeg(), imagepng() and imagegif(), even if it would mean optimising already optimised images?
Thank you!
"If you optimise already optimised images, the quality would get worse. "
No if you use a method without loose.
I don't know for method directly in php but if you are on linux server you can use jpegtran or jpegoptim ( with --strip-all) for jpeg and OptiPNG or PNGOUT for png.
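A sketch of wiring those up from PHP (assumes the binaries are installed and on the PATH; -o2 is a moderate OptiPNG effort level):
// Lossless optimisation by shelling out; the image pixels are untouched
function optimize_lossless($path) {
    $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
    if ($ext === 'jpg' || $ext === 'jpeg') {
        exec('jpegoptim --strip-all ' . escapeshellarg($path));
    } elseif ($ext === 'png') {
        exec('optipng -o2 ' . escapeshellarg($path));
    }
}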
Going by your title, I am going to assume you mean compression.
So, let's say a normal JPG of 800x600 is uploaded by your customers.
The customer's JPG is 272kb because it has full details and everything.
You need to set thresholds for what filesize is acceptable at given dimensions.
Like:
list($width, $height) = getimagesize($image);
if ($width == 800 && $height == 600 && pathinfo($image, PATHINFO_EXTENSION) == 'jpg' && filesize($image) > 68 * 1024) {
    schedule_for_compression($image); // your own queueing hook, as in the original pseudocode
}
That way you set up parameters for what is acceptable as an upper limit of file size: if the dimensions match and the filesize is bigger, then it's not optimised.
But without knowing in more detail what exactly you mean by optimising, this is the only thing I can think of.
If you are dealing with a low number of images to compress, you might find an external service such as https://tinypng.com/developers to be of assistance.
I've used their online tools for reducing the filesize of both JPG and PNG files manually, but they do appear to offer a free API service for the first 500 images per month.
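For what it's worth, a rough sketch of calling their API from PHP (check their docs for the current endpoint and auth details; the key and path here are placeholders):
$ch = curl_init('https://api.tinify.com/shrink');
curl_setopt_array($ch, [
    CURLOPT_USERPWD => 'api:YOUR_API_KEY',
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => file_get_contents('uploads/photo.png'),
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HEADER => true, // the compressed image's URL comes back in a Location header
]);
$response = curl_exec($ch);
curl_close($ch);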
Apologies if this would be better as a comment than an answer; I'm fairly new to Stack Overflow and haven't got enough points yet, but I felt this may be a handy alternative solution.
Saving a JPEG with the same or higher quality setting will not result in a noticeable loss of quality. Just re-save with your desired quality setting; if the file ends up larger, discard it and keep the original. Remove metadata using jpegtran or jpegoptim before you optimize, so it doesn't affect the file size when you compare against the original.
PNG and GIF won't lose any quality unless you reduce the number of colors. Just use one of the optimizers Gyncoca mentioned.
I've been doing some speed optimization on my site using Page Speed and it gives recommendations like so:
Optimizing the following images could reduce their size by 35.3KiB (21% reduction).
Losslessly compressing http://example.com/.../some_image.jpg could save 25.3KiB (55% reduction).
How are the calculations done to get the file size reduction numbers? How can I determine if an image is optimized in PHP?
As I understand it, they seem to base this on the image quality setting (so saving at 60% quality in Photoshop or so is considered optimized?).
What I would like to do is once an image is uploaded, check if the image is fully optimized, if not, then optimize it using a PHP image library such as GD or ImageMagick. If I'm right about the number being based on quality, then I will just reduce the quality as needed.
How can I determine if an image is fully optimized in the standards that Page Speed uses?
Chances are they are simply using a standard compression level, or working from some very simple rules to calculate image compression/quality. It isn't exactly what you were after, but what I often use on uploaded images and other dynamic content is a class called SimpleImage:
http://www.white-hat-web-design.co.uk/blog/resizing-images-with-php/
This will give you options to resize and adjust compression, and I think even change the image type (by which I mean .jpg to .png or .gif, anything you like). I worked in SEO, and page optimization was a huge part of my job; I generally tried to make images exactly the size they needed to be, no smaller or bigger. Compress JS & CSS as well, and that's really all most people need to worry about.
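From memory, usage looks roughly like this (the method names may differ slightly; check the linked post for the actual interface):
require_once 'SimpleImage.php'; // the class from the link above
$image = new SimpleImage();
$image->load('uploads/picture.jpg');
$image->resizeToWidth(600); // keeps the aspect ratio
$image->save('uploads/picture_small.jpg', IMAGETYPE_JPEG, 75); // 75 = JPEG quality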
It looks like you could use the PageSpeed Insights API to get the compression scores: https://developers.google.com/speed/docs/insights/v1/reference, though I'm guessing you'd want to run a quality check/compression locally, rather than sending everything through this API.
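A rough sketch of querying it from PHP (the endpoint is from the v1 docs linked above; the key is a placeholder, and the response structure should be checked against the reference):
$page = 'http://example.com/page.html';
$endpoint = 'https://www.googleapis.com/pagespeedonline/v1/runPagespeed'
    . '?url=' . urlencode($page) . '&key=YOUR_API_KEY';
$result = json_decode(file_get_contents($endpoint), true);
// Inspect $result['formattedResults'] for the image-compression rule results
var_dump($result['formattedResults']);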
It also looks like they've got a standalone image optimizer available at http://code.google.com/p/page-speed/wiki/DownloadPageSpeed?tm=2, though this appears to be a Windows binary. They've got an SDK there as well, though I haven't looked into what it entails, or how easy it would be to integrate into an existing site.
A certain site I know recently upgraded their bandwidth from 2.5 TB monthly to 3.5 TB.
The reason is they went over the 2.5 TB limit recently. They're complaining they don't know how to bring the bandwidth usage down.
One thing I haven't seen them consider is the fact that JPEGs and other images displayed on the site (and it is an image-heavy site) can contain metadata: where the picture was taken and such.
Fact of the matter is, this information is of no importance whatsoever on that site. It's not gonna be used, ever. Yet it still adds to the bandwidth, since it increases the filesize of every image by anywhere from a few bytes to a few kilobytes.
On a site that uses up more than 2.5 TB per month, stripping several thousand images of their metadata would, I think, decrease the bandwidth usage by at least a few gigabytes per month, if not more.
So is there a way to do this in PHP? And also, for the already existing files, does anybody know a good automatic metadata remover? I know of JPEG & PNG Stripper, but it's not very good... Might be useful for initial cleaning though...
It's trivial with GD:
// Decoding and re-encoding with GD drops all metadata (EXIF, IPTC, comments)
$img = imagecreatefromjpeg("myimg.jpg");
imagejpeg($img, "newimg.jpg", $quality); // $quality: 0-100 JPEG quality
imagedestroy($img); // free the image resource
This won't transfer EXIF data. I don't know how much bandwidth it will actually save, but you could use the code above to increase the compression of the images. That would save a lot of bandwidth, although it possibly won't be very popular.
I seriously doubt image metadata is the root of all evil here.
Some questions to take into consideration:
How is the webserver configured?
Does it issue HTTP 304 responses properly?
Isn't there some kind of hand-made caching/streaming of data through PHP scripting that prevents said data from being cached by the browser? (In which case, URL rewriting and HTTP redirections should be considered; see the sketch below.)
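On that last point: if images do go through a PHP script, here's a bare-bones sketch of answering conditional requests with a 304 (hypothetical path; letting the webserver serve static files directly handles all of this for you):
$path = 'images/photo.jpg';
$etag = '"' . md5($path . filemtime($path)) . '"';
header('ETag: ' . $etag);
header('Cache-Control: public, max-age=86400');
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    http_response_code(304); // the browser already has it; send no body
    exit;
}
header('Content-Type: image/jpeg');
readfile($path);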
Check out Smush.it! It will strip all unnecessary info from an image. They have an API you can use to crunch the images.
Note: by design, it may change the filetype on you. This is on purpose: if another filetype can display the same image at the same quality with fewer bytes, it will give you a new file.
I think you need to profile this. You might be right about it saving a few GB, but that's relatively little out of 2.5 TB of bandwidth. You need real data about what is being served most, and to work on that. If you do find it is images that push your bandwidth usage so high, you should first check your caching headers and 304 responses; you might also want to investigate using something like Amazon S3 to serve your images. I have managed to reduce bandwidth costs a lot by doing this.
That said, if the EXIF data really is making that much of a difference, then you can use the GD library to copy a JPEG image using the imagejpeg function. This won't copy the EXIF data.
Emil H's answer probably addresses the question best.
But I wanted to add that this will almost certainly not save you as much as you may think. This type of metadata takes up very little space; I would think that
Re-compressing the images to a smaller file size, and
Cropping or resizing to reduce the resolution of the images
are both going to have a much greater effect. With point one alone you could probably drop bandwidth by 50%, and with both you could drop it by 80% - that is, if you are willing to sacrifice some image quality and resolution.
If not, you could always have the default view at a smaller size, with an 'enlarge' link. Most people just browsing will see the smaller image, and only those who want the largest size will click to enlarge it, so you'll still get almost all the bandwidth saving. This is what Flickr does, for example.
Maybe some sort of hex data manipulation would help here. I'm facing the same problem and investigating some sort of automated solution.
Just wondering if it can be done; if it is possible, I'll write a PHP class for this.
It might be smart to do all the image manipulation on the client side (using a Java applet, as Facebook does). Then, once the image is compressed, resized and fully stripped of unnecessary pixels and content, it can be uploaded at its optimal size, saving you bandwidth and server-side processing (at the cost of initial development).
I have a requirement to dynamically generate and compress large batches of PDF files.
I am considering the usual algorithms:
Zip
Ace
Rar
Any other suggestion are welcome.
My question is: which algorithm is likely to give me the smallest file size? Speed and efficiency are also important factors, but size is my primary concern.
Also, does it make a difference whether I have many small files or fewer larger files in each archive?
Most of my processing will be done in PHP, but I'm happy to interface with third party executables if needed.
Edit:
The documents are primarily invoices and shouldn't contain any images except for the company logo.
I have not had much success compressing PDFs. As pointed out, they are already compressed when composed (although some PDF composition tools allow you to specify a 'compression level'). If at all possible, the first approach you should take is to reduce the size of the composed PDFs.
If you keep the PDFs in a single file, they can share any common resources (images, fonts) and so can be significantly smaller. Note that this means one large PDF file, not one large ZIP with multiple PDFs inside.
In my experience it is quite difficult to compress the images within PDFs, and images have by far the biggest impact on file size. Ensure that you have optimised images before you start. It is even worth doing a test run without your images, simply to see how much size they are contributing.
The other component is fonts: if you are using multiple embedded fonts, then you are packing more data into the file. Just use one font to keep the size down, or use fonts that are commonly installed so that you don't need to embed them.
I think 7z is currently the best, with RAR second, but I would recommend trying both to find out what works best for you.
LZMA is the best if you need the smallest file size.
And of course, the PDF itself can be compressed.
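A sketch of driving 7-Zip from PHP for a batch of invoices (assumes the 7z binary is installed; -mx=9 requests maximum compression, and the paths are hypothetical):
// Archive all generated PDFs into one 7z file (LZMA by default)
exec('7z a -t7z -mx=9 ' . escapeshellarg('invoices.7z') . ' invoices/*.pdf', $out, $status);
if ($status !== 0) {
    // 7z returns non-zero on failure; log $out and handle it
}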
I doubt you'll get much (if any) reduction in filesize by compressing PDFs. However, if all you're doing is collecting multiple files into one, why not just tar them?
We've done this in the past for large (and many) PDFs that store lots of text - Training Packages for Training Organisations in Australia. It's about 96% text (course info etc.) and a few small diagrams. Sizes vary from 1-2 MB to 8 or 9 MB, and they usually come in volumes of 4 or more.
We found compressing with Zip OK: since the PDF format is already heavily compressed, it was more about the ease of letting our users download it all as a batch instead of worrying about the filesizes. To give you an idea, a 2.31 MB file - lots of text, several full-page diagrams - compressed to 1.92 MB in ZIP and 1.90 MB in RAR.
I'd recommend LZMA to get the best results - but look at resource usage for compressing and uncompressing too.
How big are these files? Get a copy of WinRAR, WinAce and 7-Zip and give it a go.
Combine my nifty tool Precomp with 7-Zip. It decompresses the zLib streams inside the PDF so 7-Zip (or any other compressor) can handle them better. You will get filesizes of about 50% of the original size, losslessly. The tool works especially well for PDF files, but is also nice for other compressed (zLib/LZW) streams such as ZIP/GZip/JAR/GIF/PNG...
For example results, have a look here or here. Speed can be slow for the precompression (PDF->PCF) part, but will be very fast for the recompression/reconstruction (PCF->PDF) part.
For even better results than Precomp + 7-Zip, you can try the lprepaq and prepaq variants, but beware: prepaq especially is slooww :) - the bright side is that prepaq offers the best (PDF) compression currently available.
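A hedged sketch of the Precomp + 7-Zip pipeline from PHP (the flag names are from memory of Precomp's CLI; check its help output before relying on them):
// PDF -> PCF (expand the internal zLib streams), then 7-Zip the PCF for the real savings
exec('precomp ' . escapeshellarg('invoice.pdf'));   // writes invoice.pcf
exec('7z a -mx=9 invoice.7z invoice.pcf');
// To get the PDF back: extract the archive, then run: precomp -r invoice.pcf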