Optimizing Images via PHP - php

I would like to build a PHP script to optimize images similarly to how PunyPNG or Kraken.io optimizes images. Essentially, I would need to be able to take .jpeg, .png, and .gif images and reduce their file size as much as possible without losing quality (or with minimal quality loss).
These services offer APIs, but I would like to avoid unnecessary costs, and I do not want to be limited by a specific number of daily uses.
Can this be done with something like ImageMagick? Is it even possible, or is it far too complicated?

Talking about resizing images: that was never really an issue; there are a couple of tools that help you do it in bulk. Since you specifically mention PHP, I assume you want to use it for displaying the images on a page. For that very purpose I wrote this little script not long ago, which might be of some help to you. Fork it here: https://github.com/whizzzkid/phpimageresize

Spatie has a decent package that gets updated regularly; I've been using it for a while without problems:
https://github.com/spatie/image-optimizer
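As a rough sketch of how that package is typically used (assuming it has been installed via Composer and that the underlying binaries such as jpegoptim, pngquant, optipng, and gifsicle are available; the file paths are placeholders):

require 'vendor/autoload.php';

use Spatie\ImageOptimizer\OptimizerChainFactory;

$optimizerChain = OptimizerChainFactory::create();

// Optimize the file in place ("photo.jpg" is a placeholder path)...
$optimizerChain->optimize(__DIR__ . '/uploads/photo.jpg');

// ...or write the optimized copy somewhere else.
$optimizerChain->optimize(__DIR__ . '/uploads/photo.jpg', __DIR__ . '/optimized/photo.jpg');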

Related

Are there image processing libraries for PHP that'd allow manipulating images on a pixel by pixel basis like MATLAB?

I'm working on a project that might require me to compute the DCT of the image. Hence the question.
-> Is there a fast way of pixel-by-pixel processing of images in PHP?
I think it would be best to process the files using another language instead of PHP, or with a library/application such as ImageMagick: php.net/manual/en/book.imagick.php.
I've never used it myself, so I don't really know about its performance.
Also -> wideimage.sourceforge.net/
The GD functions, as described in the previous answer, are quite slow and can sometimes take a huge chunk of the CPU for processing.
Take a look here for separate benchmarks etc.:
http://kore-nordmann.de/blog/comparison_of_php_image_libraries_update.html
GD has per-pixel operations, but they're not particularly fast: imagecolorat() and imagesetpixel()
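To give a feel for what per-pixel work with GD looks like (and why it is slow on large images), here is a minimal grayscale-conversion sketch; the file names are placeholders:

$img = imagecreatefromjpeg('input.jpg');
$width  = imagesx($img);
$height = imagesy($img);

for ($y = 0; $y < $height; $y++) {
    for ($x = 0; $x < $width; $x++) {
        // For a truecolor image this is a packed RGB integer.
        $rgb = imagecolorat($img, $x, $y);
        $r = ($rgb >> 16) & 0xFF;
        $g = ($rgb >> 8) & 0xFF;
        $b = $rgb & 0xFF;
        $gray = (int) round(0.299 * $r + 0.587 * $g + 0.114 * $b);
        imagesetpixel($img, $x, $y, imagecolorallocate($img, $gray, $gray, $gray));
    }
}

imagejpeg($img, 'grayscale.jpg');
imagedestroy($img);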

Managing user file uploads - Avoiding ghosts... - CMS/PHP/MySQL

I've been thinking about various ways of handling file uploads in a sort of CMS. I'm writing here because I am not satisfied with what I've got right now...
The problem
Hmm, let's call it the Tumblr way ;-) The user should be able to upload a file or several files directly, without a file management view or anything like that. The downside is that if they delete the file in the WYSIWYG editor, the file stays on the server. In my case there is not only a WYSIWYG editor but also a media module...
The question
Is there a best practice for handling this? I've never programmed something like that. Would you store the filenames in a MySQL table? Would you use a cron job to check whether the files are really used in the document?
Any advice would be really appreciated!
Many thanks and regards!
Personally I use a cron job that runs once a day and cleans up any orphaned uploaded files (orphans older than x days).
I admit, I'm curious about other people's approaches.
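For the sake of illustration, such a daily cleanup script could look roughly like this; the directory, the DSN, and the "uploads"/"filename" table and column names are all hypothetical:

// Run from cron, e.g.: 0 3 * * * php /var/www/cleanup_uploads.php
$pdo = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');
$uploadDir = '/var/www/uploads';
$maxAge = 7 * 24 * 3600; // only touch files older than 7 days

foreach (glob($uploadDir . '/*') as $file) {
    if (!is_file($file) || time() - filemtime($file) < $maxAge) {
        continue; // not a regular file, or still too new
    }
    // "uploads" / "filename" are placeholder names for your own schema.
    $stmt = $pdo->prepare('SELECT COUNT(*) FROM uploads WHERE filename = ?');
    $stmt->execute([basename($file)]);
    if ((int) $stmt->fetchColumn() === 0) {
        unlink($file); // nothing references it anymore: treat it as an orphan
    }
}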
Why so much hassle for some additional space used? Hard drive space shouldn't be a concern, since it's so cheap. And even if it were not, the images are very lightweight resources.
The only problems I can imagine is that your CMS's users are uploading very large files. In that case, you should process the image before saving them, lowering the quality and the size.
I think a cron job would be more CPU intensive than leaving some 'ghost' files around.
However, you could try to catch the moment an image is deleted, but then again, that could be more trouble than it's worth.

I want to create multiple thumbnails using the GD library in PHP: is it better to create them on the fly or to create physical files?

I want to create multiple thumbnails using the GD library in PHP, and I already have a script to do this. The question is what is better for me: is it better to create the thumbnail on the fly, or to create a physical file on my server each time I want a thumb? And why?
Please consider time, storage capacity, and the other disadvantages of both approaches.
When you create the thumbnail depends on a couple of factors (that I'll get into) but you should never discard the output of something like this (unless you'll never use it again) as it's a really expensive operation.
Anyway your two main choices for "when to generate the thumbnail" are:
When it's first requested. This is common, and it means that you don't generate thumbnails that are never used, but it does mean that if you have a page full of first-time thumbnails, the server might become overwhelmed with PHP processes generating them.
I had a similar issue with Sorl+Django where I was generating 100+ thumbnails per request for the first few requests after uploading and it basically made the entire server hang for 20 minutes. Not good.
Generate all required thumbnails at upload time. Because the upload already takes a while, the extra processing is spread out and far less noticeable. You can also pull it out-of-process (i.e. use another script to process uploads - perhaps not even in PHP).
The obvious downside is you're using up disk space that you otherwise might not need to use up... But unless you're talking about hundreds of thousands of thumbnails, a small percentage of unused ones probably won't break the bank.
Of course, if disk space is an issue, there might be an argument for pushing the thumbnail up to a CDN at the same time as you process it.
One note: when you save the thumbnails, it's fairly common that you'll want to resize them at some point down the line, or perhaps want two or more size variants. I find it really useful to make the filenames very specific, so if the original image is image.jpg, the 200x200 version is image-200x200.jpg.
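To make that naming convention concrete, here is a rough sketch of generating a couple of size variants at upload time; the sizes, the JPEG quality of 85, and $originalPath are made-up values:

function thumbnailName($original, $width, $height) {
    $info = pathinfo($original);
    // image.jpg -> image-200x200.jpg
    return $info['dirname'] . '/' . $info['filename'] . "-{$width}x{$height}." . $info['extension'];
}

$sizes  = array(array(200, 200), array(400, 400)); // whatever variants the site needs
$source = imagecreatefromjpeg($originalPath);      // $originalPath comes from the upload handler

foreach ($sizes as $size) {
    list($w, $h) = $size;
    $thumb = imagecreatetruecolor($w, $h);
    // Resample to the exact target size (aspect-ratio handling omitted for brevity).
    imagecopyresampled($thumb, $source, 0, 0, 0, 0, $w, $h, imagesx($source), imagesy($source));
    imagejpeg($thumb, thumbnailName($originalPath, $w, $h), 85);
    imagedestroy($thumb);
}
imagedestroy($source);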
Neither/both - don't generate the thumbnails till you need them - but keep the files you generate.
That way you'll minimise the amount of work needed and have a self-repairing system
C.
GD is really resource heavy, so you should look into whether you can use ImageMagick instead (which also has a clearer syntax).
You will definitely be better off caching the created thumbnails after the first run (regardless of whether you use GD or ImageMagick) and serving them from the cache. If you are worried about storage, clear old files out of the cache now and then.
Always cache (= write out to disk) the results of GD operations. They are too expensive both regarding processor time and memory to be done on the fly every time. This becomes increasingly true the more visitors/hits you have.
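Putting those suggestions together, a cache-on-first-request handler might look roughly like this (using the Imagick extension; the paths and the 200x200 size are placeholders):

$source = '/var/www/images/photo.jpg';        // placeholder original
$cached = '/var/www/cache/photo-200x200.jpg'; // placeholder cache path

if (!file_exists($cached)) {
    // First request only: generate the thumbnail and write it to the cache.
    $im = new Imagick($source);
    $im->thumbnailImage(200, 200, true); // true = fit within 200x200, keep aspect ratio
    $im->writeImage($cached);
    $im->destroy();
}

// Every later request is just a file read.
header('Content-Type: image/jpeg');
readfile($cached);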

How can I reduce time to generate PDF dynamically using PHP?

I am generating a PDF dynamically using the html2ps PHP library, and I want to reduce the time it takes to generate it. Is there any way to reduce the time or otherwise optimize it?
Help me, please.
Are you running this code on recent hardware?
While it may sound like avoiding the solution, on the Coding Horror blog (written by the people who made this site) they preach that you shouldn't spend time tweaking performance if your hardware is limited.
If you're doing this on a single core CPU (ex: Pentium 4), you are wasting your time worrying about what library to use or what code to change. Even the slowest Core 2 Duos and newer AMDs start at 2x faster than the best Pentium 4.
PS: I wasn't able to find the article on their site to link for you.
PPS: Most Pentium 4 motherboards support the 65nm Core 2 Duos.
One method of optimizing is to try another library. I use dompdf when I need to convert HTML to PDF, and I haven't found any need for optimizing: it's very fast, supports CSS properly, and produces accurate results.
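For reference, basic dompdf usage looks roughly like this (assuming it is installed via Composer; the HTML and the output file name are placeholders):

require 'vendor/autoload.php';

use Dompdf\Dompdf;

$dompdf = new Dompdf();
$dompdf->loadHtml('<h1>Invoice #123</h1><p>Placeholder content.</p>');
$dompdf->setPaper('A4', 'portrait');
$dompdf->render(); // this is the expensive step
file_put_contents('invoice.pdf', $dompdf->output()); // or stream it to the browser instead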
Reduce the complexity of the output.
Reduce the output quantity.
If the PDF generation is impacting other operations, delegate it to another process or server.
I'm sure you can avoid a lot of processing by skipping the HTML/CSS step(s) and going directly to PDF. Check out FPDF or PDFlib.
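A minimal direct-to-PDF sketch with FPDF, with no HTML parsing involved (the text and file name are placeholders, and the Output() signature shown is the FPDF 1.8+ one):

require 'fpdf/fpdf.php'; // adjust the path to wherever FPDF lives

$pdf = new FPDF();
$pdf->AddPage();
$pdf->SetFont('Arial', 'B', 16);
$pdf->Cell(0, 10, 'Monthly report'); // full-width cell, 10 units high
$pdf->Output('F', 'report.pdf');     // 'F' = write to a local file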

Having an image stripped of metadata upon upload in PHP

A certain site I know recently upgraded their bandwidth from 2.5 TB monthly to 3.5 TB.
The reason is that they recently went over the 2.5 TB limit. They're complaining that they don't know how to bring the bandwidth usage down.
One thing I haven't seen them consider is the fact that JPEG and other images displayed on the site (and it is an image-heavy site) can contain metadata: where the picture was taken and such.
The fact of the matter is, this information is of no importance whatsoever on that site. It's never going to be used. Yet it still adds to the bandwidth, since it increases the file size of every image by anywhere from a few bytes to a few kilobytes.
On a site that uses more than 2.5 TB per month, stripping the several thousand images of their metadata would, I think, cut bandwidth usage by at least a few gigabytes per month, if not more.
So is there a way to do this in PHP? And also, for the already existing files, does anybody know a good automatic metadata remover? I know of JPEG & PNG Stripper, but it's not very good... It might be useful for an initial cleaning, though...
It's trivial with GD:
$quality = 85; // 0-100; lower values compress harder
$img = imagecreatefromjpeg("myimg.jpg");
imagejpeg($img, "newimg.jpg", $quality); // re-encodes the pixels only, so metadata is dropped
imagedestroy($img);
This won't transfer EXIF data. I don't know how much bandwidth it will actually save, though, but you could use the code above to increase the compression of the images. That would save a lot of bandwidth, although it possibly won't be very popular.
I seriously doubt image metadata is the root of all evil here.
Some questions to take into consideration:
How is the webserver configured?
Does it issue http 304 responses properly?
Isn't there some kind of hand-made caching/streaming of data through php scripting that prevents said data from being cached by the browser? (in which case, url rewriting and http redirections should be considered).
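To illustrate the 304 point: if images are streamed through a PHP script, that script has to answer conditional requests itself, otherwise every hit re-sends the full file. A rough sketch (the image path is a placeholder):

$file = '/var/www/images/photo.jpg'; // placeholder path
$lastModified = filemtime($file);

header('Cache-Control: public, max-age=86400');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
    http_response_code(304); // the browser's copy is still good, send no body
    exit;
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($file));
readfile($file);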
Check out Smush.it! It will strip all unnecessary info from an image. They have an API you can use to crunch the images.
Note: by design, it may change the file type on you. This is on purpose: if another file type can display the same image at the same quality with fewer bytes, it will give you a new file.
I think you need to profile this. You might be right about it saving a few GB, but that's relatively little out of 2.5 TB of bandwidth. You need real data about what is being served most and work on that. If you do find that it is images driving your bandwidth usage so high, you should first check your caching headers and 304 responses; you might also want to investigate using something like Amazon S3 to serve your images. I have managed to reduce bandwidth costs a lot by doing this.
That said, if the EXIF data is really making that much of a difference then you can use the GD library to copy a jpeg image using the imagejpeg function. This won't copy EXIF data.
Emil H's answer probably addresses the question best.
But I wanted to add that this will almost certainly not save you as much as you may think. This type of metadata takes up very little space; I would think that
Re-compressing the images to a smaller file size, and
Cropping or resizing to reduce the resolution of the images
are both going to have a much greater effect. With point one alone you could probably drop bandwidth 50% and with both, you could drop bandwidth 80% - that is if you are willing to sacrifice some image size.
If not, you could always have the default view at a smaller size, with an 'enlarge' link. Most people just browsing will see the smaller image, and only those who want the largest size will click to enlarge it, so you'll still get almost all the bandwidth saving. This is what Flickr does, for example.
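To illustrate both points, a rough batch pass over an existing image directory could look like this; the directory, the 1024 px target width, and the quality of 75 are made-up values:

$dir      = '/var/www/images'; // placeholder directory
$maxWidth = 1024;              // hypothetical target width
$quality  = 75;                // lower quality = smaller files

foreach (glob($dir . '/*.jpg') as $file) {
    $src = imagecreatefromjpeg($file);
    $w = imagesx($src);
    $h = imagesy($src);

    if ($w > $maxWidth) {
        // Scale down proportionally before re-encoding.
        $newH = (int) round($h * $maxWidth / $w);
        $dst  = imagecreatetruecolor($maxWidth, $newH);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $maxWidth, $newH, $w, $h);
        imagedestroy($src);
        $src = $dst;
    }

    imagejpeg($src, $file, $quality); // overwrite with the recompressed copy (metadata is dropped too)
    imagedestroy($src);
}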
Maybe some sort of hex data manipulation would help here. I'm facing the same problem and investigating some sort of automated solution.
I'm just wondering if it can be done; if it is possible, I'll write a PHP class for this.
It might be smart to do all the image manipulation on the client side (using a Java applet, as Facebook does) and then, once the image is compressed, resized, and fully stripped of unnecessary pixels and content, it can be uploaded at its optimal size, saving you bandwidth and server-side processing (at the cost of initial development).
