I have the following task: I have a big image, 3000x2000, that uses all 4 channels (RGBA), and I need to process each pixel in some way.
My first approach was the standard way:
$this->image->getImagePixelColor($x, $y)->getColor()
but this turned out to be very slow, around 45 seconds. The reason is that this approach creates an object for every pixel, which takes time.
My second approach was to export the pixels as an array, so I tried the following:
$lines = $this->image->exportImagePixels(0, 0, $width, $height, "RGBA", Imagick::PIXEL_CHAR);
But the page crashed without any error. After analyzing it, I saw that it uses a lot of memory: before calling exportImagePixels the script uses around 2 MB of RAM, and after the export it uses 258 MB (with $width = 3000, $height = 500). I would expect around 6 MB for these parameters. For example, exporting a 300x300 region uses 18 MB, which is a lot.
To measure memory I use the following function:
function printMemoryUsage() {
    $mem_usage = memory_get_usage(true);

    if ($mem_usage < 1024) {
        echo $mem_usage . " bytes";
    } elseif ($mem_usage < 1048576) {
        echo round($mem_usage / 1024, 2) . " kilobytes";
    } else {
        echo round($mem_usage / 1048576, 2) . " megabytes";
    }
}
Do you have any idea why it uses that much memory, and how I could get a plain pixel array from an Imagick image?
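For what it's worth, the numbers look consistent with PHP's own array overhead rather than a leak: exportImagePixels returns a plain PHP array with one element per channel value, each element costs on the order of 32 bytes of internal overhead, and the array's capacity is rounded up to a power of two, so 3000 * 500 * 4 = 6,000,000 values can easily occupy around 256 MB. A minimal sketch of a workaround, assuming per-pixel processing of one slice at a time is enough, is to export the image in small row chunks so only part of that array exists at once (the chunk size below is an arbitrary assumption):

// Sketch only: process the RGBA image in chunks of rows so the exported
// pixel array stays small. $rowsPerChunk is an arbitrary choice.
$width  = $this->image->getImageWidth();
$height = $this->image->getImageHeight();
$rowsPerChunk = 50;

for ($y = 0; $y < $height; $y += $rowsPerChunk) {
    $rows = min($rowsPerChunk, $height - $y);
    // Flat array: 4 values (R, G, B, A) per pixel, row by row
    $pixels = $this->image->exportImagePixels(0, $y, $width, $rows, "RGBA", Imagick::PIXEL_CHAR);

    for ($i = 0, $n = count($pixels); $i < $n; $i += 4) {
        // ... work with $pixels[$i], $pixels[$i + 1], $pixels[$i + 2], $pixels[$i + 3]
    }

    unset($pixels); // free this chunk before exporting the next one
}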
In PHP, GdImage images are created for processing by functions like imagecreatefromjpeg and imagecreatefrompng.
Given an image of type jpeg or png with known dimensions:
- width in px,
- height in px,
- depth in bits (assume max is 32 bits, but could be more?)
and knowing:
- memory_limit = xxx; Maximum amount of memory a script may consume (in php.ini)
- memory_get_usage(); - amount of memory allocated to PHP
how can we determine if we have enough memory to process the image?
Obviously, the bytes occupied will be roughly width * height * depth / 8.
But what is the available amount of memory?
Is it correct to write:
available_memory_in_bytes = memory_limit_in_bytes - memory_get_usage();
Can we use all of it to the last byte for imagecreatefrompng?
PHP is a reasonably recent version, 8.1.
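As a worked example of that formula: a 3000 x 2000 image at 32 bits per pixel needs roughly 3000 * 2000 * 32 / 8 = 24,000,000 bytes, i.e. about 23 MB, before any per-pixel overhead the library itself adds.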
Opening a JPEG image using imagecreatefromjpeg can easily lead to fatal errors, because the memory needed exceeds the memory_limit.
A .jpg file that is less than 100 KB in size can easily exceed 2000x2000 pixels, which will take about 20-25 MB of memory when opened. "The same" 2000x2000 px image may take up 5 MB on disk at a different compression level.
So I obviously cannot use the filesize to determine if it can be opened safely.
How can I determine if a file will fit in memory before opening it, so I can avoid fatal errors?
According to several sources the memory needed is up to 5 bytes per pixel depending on a few different factors such as bit-depth. My own tests confirm this to be roughly true.
On top of that there is some overhead that needs to be accounted for.
But by examining the image dimensions - which can easily be done without loading the image - we can roughly estimate the memory needed and compare it with (an estimate of) the memory available like this:
$filename = 'black.jpg';

// Get image dimensions
$info = getimagesize($filename);

// Each pixel needs ~5 bytes, and there will obviously be some overhead - in a
// real implementation I'd probably reserve at least 10 B/px just in case.
$mem_needed = $info[0] * $info[1] * 6;

// Find out (roughly!) how much is available
// - this can easily be refined, but that's not really the point here
$mem_total = intval(str_replace(array('G', 'M', 'K'), array('000000000', '000000', '000'), ini_get('memory_limit')));

// Find current usage - AFAIK this is _not_ directly related to
// the memory_limit... but it's the best we have!
$mem_available = $mem_total - memory_get_usage();

if ($mem_needed > $mem_available) {
    die('That image is too large!');
}

// Do your thing
$img = imagecreatefromjpeg($filename);
This is only tested superficially, so I'd suggest further testing with a lot of different images and using these functions to check that the calculations are fairly correct in your specific environment:
//Set some low limit to make sure you will run out
ini_set('memory_limit', '10M');
//Use this to check the peak memory at different points during execution
$mem_1 = memory_get_peak_usage(true);
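If you do want to refine the memory_limit handling, note that the shorthand suffixes are binary multiples and that -1 means no limit; a small helper along these lines could replace the str_replace trick (the function name is my own):

// Sketch: convert memory_limit shorthand ("128M", "1G", "-1", ...) into bytes.
function parseMemoryLimit(string $limit): int
{
    if ($limit === '-1') {
        return PHP_INT_MAX; // no limit configured
    }
    $value = (int) $limit;
    switch (strtoupper(substr($limit, -1))) {
        case 'G': $value *= 1024; // deliberately falls through
        case 'M': $value *= 1024; // deliberately falls through
        case 'K': $value *= 1024;
    }
    return $value;
}

$mem_available = parseMemoryLimit(ini_get('memory_limit')) - memory_get_usage();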
I allow users to upload images. However, I want to keep the JPEG quality at no more than 90%. What I plan to do is detect the current quality:
- If less than 90% do nothing
- If more than 90%, then use ImageMagick to recompress the image to 90%
Is it possible to do that? I prefer PHP but any language will help.
paddy is correct that this setting is not always stored in the JPEG file. If it is, then you can use identify from ImageMagick to read the quality. For example:
$ identify -format '%Q' tornado_ok.jpg
93%
Update: Based on the answer to this question, https://superuser.com/questions/62730/how-to-find-the-jpg-quality, I found out that the identify command can apparently still determine the quality by reverse-engineering the quantization tables, even if all the image's EXIF / other metadata is lost. By the way, the title of your question as it stands now is a possible duplicate of the question I linked to.
But to me your question has merit on its own, because in the question's text you explain what you are trying to do, which is more than simply detecting JPEG quality. Nevertheless, you should perhaps update the title if you want to reflect that you are trying to solve a more specific problem than just reading JPEG image quality.
Unless you are archiving original images, for web use even 90% is excessive. 75% used to be the default in the old days (degradation was visible only under close inspection between side-by-side images), and now in the days of high bandwidth 85% is a very high quality option. The 5% quality difference between 90% and 85% is virtually invisible, but will save you over 30% in file size typically. The JPEG algorithm is designed to begin by eliminating information that is invisible to human perception at its first compression stages (above 80% or so).
Update/note: The compression quality settings I am talking about are from tests with libjpeg, a very widely used JPEG library. Photoshop's compression percentages and other software's quality settings are all independent and do not necessarily mean the same thing as the settings of libjpeg.
paddy's idea of using image height and image width to calculate an acceptable file size is reasonable:
You can get the image height/width like this:
list($originalWidth, $originalHeight) = getimagesize($imageFile);
My own high-quality photos posted online, like this one: http://ksathletics.com/2013/wsumbb/nw.jpg
are typically saved at a ratio of about 200 KB per megapixel.
So, for example, you can multiply width times height and divide by 1000000 to calculate the megapixels in the image. Divide the file size by 1024 to calculate the KB. Then divide the resulting KB by the megapixels. If the result is under 200 or whatever value you decide upon, then you don't need to re-compress it. Otherwise, you can re-compress it with a quality of 85% or whatever quality you decide on.
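A rough PHP sketch of that check (the 200 KB per megapixel threshold and the 85% fallback are just the values suggested above; the variable names are mine):

// Sketch: re-compress only when the file is heavier than ~200 KB per megapixel.
list($width, $height) = getimagesize($imageFile);
$megapixels = ($width * $height) / 1000000;
$kilobytes  = filesize($imageFile) / 1024;

if ($kilobytes / $megapixels > 200) {
    $img = new Imagick($imageFile);
    $img->setImageCompressionQuality(85);
    $img->writeImage($imageFile);
}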
Since the OP stated he prefers PHP, I offer the following:
$img = new Imagick($filename);
$quality = $img->getImageCompressionQuality();
I was facing the same problem with an app I'm developing. I extract several images from a random site, and each item has several images; I would like to show a single image per item and bring the user the best-quality one.
I came up with this idea; it's pretty simple and will work for any language and any type of compression:
// Parameters you will need to retrieve from the image
list($width, $height) = getimagesize($filename);
$filesize = filesize($filename);

// Quality estimate for your image
$quality = 101 - (($width * $height) * 3) / $filesize;
I ran this algorithm against the test images from http://fotoforensics.com/tutorial-estq.php mentioned above, and here are the results:
Filename           Width  Height  Pixels   BitmapBytes  FileBytes  Quality
estq-baseline.png  400    300     120000   360000       163250     98.79
estq-90.jpg        400    300     120000   360000       34839      90.67
estq-80.jpg        400    300     120000   360000       24460      86.28
estq-70.jpg        400    300     120000   360000       19882      82.89
estq-25.jpg        400    300     120000   360000       10300      66.05
The basic idea behind this algorithm is to compare the size the image would have if written as an uncompressed bitmap (3 bytes per pixel, one byte each for R, G and B) with the size the image currently occupies. The smaller the file, the higher the compression, independently of the compression method used (whether it's a JPG, a PNG or whatever), and that is what produces a bigger or smaller gap between the uncompressed and the compressed image.
It is also important to mention that this is a mathematical solution for comparison purposes: the method will not return the actual quality of the image, it returns the percentage distance between the uncompressed and compressed sizes!
If you need more details, you can send me an email: rafaelkarst#gmail.com
You cannot guarantee that the quality setting is stored in the JPEG's metadata. This is an encoder setting, not an image attribute.
Read more here about estimating JPEG quality
It might make more sense to simply define a maximum file size. At the end of the day, restricting image quality is all about saving bandwidth. So setting a ratio between image dimensions and file size is more appropriate.
If your JPEG was created using a straight scaling of the standard quantization tables and the 0-100 quality setting follows the Independent JPEG Group's formula, then, assuming you have the luminance quantization table in an array called quantization (such as Python's PIL module provides in image.quantization[0]), the original value can be obtained via:
if quantization[58] <= 100:
    originalQuality = int(100 - quantization[58] / 2)
else:
    originalQuality = int(5000.0 / 2.5 / quantization[15])
Basically, the default luminance quantization value #15 is 40 and #58 is 100, so these make handy values to extract the result from. IJG scales values above 50 via 200 - 2 * Q and below 50 via 5000 / Q. If the quality setting is less than 8, this won't give decent results (if quantization[5] == 255) -- in that case, perhaps use quantization table position #5.
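To illustrate with a concrete number: a file saved at IJG quality 75 scales the standard table by 200 - 2 * 75 = 50%, so the entry whose base value is 100 (position #58) is written as 50, and the formula above recovers 100 - 50 / 2 = 75.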
See https://www.rfc-editor.org/rfc/rfc2435#section-4.2.
For those who are using GraphicsMagick instead of ImageMagick, you can get the JPEG quality with the following command:
gm identify -format '%[JPEG-Quality]' path_to/image_file.jpg
and according to the documentation
http://www.graphicsmagick.org/GraphicsMagick.html#details-format
Please note that JPEG has no notion of "quality" and that the quality metric used by, and estimated by the software is based on the quality metric established by IJG JPEG 6b. Other encoders (e.g. that used by Adobe Photoshop) use different encoding metrics.
Here is a PHP function that tries all available methods of getting quality (that I know of):
/* Try to detect the quality of a JPEG.
   If not possible, nothing is returned (null). Otherwise the quality is returned (int).
*/
function detectQualityOfJpg($filename)
{
    // Try the Imagick extension
    if (extension_loaded('imagick') && class_exists('Imagick')) {
        $img = new Imagick($filename);

        // The required function is available as from PECL imagick v2.2.2
        if (method_exists($img, 'getImageCompressionQuality')) {
            return $img->getImageCompressionQuality();
        }
    }

    if (function_exists('shell_exec')) {
        // Escape the filename before handing it to the shell
        $file = escapeshellarg($filename);

        // Try ImageMagick's identify
        $quality = shell_exec("identify -format '%Q' " . $file);
        if ($quality) {
            return intval($quality);
        }

        // Try GraphicsMagick
        $quality = shell_exec("gm identify -format '%Q' " . $file);
        if ($quality) {
            return intval($quality);
        }
    }
}
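As a possible way to tie this back to the original goal, a usage sketch (assuming the Imagick extension is available for the re-compression step):

// Sketch: re-compress to 90% only when a higher quality was detected.
$quality = detectQualityOfJpg($filename);
if ($quality !== null && $quality > 90) {
    $img = new Imagick($filename);
    $img->setImageCompressionQuality(90);
    $img->writeImage($filename);
}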
What is the maximum width and height that ImageCopyResampled can handle? My code works for smaller images in terms of width and height. For larger images, it will disregard the coordinates, meaning the cropping starts from the upper left corner of the image.
Is there a workaround? Here's a portion:
$trgt_width = 500;
$trgt_height = 400;

if (copy($src_file, $trgt_file)):
    $src_image = imageCreateFromJpeg($src_file);
    $trgt_image = imageCreateTrueColor($trgt_width, $trgt_height);
    imageCopyResampled($trgt_image, $src_image, 0, 0, $x, $y, $trgt_width, $trgt_height, $width, $height);
    imageJpeg($trgt_image, $thumb_file, 75);
endif;
Thanks.
It depends on the maximum amount of RAM your scripts may occupy. This is usually set on your server by the administrator; the setting is called memory_limit.
You can find it out using phpinfo() and searching for "memory_limit".
A rough calculation of the memory needed to resize an image:
width in pixels x height in pixels x 3 bytes
3 bytes because of the three channels of a true-color image: red, green and blue.
So, an image 1000 x 1000 pixels in size will take up at least 3 MB of memory, and probably more during the resize process, because the function has to keep both the original and the resized version in memory at the same time.
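A rough pre-check along those lines might look like this (a sketch only: the 5 bytes per pixel is a conservative allowance for GD's overhead, and the memory_limit conversion assumes a value such as "128M"):

// Sketch: estimate memory for source + target before resizing.
list($srcWidth, $srcHeight) = getimagesize($src_file);
$mem_needed = ($srcWidth * $srcHeight + $trgt_width * $trgt_height) * 5;

// Crude memory_limit conversion; assumes an "M" suffix like "128M".
$mem_limit = (int) ini_get('memory_limit') * 1024 * 1024;

if ($mem_needed > $mem_limit - memory_get_usage()) {
    die('Not enough free memory to resize this image safely.');
}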
In your case, though, I would suspect that the image does not get cropped at all, probably because the copy operation fails since $src_file does not exist.