PHP's GD extension (GdImage) processes an image through functions like imagecreatefromjpeg() and imagecreatefrompng().
Given an image of type jpeg or png with known dimensions:
- width in px,
- height in px,
- depth in bits (assume max is 32 bits, but could be more?)
and knowing:
- memory_limit = xxx; Maximum amount of memory a script may consume (in php.ini)
- memory_get_usage() - the amount of memory currently allocated to PHP
how can we determine whether we have enough memory to process the image?
Obviously, the bytes occupied will be width * height * depth/8.
But what is the available amount of memory?
Is it correct to write:
available_memory_in_bytes = memory_limit_in_bytes - memory_get_usage();
Can we use all of it, down to the last byte, for imagecreatefrompng?
The PHP version is a reasonably recent 8.1.
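In code, the check I have in mind would look roughly like this (a sketch; memory_limit_in_bytes() is a hypothetical helper that converts the php.ini shorthand string to bytes):
$filename = 'image.png';
list($width, $height) = getimagesize($filename);
$depth = 32; // assumed worst case, in bits per pixel

// The estimate from above: width * height * depth/8
$bytes_needed = $width * $height * ($depth / 8);

// memory_limit_in_bytes() is hypothetical; ini_get('memory_limit')
// returns shorthand like "128M", which has to be converted first
$bytes_available = memory_limit_in_bytes() - memory_get_usage();

if ($bytes_needed <= $bytes_available) {
    $img = imagecreatefrompng($filename);
}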
Related
Opening a JPEG image using imagecreatefromjpeg can easily lead to fatal errors, because the memory needed exceeds the memory_limit.
A .jpg file that is less than 100 KB in size can easily exceed 2000x2000 pixels - which will take about 20-25 MB of memory when opened. "The same" 2000x2000 px image may take up 5 MB on disk at a different compression level.
So I obviously cannot use the filesize to determine if it can be opened safely.
How can I determine if a file will fit in memory before opening it, so I can avoid fatal errors?
According to several sources, the memory needed is up to 5 bytes per pixel, depending on a few different factors such as bit depth. My own tests confirm this to be roughly true.
On top of that there is some overhead that needs to be accounted for.
But by examining the image dimensions - which can easily be done without loading the image - we can roughly estimate the memory needed and compare it with (an estimate of) the memory available like this:
$filename = 'black.jpg';
//Get image dimensions
$info = getimagesize($filename);
//Each pixel needs 5 bytes, and there will obviously be some overhead - in a
//real implementation I'd probably reserve at least 10 B/px just in case.
$mem_needed = $info[0] * $info[1] * 6;
//Find out (roughly!) how much is available
// - this can easily be refined, but that's not really the point here
$mem_total = intval(str_replace(array('G', 'M', 'K'), array('000000000', '000000', '000'), ini_get('memory_limit')));
//Find current usage - AFAIK this is _not_ directly related to
//the memory_limit... but it's the best we have!
$mem_available = $mem_total - memory_get_usage();
if ($mem_needed > $mem_available) {
    die('That image is too large!');
}
//Do your thing
$img = imagecreatefromjpeg($filename);
This is only tested superficially, so I'd suggest further testing with a lot of different images and using these functions to check that the calculations are fairly correct in your specific environment:
//Set some low limit to make sure you will run out
ini_set('memory_limit', '10M');
//Use this to check the peak memory at different points during execution
$mem_1 = memory_get_peak_usage(true);
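The str_replace conversion above treats the shorthand suffixes as decimal multiples and does not handle memory_limit = -1 (unlimited). A slightly more careful conversion, following the PHP manual's shorthand-byte rules (binary multiples), might look like this - a sketch, with a helper name of my own choosing:
// Convert a php.ini shorthand value such as "128M" or "-1" to bytes.
// Shorthand suffixes are binary multiples (K = 1024), per the PHP manual.
function iniValueToBytes(string $value): int
{
    $value = trim($value);
    if ($value === '-1') {
        return PHP_INT_MAX; // -1 means "no limit"
    }
    $number = (int)$value;
    switch (strtoupper(substr($value, -1))) {
        case 'G': return $number * 1024 ** 3;
        case 'M': return $number * 1024 ** 2;
        case 'K': return $number * 1024;
        default:  return $number;
    }
}

$mem_total = iniValueToBytes(ini_get('memory_limit'));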
The image is less than 1 MB, but its dimensions are roughly 5500x3600. I am trying to resize the image down to something less than 500x500. My code is pretty simple.
$image = imagecreatefromjpeg("upload/1.jpg");
$width = imagesx($image);
$height = imagesy($image);
$pWidth = 500;
$pHeight = 334;
$image_p = imagecreatetruecolor($pWidth, $pHeight);
setTransparency($image, $image_p, $ext); // user-defined helper; $ext holds the file extension
imagecopyresampled($image_p, $image, 0, 0, 0, 0, $pWidth, $pHeight, $width, $height);
I found out that to process this image, imagecreatefromjpeg() uses 100 MB, according to memory_get_usage().
Is there a better way to do imagecreatefromjpeg()? Is there a workaround or a different function that uses less memory?
I did ask my server admins to increase the memory, but I doubt they will increase it to 100 MB or more. I am considering limiting the dimensions of the images users can upload, but not before exhausting all my options, as users will most likely upload photos they took themselves.
By the way, the following is the image I used, which takes 100 MB of memory.
What @dev-null-dweller says is the correct answer:
With 5500x3600 you will need at least 5500 * 3600 * 4 bytes in memory, which is roughly 80 MB. For large pictures the Imagick extension might have better performance than GD.
There is no way to "improve" on that, because that is the amount of memory needed to process the image. JPEG is a compressed format, so its file size is irrelevant; it's the actual dimensions that count. The only way to deal with such an image inside GD is to increase the memory limit.
You may be able to do better using a library or command-line client like ImageMagick, if you have access to that - when you run ImageMagick from the command line, its memory usage won't count toward PHP's memory limit. Whether you can do this is something you'd need to find out from your web host or server admin.
Another idea that comes to mind is using an image resizing API that you send the image to. That would take the load completely off your server. This question has some pointers: https://stackoverflow.com/questions/5277571/is-there-a-cdn-which-provides-on-demand-image-resizing-cropping-sharpening-et
ImageMagick, on the command-line at least, is able to use a feature of libjpeg called "Shrink on Load" to avoid unnecessarily loading the whole image if you are only planning on scaling it down.
If I resize your image above from its current 3412x2275 up to the size you are actually dealing with, 5500x3600, like this
convert guys.jpg -resize 5500x3600 guys.jpg
I can now do some tests... first, a simple resize
/usr/bin/time -l convert guys.jpg -resize 500x334 small.jpg
0.85 real 0.79 user 0.05 sys
178245632 maximum resident set size <--- 178 MB
0 average shared memory size
0 average unshared data size
0 average unshared stack size
44048 page reclaims
0 page faults
0 swaps
You can see it uses a peak of 178 MB on my Mac.
If I now use the "Shrink on Load" feature I mentioned:
/usr/bin/time -l convert -define jpeg:size=500x334 guys.jpg -resize 500x334 small.jpg
0.06 real 0.04 user 0.00 sys
8450048 maximum resident set size <--- Only 8 MB
0 average shared memory size
0 average unshared data size
0 average unshared stack size
2381 page reclaims
33 page faults
0 swaps
You can see it only takes 8 MB now. It's faster too!
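For completeness: if you are going through PHP's Imagick extension rather than the command line, the same hint can be passed with setOption() before the file is read. A sketch, untested on my side:
// Sketch: pass the libjpeg "shrink on load" hint through Imagick.
// The jpeg:size option must be set before readImage() is called.
$img = new Imagick();
$img->setOption('jpeg:size', '500x334');
$img->readImage('guys.jpg');
$img->thumbnailImage(500, 334, true); // true = preserve the aspect ratio
$img->writeImage('small.jpg');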
I've been using vipsthumbnail for years. I use it to resize 700-megapixel images (an enormous 16,384 x 42,731 - about 2.6 GB if loaded into PHP) to a viewable size. It takes about 10 seconds to generate a small preview 3000 px tall.
https://www.libvips.org/API/current/Using-vipsthumbnail.html
By specifying one dimension, it keeps the original aspect ratio and limits the x or y dimension, whichever is greater:
$exec_command = sprintf('vipsthumbnail --size 3000 "%s" -o "%s"', $source, $destination);
exec( $exec_command, $output, $return_var );
if ($return_var !== 0) {
    // Handle error here
}
I was still using PHP to resize smaller images, but my system recently generated a secondary image that was 15,011 x 15,011 (1.075 GB uncompressed). My PHP settings allow for 1 GB of RAM, and it was crashing! I had increased PHP's memory limit over time to deal with these images; I finally converted this function to use vipsthumbnail as well. These smaller images only take about 100 ms each to generate. I should have done this a long time ago.
$exec_command = sprintf('vipsthumbnail --size 150 "%s" -o "%s"', $src, $dst);
exec( $exec_command, $output, $return_var );
I allow users to upload images. However, I want to keep JPEG quality at no more than 90%. What I plan to do is detect the current quality:
- If it is less than 90%, do nothing.
- If it is more than 90%, use ImageMagick to recompress the image at 90%.
Is it possible to do that? I prefer PHP, but any language will help.
paddy is correct that this setting is not always stored in the JPEG file. If it is, then you can use identify from ImageMagick to read the quality. For example:
$ identify -format '%Q' tornado_ok.jpg
93%
Update: Based on the answer to the question https://superuser.com/questions/62730/how-to-find-the-jpg-quality, I found out that apparently the identify command can still determine the quality by reverse-engineering the quantization tables, even if all the image's EXIF / other metadata is lost. By the way, the title of your question as it stands now is a possible duplicate of the question I linked to.

But to me your question has merit on its own, because in your question's text you explain what you are trying to do, which is more than simply detecting JPEG quality. Nevertheless, you should perhaps update the title if you want to reflect that you are trying to solve a more specific problem than just reading JPEG image quality.
Unless you are archiving original images, even 90% is excessive for web use. 75% used to be the default in the old days (degradation was visible only under close inspection of side-by-side images), and now, in the days of high bandwidth, 85% is a very high quality option. The 5% quality difference between 90% and 85% is virtually invisible, but will typically save you over 30% in file size. The JPEG algorithm is designed to begin by eliminating information that is invisible to human perception in its first compression stages (above 80% or so).
Update/note: The compression quality settings I am talking about are from tests with libjpeg, a very widely used JPEG library. Photoshop's compression percentages and other software's quality settings are all independent and do not necessarily mean the same thing as the settings of libjpeg.
paddy's idea of using image height and image width to calculate an acceptable file size is reasonable:
You can get the image height/width like this:
list($originalWidth, $originalHeight) = getimagesize($imageFile);
My own high-quality photos posted online, like this one: http://ksathletics.com/2013/wsumbb/nw.jpg
are typically saved at a ratio of about 200 KB per megapixel.
So, for example, you can multiply width by height and divide by 1,000,000 to get the megapixels in the image. Divide the file size by 1024 to get KB. Then divide the resulting KB by the megapixels. If the result is under 200 (or whatever threshold you decide upon), you don't need to re-compress it. Otherwise, you can re-compress it at a quality of 85% (or whatever quality you decide on).
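Put together, the heuristic might look like this in PHP (a sketch; the 200 KB-per-megapixel threshold and the 85% target are the example values from above, not universal constants):
$filename = 'photo.jpg';
list($width, $height) = getimagesize($filename);
$megapixels = ($width * $height) / 1000000;
$kilobytes = filesize($filename) / 1024;

// Re-compress only if the file is heavier than ~200 KB per megapixel
if ($kilobytes / $megapixels > 200) {
    $img = new Imagick($filename);
    $img->setImageCompressionQuality(85);
    $img->writeImage($filename);
}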
Since the OP stated he prefers PHP, I offer the following:
$img = new Imagick($filename);
$quality = $img->getImageCompressionQuality();
What's up? I was facing the same problem with an app I'm developing. My problem is that I extract several images from a random site, each item has several images, and I would like to show a single image for each item, bringing the user the best-quality image.
I came up with this idea: it's pretty simple, and it will work for any language and any type of compression:
//Parameters you will need to retrieve from the image
list($width, $height) = getimagesize($filename);
$filesize = filesize($filename);
//Quality estimate for your image
$quality = 101 - (($width * $height * 3) / $filesize);
I ran this algorithm against the http://fotoforensics.com/tutorial-estq.php samples mentioned above, and here are the results:
Filename           Width  Height  Pixels  BitmapBytes  FileBytes  Quality
estq-baseline.png  400    300     120000  360000       163250     98.79
estq-90.jpg        400    300     120000  360000       34839      90.67
estq-80.jpg        400    300     120000  360000       24460      86.28
estq-70.jpg        400    300     120000  360000       19882      82.89
estq-25.jpg        400    300     120000  360000       10300      66.05
The basic idea behind this algorithm is to compare the size an image would reach if written as an uncompressed bitmap (3 bytes per pixel: one each for R, G and B) with the size the image currently occupies. The smaller the file, the higher the compression, independently of the compression method used (be it JPG, PNG or whatever), and that leads to a bigger or smaller gap between the uncompressed and compressed sizes.
It is also important to mention that this is a mathematical solution for comparison purposes. This method will not return the actual quality setting of the image; it answers with the percentage distance between the uncompressed and compressed sizes!
If you need more details, you can send me an email: rafaelkarst#gmail.com
You cannot guarantee that the quality setting is stored in the JPEG's metadata. This is an encoder setting, not an image attribute.
Read more here about estimating JPEG quality
It might make more sense to simply define a maximum file size. At the end of the day, restricting image quality is all about saving bandwidth. So setting a ratio between image dimensions and file size is more appropriate.
If your JPEG was created using a straight scaling of the standard image quantization tables, with the 0-100 quality based on the Independent JPEG Group's formula, then, assuming you have the luminance quantization table in an array called quantization (such as Python's PIL module provides in image.quantization[0]), the original value can be obtained via:
if quantization[58] <= 100:
    originalQuality = int(100 - quantization[58] / 2)
else:
    originalQuality = int(5000.0 / 2.5 / quantization[15])
Basically, the default luminance quantization value at position #15 is 40 and at position #58 is 100, so these make handy positions from which to extract the result. IJG scales values below 50 via 5000 / Q and above 50 via 200 - 2 * Q. If the quality setting is less than 8, this won't give decent results (quantization[15] will have saturated at 255); in that case, perhaps use quantization table position #5.
See https://www.rfc-editor.org/rfc/rfc2435#section-4.2.
For those who are using GraphicsMagick instead of ImageMagick, you can get the JPEG quality with the following command:
gm identify -format '%[JPEG-Quality]' path_to/image_file.jpg
and according to the documentation (http://www.graphicsmagick.org/GraphicsMagick.html#details-format):
Please note that JPEG has no notion of "quality" and that the quality metric used by, and estimated by the software is based on the quality metric established by IJG JPEG 6b. Other encoders (e.g. that used by Adobe Photoshop) use different encoding metrics.
Here is a PHP function that tries all available methods of getting quality (that I know of):
/* Try to detect the quality of a JPEG.
   If not possible, nothing is returned (null). Otherwise the quality is returned (int).
*/
function detectQualityOfJpg($filename)
{
    // Try the Imagick extension
    if (extension_loaded('imagick') && class_exists('Imagick')) {
        $img = new Imagick($filename);
        // The required method is available as of PECL imagick v2.2.2
        if (method_exists($img, 'getImageCompressionQuality')) {
            return $img->getImageCompressionQuality();
        }
    }
    if (function_exists('shell_exec')) {
        // Escape the filename so it is safe to pass to a shell
        $arg = escapeshellarg($filename);
        // Try ImageMagick
        $quality = shell_exec("identify -format '%Q' " . $arg);
        if ($quality) {
            return intval($quality);
        }
        // Try GraphicsMagick
        $quality = shell_exec("gm identify -format '%Q' " . $arg);
        if ($quality) {
            return intval($quality);
        }
    }
}
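Usage is then straightforward, for example:
$quality = detectQualityOfJpg('photo.jpg');
if ($quality === null) {
    echo "Quality could not be detected\n";
} else {
    echo "Estimated quality: $quality\n";
}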
I have narrowed my problem down to the fact that the size of the file matters; I just don't know when a file becomes too big. How can I catch the error? A 1.1 MB file causes imagecreatetruecolor to bail, but it chugs along just fine when processing a 688 KB file.
Thanks
From what I can tell from the PHP/GD documentation, this function creates a 24-bit RGB image with black as the default color. The width and height it takes as arguments are ints for the pixel dimensions. Therefore, to determine whether the raw image (before compression) is to blame, you can calculate its size as follows:
1536 * 1962 = 3,013,632 pixels
3,013,632 * 24 = 72,327,168 bits
72,327,168 / 8 = 9,040,896 bytes
1024 * 768 = 786,432 pixels
786,432 * 24 = 18,874,368 bits
18,874,368 / 8 = 2,359,296 bytes
It seems unusual to me that this function would cause problems at a size of 1.1 MB, but perhaps you are referring to a compressed image such as a JPEG, where the actual raw size could be much, much greater. (As you can see, a "small" 1024x768 image is still well over 1.1 MB raw.)
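Following that arithmetic, you could estimate the raw size from the dimensions before allocating the canvas - a sketch, with an illustrative 10 MB threshold:
// Estimate the uncompressed 24-bit size before calling imagecreatetruecolor()
list($width, $height) = getimagesize($filename);
$rawBytes = ($width * $height * 24) / 8; // 3 bytes per pixel

if ($rawBytes > 10 * 1024 * 1024) {
    die('Image too large to process safely');
}
$img = imagecreatetruecolor($width, $height);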
The function will raise an error if you try to create an image that is too big. Just suppress the error and handle the failure yourself. For example:
$img = #imagecreatetruecolor(100000, 100000);
if ($img === false)
{
    // Handle error
}
What are the maximum width and height that imagecopyresampled can handle? My code works for smaller images in terms of width and height, but for larger images it disregards the coordinates, meaning the cropping starts from the upper-left corner of the image.
Is there a workaround? Here's a portion:
$trgt_width = 500;
$trgt_height = 400;
// $src_file, $trgt_file, $thumb_file, $x, $y, $width and $height are set earlier in the script
if (copy($src_file, $trgt_file)):
    $src_image = imagecreatefromjpeg($src_file);
    $trgt_image = imagecreatetruecolor($trgt_width, $trgt_height);
    imagecopyresampled($trgt_image, $src_image, 0, 0, $x, $y, $trgt_width, $trgt_height, $width, $height);
    imagejpeg($trgt_image, $thumb_file, 75);
endif;
Thanks.
It depends on the maximum amount of RAM your scripts may occupy. This is usually set on your server by the administrator; the setting is called memory_limit.
You can find it out using phpinfo() and searching for "memory_limit".
A rough calculation of the memory needed to resize an image:
width in pixels x height in pixels x 3
3 bytes per pixel: one for each channel of a true-color image (red, green and blue).
So, an image 1000 x 1000 pixels in size will take up at least 3 MB of memory, and probably more during the resize process, because the function has to keep both the large and the resized versions in memory at the same time.
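As a rough check along those lines (both copies have to fit at once), a sketch reusing the variable names from the question:
// Source and target bitmaps are in memory simultaneously during the resize
$src_bytes = $width * $height * 3;
$trgt_bytes = $trgt_width * $trgt_height * 3;
$bytes_needed = $src_bytes + $trgt_bytes;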
In your case, though, I would suspect that the image does not get cropped at all, probably because the copy operation fails since $src_file does not exist.