How to prevent image bombs with ImageMagick? - php
I currently use the Imagick library in PHP for ImageMagick's resizing functionality. I've just learned about decompression bombs and how ImageMagick is vulnerable to them.
I have checked how we can ping the image and verify its dimensions without actually loading it into memory/disk. It's also safer to set memory and disk limits for ImageMagick so it doesn't just write a huge file to disk.
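For reference, a minimal sketch of that ping check (the 4000-pixel cap and the file name are just illustrative values):

// A ping only reads the image header, so the pixel data is never decoded.
$probe = new Imagick();
$probe->pingImage('image.png');
$geometry = $probe->getImageGeometry(); // ['width' => ..., 'height' => ...]
if ($geometry['width'] > 4000 || $geometry['height'] > 4000) {
    throw new RuntimeException('Image dimensions exceed the allowed maximum.');
}
$probe->clear();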
I've read that I can do that with setResourceLimit().
http://php.net/manual/en/imagick.setresourcelimit.php
IMagick::setResourceLimit(IMagick::RESOURCETYPE_MEMORY , 100);
IMagick::setResourceLimit(IMagick::RESOURCETYPE_DISK , 100);
$thumb = new Imagick('image.png');
$thumb->resizeImage(320,240,Imagick::FILTER_LANCZOS,1);
However, after setting the disk and memory limits, if an image does hit them, all I get is a segmentation fault; no exception is thrown. This makes it impossible for me to handle it properly.
Update:
Here are the package versions I'm using:
dpkg -l | grep magick
ii imagemagick-common 8:6.6.9.7-5ubuntu3.3 image manipulation programs -- infrastructure
ii libmagickcore4 8:6.6.9.7-5ubuntu3.3 low-level image manipulation library
ii libmagickwand4 8:6.6.9.7-5ubuntu3.3 image manipulation library
ii php5-imagick 3.1.0~rc1-1 ImageMagick module for php5
Setting the 'Resource Area' limit only sets the size at which images are no longer held in memory and are instead paged to disk. If you want to use that setting to actually restrict the maximum image size that can be opened, you also need to set the 'Resource Disk' limit.
The code below correctly gives a memory allocation error for the image bombs taken from here.
try {
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_AREA, 2000 * 2000);
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_DISK, 2000 * 2000);
    $imagick = new Imagick("./picture-100M-6000x6000.png");
    $imagick->modulateImage(100, 50, 120);
    $imagick->writeImage("./output.png");
    echo "Complete";
}
catch (\Exception $e) {
    echo "Exception: " . $e->getMessage() . "\n";
}
Output is:
Exception: Memory allocation failed `./picture-100M-6000x6000.png' # error/png.c/MagickPNGErrorHandler/1630
If you want to set the width and height resources, and have a version of ImageMagick >= 6.9.0-1, you should be able to use the constant values directly: WidthResource = 9, HeightResource = 10.
//Set max image width of 2000
Imagick::setResourceLimit(9, 2000);
//Set max image height of 1000
Imagick::setResourceLimit(10, 1000);
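If you want to guard against older library versions before relying on those numeric values, a rough sketch (the version-string parsing is an assumption; check the exact format your build reports):

// The width/height resource types (9 and 10) only take effect with
// ImageMagick >= 6.9.0-1, so check the linked library version first.
$version = Imagick::getVersion(); // ['versionNumber' => int, 'versionString' => string]
if (preg_match('/ImageMagick (\d+\.\d+\.\d+-\d+)/', $version['versionString'], $m)
        && version_compare(str_replace('-', '.', $m[1]), '6.9.0.1', '>=')) {
    Imagick::setResourceLimit(9, 2000);  // width
    Imagick::setResourceLimit(10, 1000); // height
}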
These don't have to be set programmatically, you can set them through the policy.xml file installed with ImageMagick. ImageMagick reads that file and uses those settings if none are set in a program - which may be a more convenient way of setting them, as you can change them per machine.
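For instance, a policy.xml fragment along these lines (the values are illustrative, and older releases such as the 6.6.x series may not understand every unit suffix, so check against your installed build):

<policymap>
  <policy domain="resource" name="width" value="8KP"/>
  <policy domain="resource" name="height" value="8KP"/>
  <policy domain="resource" name="area" value="100MP"/>
  <policy domain="resource" name="memory" value="256MiB"/>
  <policy domain="resource" name="disk" value="1GiB"/>
</policymap>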
This makes it impossible for me to handle it properly.
It makes it impossible for you to handle it in the same process. You can handle it just fine by running the image processing in a background task.
Personally I think anyone who uses Imagick in a server that is accessed directly by web browsers is nuts. It is far safer to run it as a background task (managed by something like http://supervisord.org/) and communicate with that background task via a queue of jobs that need to be processed.
Not only does that solve the 'bad images can bring down my website' problem, it also makes it far easier to monitor resource usage, or shift the image processing to a machine with a faster CPU than a web-front end server needs.
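A minimal sketch of that idea, using a plain spool directory as the job queue (the directory layout, file naming and output path here are hypothetical; a real setup would use a proper queue, with the worker kept alive by supervisord):

// worker.php - run as a background task, never inside a web request.
// If a bad image crashes this process, supervisord restarts it and the
// website itself is unaffected.
while (true) {
    foreach (glob('/var/spool/imagejobs/*.job') as $jobFile) {
        $imagePath = trim(file_get_contents($jobFile));
        try {
            $im = new Imagick($imagePath);
            $im->resizeImage(320, 240, Imagick::FILTER_LANCZOS, 1);
            $im->writeImage($imagePath . '.thumb.png');
        } catch (ImagickException $e) {
            error_log("Job failed for {$imagePath}: " . $e->getMessage());
        }
        unlink($jobFile);
    }
    sleep(1);
}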
Source - I'm the maintainer of the Imagick extension and I recently added this to the Imagick readme:
Security
The PHP extension Imagick works by calling the ImageMagick library.
Although the ImageMagick developers take good care in avoiding bugs it
is inevitable that some bugs will be present in the code. ImageMagick
also uses a lot of third party libraries to open, read and manipulate
files. The writers of these libraries also take care when writing
their code. However everyone makes mistakes and there will inevitably
be some bugs present.
Because ImageMagick is used to process images it is feasibly possible
for hackers to create images that contain invalid data to attempt to
exploit these bugs. Because of this we recommend the following:
1) Do not run Imagick in a server that is directly accessible from
outside your network. It is better to either use it as a background
task using something like SupervisorD or to run it in a separate
server that is not directly accessible on the internet.
Doing this will make it difficult for hackers to exploit a bug, even
if one should exist in the libraries that ImageMagick is using.
2) Run it as a very low privileged process. As much as possible the
files and system resources accessible to the PHP script that Imagick
is being called from should be locked down.
3) Check the result of the image processing is a valid image file
before displaying it to the user. In the extremely unlikely event that
a hacker is able to pipe arbitrary files to the output of Imagick,
checking that it is an image file, and not the source code of your
application that is being sent, is a sensible precaution.
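As a rough sketch of that third point (the output path is illustrative, and getimagesize() is just one way to do the check):

// Confirm the processed file really is a PNG before serving it to users.
$info = @getimagesize('./output.png');
if ($info === false || $info[2] !== IMAGETYPE_PNG) {
    unlink('./output.png'); // not a valid PNG - refuse to serve it
}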
Starting with ImageMagick-6.9.0-1, "width" and "height" resource limits were added. From the command line, use "-limit width 32000", etc. ImageMagick's PNG decoder will bail out without decompressing the image if the width or height exceeds the specified limit.
The "area" resource is available in earlier versions of ImageMagick (and Imagick); however the PNG decoder does not reject images based on the "area" limit (see Danack's comment).
In versions of ImageMagick earlier than 6.9.0, the width and height limits come from libpng and depend upon the libpng version. Current libpng versions (1.0.16 and later, 1.2.6 and later, 1.5.22 and later, and 1.6.17 and later) impose 1,000,000-column and 1,000,000-row limits. In versions 1.2.0 through 1.2.5, 1.5.0 through 1.5.23, and 1.6.0 through 1.6.16, the limits were 2.7 billion rows and columns by default.
Look for RESOURCETYPE_AREA in Imagick (I don't see _WIDTH or _HEIGHT in the manual that you referenced, so either Imagick or its manual needs to be updated). So try
Imagick::setResourceLimit(Imagick::RESOURCETYPE_AREA, 100 * 1000 * 1000); // 100 megapixels
to set a 100MegaPixel limit. Hopefully, some future version of Imagick will support RESOURCETYPE_WIDTH and RESOURCETYPE_HEIGHT, to provide a better solution to the decompression-bomb vulnerability. See Danack's answer about setting these with the current version of IMagick.
all I get is a segmentation fault error, no exceptions are thrown
I'm guessing your segmentation fault comes from resources being set too low for ImageMagick (and related delegates) to operate. The resource values are in bytes, not megabytes.
Imagick does throw an exception if a resource limit is reached. Usually something like...
"cache resource exhausted"
Decompression bombs, or zip bombs, are extremely difficult to identify. What you're doing by pinging the image and setting resource limits is the correct course of action. I would roughly outline a solution as...
// Define limits in application settings, or bootstrap (not dynamically!)
define('MY_MAGICK_MEMORY_LIMIT', 5e+8);
// Repeat for AREA, DISK, & etc.

// In application
$image = new Imagick(); // allocate IM structures

// Set limits on instance
$image->setResourceLimit(Imagick::RESOURCETYPE_MEMORY, MY_MAGICK_MEMORY_LIMIT);
// Repeat for RESOURCETYPE_AREA, RESOURCETYPE_DISK, & etc.

$filename = 'input.png';
if ($image->pingImage($filename)) {
    // Validate that this image is what you're expecting
    // ...
    try {
        $image->readImage($filename); // <-- Bomb will explode here
        $image->resizeImage(320, 240, Imagick::FILTER_LANCZOS, 1);
    } catch (ImagickException $err) {
        // Handle error
    }
}
unset($image);
If you don't trust the decompression step, you can use Imagick::getImageCompression during ping validation to inspect which compression an image uses (see the sketch after the enum below). The compression type is an integer that maps to the following enum...
typedef enum
{
UndefinedCompression,
B44ACompression,
B44Compression,
BZipCompression,
DXT1Compression,
DXT3Compression,
DXT5Compression,
FaxCompression,
Group4Compression,
JBIG1Compression,
JBIG2Compression,
JPEG2000Compression,
JPEGCompression,
LosslessJPEGCompression,
LZMACompression,
LZWCompression,
NoCompression,
PizCompression,
Pxr24Compression,
RLECompression,
ZipCompression,
ZipSCompression
} CompressionType;
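A rough sketch of such a check (the whitelist here is an assumption to be tuned per use case, and it's worth verifying that a ping alone populates the compression field for the formats you accept):

$image = new Imagick();
$image->pingImage('input.png');
$allowed = [Imagick::COMPRESSION_ZIP, Imagick::COMPRESSION_NO];
if (!in_array($image->getImageCompression(), $allowed, true)) {
    throw new RuntimeException('Unexpected compression type; refusing to decode.');
}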
MagickStudio (written in Perl) offers a good starting point for default resource limits and how they are checked against uploaded images (search for Ping).
Related
GD fails to create JPG
I have an issue with GD not creating a new JPG file; it just fails with no error messages and no indication as to what is happening. This is not new code, it has been in place and working for the past six years, but all of a sudden with larger images it has started failing.

As background, this is running on an old server (to be switched off and moved to a new site on PHP 8 in a couple of months' time) that has PHP 5.3.3 with GD version 2.0.34. The code creates thumbnails from the high-res image (around 24-30MB), outputting a series of thumbnails from 150px wide to 1024px wide. It fails on all of them. I have increased the PHP memory limit on the page to 512MB and set the gd.jpeg_ignore_warning flag for corrupt JPGs. But every time with these files, this line:

$src_img = @imagecreatefromjpeg($file_path);

just returns FALSE, but never falls over with an error. The file is definitely there (run the same code with a file of the same name that is <20MB and it works fine) and there is plenty of disc space/memory available for a 60MB file to be processed in memory, so I don't see that that is the issue.

A search of Google, StackOverflow and several other sites has not produced any similar issues. Can anyone offer any thoughts as to what the issue/solution is? I have been looking at this for two days now, and we need to resolve it - simply using smaller JPG files isn't an option for this.
Thanks to @Lessmore's answer above, GD was reporting an invalid JPG file, so a further search revealed this answer on StackOverflow, which solved the problem by reading the JPG from a string rather than a file: Reading an invalid JPG with GD. Thanks all, as ever!
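The string-based workaround looks roughly like this (a sketch; $file_path is the variable from the question, and whether it helps depends on how the file is malformed):

// Decoding from a string sometimes succeeds where imagecreatefromjpeg()
// rejects a slightly malformed file.
$data = file_get_contents($file_path);
$src_img = ($data !== false) ? imagecreatefromstring($data) : false;
if ($src_img === false) {
    // Still not decodable - handle the failure explicitly.
}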
How to override "Suggested maximum size is 2500 pixels" on media upload page - wordpress
My client needs to upload high-res images for her online press kit. She is getting this error: "Post-processing of the image failed likely because the server is busy or does not have enough resources. Uploading a smaller image may help. Suggested maximum size is 2500 pixels." The images she wants to upload are about 2.5MB in size and are 4272 x 2848 at 72dpi. If I crop the images to 2500 x 1667 at 72dpi, they upload fine (meeting the 2500-pixel suggested max size). Is there a way to allow the larger images (4272 x 2848)? I am not sure which PHP setting is the issue - I think it might be memory size, but if it is, I am not sure where to change it or what amount to set it to in order to allow twice the pixel max size (going from 2500 to, say, 5000 pixels), or if that is even allowed. Any help would be appreciated. Here are my system details:

WordPress Version: 5.5.1
MySQL Version: 5.6.41
BootStrap Version: 3.3.1
PHP version 7.3.22 (Supports 64bit values)
PHP max input variables 1000
PHP time limit 30
PHP memory limit 256M
Max input time 60
Upload max filesize 256M
PHP post max size 260M

Thanks!
I ran into this problem. Disabling big_image_size_threshold didn't fix it. I think my issue is that after upgrading to PHP 7.4, the version of ImageMagick running on my host for PHP 7.4 is bad or something. I fixed the issue by using GD instead of ImageMagick. Just add this to functions.php:

add_filter('wp_image_editors', function($editors) {
    return ['WP_Image_Editor_GD', 'WP_Image_Editor_Imagick'];
});

One thing to note: if you don't have GD installed, WP will default back to using Imagick, so there's little risk in making this change. Though if it doesn't resolve the problem, you might want to check that GD is actually installed.
You can use the big_image_size_threshold filter to change or disable this behavior. https://developer.wordpress.org/reference/hooks/big_image_size_threshold/ If the original image width or height is above the threshold, it will be scaled down. The threshold is used as max width and max height. The scaled down image will be used as the largest available size, including the _wp_attached_file post meta value. Returning false from the filter callback will disable the scaling.
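For example, either of these in the theme's functions.php (the 5000 value is just an illustration):

// Disable the scaling entirely:
add_filter('big_image_size_threshold', '__return_false');

// ...or raise the threshold instead of disabling it:
add_filter('big_image_size_threshold', function ($threshold) {
    return 5000; // pixels
});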
The verified answer didn't work for me. The apache2 errors should always be in /var/log/apache2/error.log, so you can identify the problem easily. I fixed it myself with:

sudo apt-get install php-mbstring
sudo service apache2 restart

See this post; there are many solutions for this: Link
I encountered this same issue and struggled with it for nearly a whole day, trying answers from this page and from this WordPress topic: https://wordpress.org/support/topic/post-processing-of-the-image-failed-error/ Eventually, what resolved the problem for me was going to the WordPress Updates page and simply re-installing WordPress - everything has been perfectly fine since. I am running a network on WP version 5.7 with PHP 7.3.
Main case: if you are using any WordPress plugin for image compression, this can happen. In my case I was using WP Compress, and I got this error most often when uploading images. After I deactivated the plugin, the problem was fixed.
PHP Imagick Segmentation Fault
I have a cheap $5/mo server with 1GB of RAM processing some images for my website. Very rarely I will encounter a segmentation fault with PHP Imagick when writing a GIF image to disk. I have set a memory limit on the console command hoping PHP would catch the issue first and throw an exception that I can properly handle, but this did not work. Certain GIF images cause it to crash at this line of code:

echo 'Writing images to disk.' . PHP_EOL;
$file = $img->writeImages($imageOnDIsk, true); // crashes here
echo 'Finished writing images to disk.' . PHP_EOL;

The specific GIF is an adult-related GIF, so I am not sure if I can share it. Here are my server logs:

Setting memory limit.
Pulling URL: https://i.redd.it/gyvc8t9xdvb41.gif
Coalescing Images
Done Coalescing Images
Processing regular image. Not comic.
Deleting temp image on disk.
Writing images to disk.
Segmentation fault (core dumped)

It seems this image's width * height * bit depth * frames is using up more than 1GB of memory. I need to detect this beforehand.
A segmentation fault is a generic operating-system-level error. As an end user of PHP you neither have any control over it nor can you fix it. It is something that needs to be fixed at the php-src level (in this case, specifically in the imagick extension code). Simply put, this is potentially a bug in PHP itself, not in your code. What you can do is file a bug report with PHP and provide a backtrace along with it, so that one of the extension developers can look into fixing this.
gd-png cannot allocate image data
I have an old code base that makes extensive use of GD (no, switching to ImageMagick is not an option). It's been running for several years, through several versions. However, when I run it in my current development environment, I'm running into a mysterious "gd-png error: cannot allocate image data" error when calling imagecreatefrompng(). The PNG file I'm using is the same one I've been using, so I know it works. Current setup:

Ansible-provisioned Vagrant box
Ubuntu 14.04
PHP 5.5.9
GD 2.1.1
libPNG 1.2.50

PHP's memory limit at the time the script is run is 650M, though it's ultimately the kernel itself that ends up killing the script, and changing PHP's memory limit doesn't seem to have an effect. The image dimensions are 7200x6600 and it is about 500KiB on disk. This hasn't been a problem in my other environments and is only newly occurring in my development environment. Unfortunately, I don't have access to the other environments anymore to do a comparison, though the setup was similar in the last working one - Ubuntu 14.04, PHP 5.5, sufficient memory allocations. What could be happening in this setup that wasn't happening in my previous setups? How can I fix this?
I was browsing a bit through the PHP 5.5.9 source to try to find your specific error string "cannot allocate image data", but the closest I could find is "gd-png error: cannot allocate gdImage struct". The bundled version of GD for PHP 5.5.9 (all the way up to 7.0.0) is version 2.0.35, so it seems you're looking at a custom build(?). Bugging the PHP guys might not help here.

Looking at the GD source (2.1.2, can't seem to find 2.1.1), the only place this error occurs is: https://github.com/libgd/libgd/blob/master/src/gd_png.c#L435

image_data = (png_bytep) gdMalloc (rowbytes * height);
if (!image_data) {
    gd_error("gd-png error: cannot allocate image data\n");
    png_destroy_read_struct (&png_ptr, &info_ptr, NULL);
    if (im) {
        gdImageDestroy(im);
    }
    if (palette_allocated) {
        gdFree (palette);
    }
    return NULL;
}

Where gdMalloc is: https://github.com/libgd/libgd/blob/GD-2.1/src/gdhelpers.c#L73

void * gdMalloc (size_t size)
{
    return malloc (size);
}

I'm afraid this is as far as my detective work goes. As far as I can tell, the bundled 2.0.35 GD versions in PHP also just use malloc, so at first glance there's no real difference there. I also tried to find the equivalent piece of code in the bundled versions, and it seems to be here: https://github.com/php/php-src/blob/PHP-5.5.9/ext/gd/libgd/gd_png.c#L323

image_data = (png_bytep) safe_emalloc(rowbytes, height, 0);

I can't seem to find safe_emalloc, but it seems the old PHP bundled versions did use a different memory allocation here than the version your environment uses. Perhaps your best bet is to check with the GD devs. To avoid a merry goose chase, I think trying another environment is your best bet - after confirming they are indeed using a non-standard GD version.

After some more PHP source safari, it seems safe_emalloc is used throughout all extensions, so I guess (guess, mind you) that this is the preferred way to allocate memory for PHP extensions (looks like it). If your environment is in fact using an unaltered GD 2.1.1, it's likely it is ignoring any PHP memory settings/limits. You might be able to find some other way of specifying the ('stand-alone') GD memory limit?

Edit: Looking at the official libgd FAQ, near the bottom it states that the PHP extension should indeed respect the memory limit, and it specifies that an 8000x8000-pixel image would take about 256MB, so your 650MB limit should really be enough (unless you're working with multiple copies in the same run?), and the fact that it's not working corroborates that something fishy is going on with the memory allocation.
If I were you I would do the following:

Send messages to the php-devel list and see if it is a known issue. You'll also want to search their bug tracker: http://php.net/mailing-lists.php and https://bugs.php.net/

Use GD command-line utilities to see if you can process the same image with the same version from the command line. This is to determine whether it's a problem with GD and the image, or an issue with the PHP-gd lib, or some combination.

1.a Create a PHP CLI program to resize the image and see if it works.

Figure out exactly what happens when you call the PHP function in the underlying (probably C) code. This will involve downloading the source to the php-gd module and looking through it.

Code a minimal C application that does the same thing as the underlying PHP-gd library and see if you can reproduce the error.

Something in there should tell you exactly what is going on, though it will take a bit of work. You should be able to get all the tools/sources for your environment. The beauty of open source, right?

The other option would be to try different versions of these applications in your environment and see if you find one that works. This is the "shotgun" approach and not the best, IMO.
From my experience: try a file with a different row size. I had issues with another image library that did not read images wider than 3000px per row; it was otherwise not restricted in absolute size. You have quite an image there: if this is RGBA in memory, you end up with about 180M of uncompressed image data, about 140M as RGB, and still about 45M as 8-bit. Are you sure your process limits will not be exceeded when this is loaded?
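A quick back-of-the-envelope check of those numbers for a 7200x6600 image:

$pixels = 7200 * 6600;                            // 47,520,000 pixels
printf("RGBA : %d MiB\n", $pixels * 4 / 1048576); // ~181 MiB
printf("RGB  : %d MiB\n", $pixels * 3 / 1048576); // ~136 MiB
printf("8-bit: %d MiB\n", $pixels * 1 / 1048576); // ~45 MiB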
You used the same image with the same libraries, which means libgd and libpng have no restriction on the memory size (or at least not one for the size you're using), so the problem may be PHP. But you are sure the memory limit is 650M (if your PHP installation is using Suhosin, you may want to check suhosin.memory_limit).

You say the kernel is killing the script (though I can't see why you say so; is there a dmesg log about it?), but the kernel kills a process only when it is out of memory.

One problem could be memory fragmentation / contiguous allocation. Nowadays it should be more a kernel-space problem than a user-space one, but the size your script is allocating is not an everyday size. Fragmentation could be a problem if your script is running on a server with a long uptime and a lot of processes allocating and freeing memory; if it is a virtual machine that you just started up and the script fails at the first launch, then the problem isn't fragmentation.

You can do a simple test, command and code below. It uses the gcc compiler to create (in the directory where you launch the command) an a.out executable which allocates and sets to 0 a memory chunk of size 191 x 1024 x 1024 (about ~190M):

echo '#include <stdlib.h>\n#include <string.h>\n#define SIZ 191*1024*1024\nint main() {void *p=malloc(SIZ);memset(p,0,SIZ);return (p==NULL?1:0);}' \
  | gcc -O2 -xc -

Run the executable with:

./a.out

If the problem is the system memory, the kernel should kill the executable (I'm not 100% sure; the kernel has some policy about which process to kill when there's no memory, to be checked). You should see the kill message, though, with dmesg perhaps. (I think this shouldn't happen; instead the malloc should just fail.)

If the malloc succeeds, the executable can allocate enough memory and should return 0; if it fails, it should return 1. To verify the return code just use:

./a.out
echo $?

(don't execute any other command in the middle)

If the malloc fails, you probably just need to add physical or virtual memory to the system. Keep in mind that ~190M is just the memory for one uncompressed image; depending on how PHP, libgd and libpng are working, the memory for the whole process may be even twice that (or more).

I did a simple test and profiled the memory usage; the peak with a 7600x2200 image (1.5M on disk) on my system is about ~342M. Here's the test:

<?php
$im = imagecreatefrompng("input.png");
$string = "Sinker sucker socks pants";
$orange = imagecolorallocate($im, 220, 210, 60);
$px = (imagesx($im) - 7.5 * strlen($string)) / 2;
imagestring($im, 3, $px, 9, $string, $orange);
imagepng($im);
imagedestroy($im);

I tried xhprof at first but it returned values too low. So I tried a simple script, memusg:

memusg /usr/bin/php -f test.php > output.png

(the test works, by the way)
I met this problem yesterday and fixed it today. Yesterday's environment: php-7.0.12, libpng-1.6.26, libgd-2.1.1. This combination crashes when I resize a PNG image. After checking memory, I thought it might be a bug in the latest php or libpng. So I changed the environment to this: php-5.6.27, libpng-1.2.56, libgd-2.1.1. I changed php and libpng to mature versions which have been in use for a long time. After recompiling and re-installing these, it works well for PNG and JPEG.
PHP GD how to gracefully handle corrupt image
I have a gallery of JPEG images I'm trying to manage using PHP's GD. One particular JPEG image is giving me trouble. It identifies itself with dimensions of 32,768 x 1,024 pixels, yet the image is only 1.9MB on disk. It's handled fine by other image processing tools like Finder and Preview on my Mac, and by ImageMagick. Yet when my system calls imagecreatefromjpeg() on it, I get a classic "Allowed memory size exhausted" fatal error.

I believe the image is corrupted. It's supposed to be a 1024x1024 snapshot of a web page, created with wkhtmltoimage. Ordinarily the answer to this is to increase PHP's memory_limit, but mine is already big, at 256MB. Is there anything I can do to preemptively detect this type of image corruption and gracefully handle it? If I add an "@" before the imagecreatefromjpeg() call, PHP merely dies with "500 Internal Server Error" instead. I can't use try/catch either, since it's a fatal error.

FWIW, here's how ImageMagick's identify tool describes it:

myimage.jpg JPEG 32768x1024 32768x1024+0+0 8-bit sRGB 1.948MB 0.000u 0:00.000

I suppose I could do if ($width == 32768) { ... }, but that's hackish. There could be a legitimate image with that width. Any other ideas?
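One way to pre-screen without decoding is to estimate the decoded size from the header, sketched here under the assumption that the header dimensions are readable even when the pixel data is corrupt (the 1.7 fudge factor and the 64MB cap are illustrative):

$info = getimagesize('myimage.jpg'); // reads the header only
if ($info !== false) {
    $channels = isset($info['channels']) ? $info['channels'] : 3;
    $bits     = isset($info['bits']) ? $info['bits'] : 8;
    // Rough decoded size, with headroom for GD overhead.
    $estimate = $info[0] * $info[1] * $channels * ($bits / 8) * 1.7;
    if ($estimate > 64 * 1024 * 1024) {
        // Too large to decode safely - skip it or hand off to a background job.
    } else {
        $src = imagecreatefromjpeg('myimage.jpg');
    }
}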