I have a cheap $5/mo server with 1 GB of RAM that processes some images for my website. Very rarely I encounter a segmentation fault in PHP Imagick when writing a GIF image to disk.
I have set a memory limit on the console command hoping PHP would catch the issue first and throw an exception that I can properly handle, but this did not work.
The particular issue is certain GIF images cause it to crash at this line of code:
echo 'Writing images to disk.' . PHP_EOL;
$file = $img->writeImages($imageOnDisk, true); // crashes here
echo 'Finished writing images to disk.' . PHP_EOL;
The specific GIF is adult-related, so I am not sure I can share it.
Here are my server logs:
Setting memory limit.
Pulling URL: https://i.redd.it/gyvc8t9xdvb41.gif
Coalescing Images
Done Coalescing Images
Processing regular image.
Not comic.
Deleting temp image on disk.
Writing images to disk.
Segmentation fault (core dumped)
Seems this image's width × height × bit depth × frame count is using up more than 1 GB of memory. I need to detect this beforehand.
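Something like this pre-flight estimate might work before the full decode (a rough sketch; Imagick::pingImage only reads metadata, and the 4-bytes-per-pixel worst case, the $sourceGif variable and the $maxBytes threshold are assumptions of mine):
$probe = new Imagick();
$probe->pingImage($sourceGif); // metadata only, no pixel data is decoded
$frames = $probe->getNumberImages();
$probe->setFirstIterator();
$width = $probe->getImageWidth();
$height = $probe->getImageHeight();
$probe->clear();
// Worst case: 4 bytes per pixel (RGBA) for every coalesced frame.
$estimated = $width * $height * 4 * $frames;
$maxBytes = 768 * 1024 * 1024; // leave headroom on a 1 GB box
if ($estimated > $maxBytes) {
    echo 'Skipping: decoded GIF would need ~' . round($estimated / 1048576) . ' MB.' . PHP_EOL;
    return;
}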
A segmentation fault is a generic operating-system-level error. As an end user of PHP you neither have any control over it nor can you fix it; it needs to be fixed at the php-src level (in this case, specifically in the imagick extension code).
Simply put, this is potentially a bug in PHP itself, not in your code.
What you can do is file a bug report with PHP, and provide a backtrace along with it so one of the extension developers can look into fixing it.
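For example, a common way to capture such a backtrace is to run the crashing CLI script under gdb (a sketch; install gdb and your distro's PHP debug symbols first, and substitute your actual console command for the placeholder script name):
gdb --args php process-images.php
(gdb) run
(gdb) bt full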
Related
I have an issue with GD not creating a new JPG file; it just fails, with no error messages and no indication of what is happening. This is not new code; it has been in place and working for the past six years, but all of a sudden, with larger images, it has started failing.
As a background this is running on an old server (to be switched off and moved to a new site on PHP8 in a couple of months time) that has PHP5.3.3 with GD version 2.0.34.
The code is creating thumbnails from the high-res image (around 24-30MB) and outputting a series of thumbnails from 150px wide to 1024px wide. It fails on all of them. I have increased the PHP memory limit on the page to 512MB, and set the gd.jpeg_ignore_warning flag for corrupt JPGs.
But every time with these files, this line:
$src_img = @imagecreatefromjpeg($file_path);
just returns FALSE, but never falls over with an error. The file is definitely there (running the same code with a file of the same name that is <20MB works fine) and there is plenty of disk space/memory available for a 60MB file to be processed in memory, so I don't see that that is the issue.
A search of Google, StackOverflow and several other sites has not produced any similar issues.
Can anyone offer any thoughts as to what the issue/solution is? I have been looking at this for two days now, and we need to resolve it; simply using smaller JPG files isn't an option here.
Thanks to @Lessmore's answer above, GD was reporting an invalid JPG file, so a further search revealed this answer on StackOverflow, which solved the problem by reading the JPG from a string rather than a file:
Reading an invalid JPG with GD
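The gist of that workaround is to hand GD the raw bytes instead of the filename (a minimal sketch; per that answer, the string-based loader gets past the check that makes imagecreatefromjpeg() return FALSE):
$data = file_get_contents($file_path);
$src_img = ($data !== false) ? imagecreatefromstring($data) : false;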
Thanks all - as ever!
My website was constantly running out of memory in random spots of code when the memory limit was set to 256M, so I changed it to 1024M to see if it was an issue of space or some bad loop in the code... The website still ran out of memory after a while. What are some things I could do to stop the memory from overflowing?
I saw suggestions about limiting requests, but I don't think that addresses the root of the problem. I will do that if it's my last option, but I want to know the best ways of troubleshooting this.
PHP Version: 7.2.30
Apache Version: 2.4.41
Wordpress Version: 5.4.1
This is an image of the error shown on the website when the memory overflows:
This is an example of the error. (Keep in mind there are about 100 of these in the log file in one day, and the location of the error varies: sometimes it's a php file in the plugins folder, sometimes in the themes folder.)
[16-May-2020 19:16:22 UTC] PHP Fatal error: Out of memory (allocated 21233664) (tried to allocate 4718592 bytes) in /var/www/html/wp-content/plugins/woocommerce/includes/log-handlers/class-wc-log-handler-file.php on line 21
EDIT: The logs also said that I did not have the XML service installed. I installed it but am not sure if that is the root of the problem.
Wordpress should never consume that much memory on a single load. You say it's a fairly standard setup: do you get the same levels of memory usage with a vanilla installation, without themes and plugins? And then, do things spike after a given plugin? Build it up from scratch and see if you can find the culprit (e.g. a buggy plugin, a plugin conflict, or a faulty configuration).
If you want to dig a bit deeper under the hood, short of using more involved PHP debugging tools like Xdebug, the built-in memory_get_usage() function is your friend. You can, for example, use a logging function like this (off the top of my head, briefly tested):
function log_mem($file, $line, $save = true) {
    static $iter = 0;
    $iter++;
    $usage = round(memory_get_usage() / 1048576, 2) . ' MB';
    $log = "[" . time() . "] {$iter}, {$file}#{$line}, {$usage}\n";
    $save && file_put_contents('tmp/php_memory.log', $log, FILE_APPEND);
    return $log;
}
log_mem(__FILE__, __LINE__); // to save into log file
echo log_mem(__FILE__, __LINE__, false); // to output only
Pop log_mem() calls into suspect locations. Each call logs the timestamp, the iteration number (i.e. processing order), file name, line number and current memory usage into a file, like so:
[1589662734] 1, C:\server\home\more\dev\mem_log.php#14, 0.78 MB
[1589662734] 2, C:\server\home\more\dev\mem_log.php#18, 34.78 MB
[1589662734] 3, C:\server\home\more\dev\mem_log.php#22, 68.78 MB
You can then see where the spikes are triggered, and begin to fix your code.
If you don't want to add and remove the log commands over and over (the repeated filesystem access obviously adds some processing overhead), you can gate them behind a constant boolean switch placed in any site-wide included file:
const MEM_LOG = true;
...
MEM_LOG && log_mem(__FILE__, __LINE__);
Remember to follow up and let us know what caused the memory leak. Good luck! ^_^
People think Wordpress is easy. Even if you never touch the code, it is a very difficult system to manage and keep secure, and its complexity makes code customization very, very hard.
Your logs already show you where the system is running out of memory.
This is an image of the error shown on the website when the memory overflows
What you said here illustrates that you are very inexperienced in operating PHP websites. Your installation should not be writing errors to the browser. This makes me think that you are way out of your depth in trying to resolve this.
You have posted this on a programming forum, implying you may be writing custom code. In that case the correct approach to resolving the error is to use a profiler to track where the memory is being used up.
However if, in fact, there is no custom code on the site, then you need to start by tracking memory usage (register a shutdown function for this, as sketched below) and start disabling themes and plugins until you find the problem.
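A minimal sketch of that shutdown-function idea (the log path is my choice; drop it into a site-wide include such as wp-config.php):
register_shutdown_function(function () {
    $peak = round(memory_get_peak_usage(true) / 1048576, 2);
    $uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'cli';
    // type 3 = append the message to the given file
    error_log("[peak memory] {$peak} MB for {$uri}\n", 3, '/var/log/php_peak_mem.log');
});
Sort the resulting log by the MB column and the worst requests identify themselves; then bisect plugins/themes on those URLs.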
I have an old code base that makes extensive use of GD (no, switching to ImageMagick is not an option). It's been running for several years, through several versions. However, when I run it in my current development environment, I'm running into mysterious gd-png error: cannot allocate image data errors when calling imagecreatefrompng(). The PNG file I'm using is the same one I've been using all along, so I know it works.
Current setup:
Ansible-provisioned vagrant box
Ubuntu 14.04
PHP 5.5.9
GD 2.1.1
libPNG 1.2.50
PHP's memory limit at the time the script is run is 650M, though it's ultimately the kernel itself that ends up killing the script, and changing PHP's memory limit doesn't seem to have an effect.
The image dimensions are 7200x6600 and are about 500KiB on disk. This hasn't been a problem in my other environments and is only newly-occurring in my development environment. Unfortunately, I don't have access to the other environments anymore to do a comparison, though the setup was similar in the last working one -- Ubuntu 14.04, PHP 5.5, sufficient memory allocations.
What could be happening in this setup that wasn't happening in my previous setups? How can I fix this?
I was browsing a bit through the PHP 5.5.9 source to try and find your specific error string "cannot allocate image data", but the closest I could find is "gd-png error: cannot allocate gdImage struct". The bundled version of GD for PHP 5.5.9 (all the way up to 7.0.0) is version 2.0.35, so it seems you're looking at a custom build(?). Bugging the PHP guys might not help here.
Looking at the GD source (2.1.2, can't seem to find 2.1.1), the only place this error occurs is:
https://github.com/libgd/libgd/blob/master/src/gd_png.c#L435
image_data = (png_bytep) gdMalloc (rowbytes * height);
if (!image_data) {
    gd_error("gd-png error: cannot allocate image data\n");
    png_destroy_read_struct (&png_ptr, &info_ptr, NULL);
    if (im) {
        gdImageDestroy(im);
    }
    if (palette_allocated) {
        gdFree (palette);
    }
    return NULL;
}
Where gdMalloc is:
https://github.com/libgd/libgd/blob/GD-2.1/src/gdhelpers.c#L73
void *
gdMalloc (size_t size)
{
    return malloc (size);
}
I'm afraid this is as far as my detective work goes. As far as I can tell, the bundled 2.0.35 GD version in PHP also just uses malloc, so at first glance there's no real difference there. I also tried to find the equivalent piece of code in the bundled version, and it seems to be here:
https://github.com/php/php-src/blob/PHP-5.5.9/ext/gd/libgd/gd_png.c#L323
image_data = (png_bytep) safe_emalloc(rowbytes, height, 0);
I can't seem to find safe_emalloc, but it seems the old PHP bundled versions did use a different memory allocation here than the version your environment uses. Perhaps your best bet is to check with the GD devs? To avoid a wild goose chase, I think trying another environment is your best bet, after confirming they are indeed using a non-standard GD version.
After some more PHP source safari, it seems safe_emalloc is used throughout all extensions, so I guess (guess, mind you) that this is the preferred way to allocate memory for PHP extensions (looks like it). If your environment is in fact using an unaltered GD 2.1.1, it is likely ignoring any PHP memory settings/limits. You might be able to find some other way of specifying a ('stand-alone') GD memory limit?
Edit: Looking at the official libgd faq, near the bottom it states that the PHP extension should indeed respect the memory limit and it specifies that an 8000x8000 pixel image would take about 256MB, so your 650MB limit should really be enough (unless you're working with multiple copies in the same run?) and the fact that it's not working corroborates that something fishy is going on with the memory allocation.
If I were you, I would do the following:
1. Send messages to the php-devel list and see if it is a known issue. You'll also want to search their bug tracker: http://php.net/mailing-lists.php and https://bugs.php.net/
2. Use the GD command line utilities to see if you can process the same image with the same version from the command line. This is to determine whether the problem is GD plus this image, or the PHP-gd lib, or some combination.
2a. Create a PHP CLI program to resize the image and see if it works.
3. Figure out exactly what happens when you call the PHP function in the underlying (probably C) code. This will involve downloading the source of the php-gd module and looking through it.
4. Code a minimal C application that does the same thing as the underlying PHP-gd library and see if you can reproduce the error.
Something in there should tell you exactly what is going on, though it will take a bit of work. You should be able to get all the tools/sources for your environment. The beauty of open source, right?
The other option would be to try different versions of these applications in your environment and see if you find one that works. This is the "shotgun" approach and not the best, IMO.
From my experience:
Try a file with a different row size. I had issues with another image library that could not read images wider than 3000px per row; it was otherwise not restricted in absolute size.
You have quite an image there: if it's RGBA in memory, you end up with ~180M of uncompressed image data, ~140M as RGB, and still ~45M at 8-bit. Are you sure your process limits will not be exceeded when this is loaded?
You used the same image with the same libraries, which means libgd and libpng have no restriction on the memory size (or at least none at the size you're using); the problem may be php. But are you sure the memory limit is 650M? (If your php installation uses suhosin, you may want to check suhosin.memory_limit.)
You say the kernel is killing the script (though I can't see what you base this on; is there a dmesg log entry about it?), but the kernel only kills a process when the system is out of memory.
One problem could be memory fragmentation / contiguous allocation; nowadays it should be more a kernel-space problem than a user-space one, but the size your script is allocating is not an everyday size.
Fragmentation could be a problem if your script is running on a server with a long uptime and a lot of processes allocating and freeing memory; if it is a virtual machine that you just started up and the script fails at the first launch, then the problem isn't fragmentation.
You can do a simple test with the command and code below. It uses the gcc compiler to create (in the directory where you launch the command) an a.out executable which allocates, and sets to 0, a memory chunk of size 191 x 1024 x 1024 (about ~190M):
printf '#include <stdlib.h>\n#include <string.h>\n#define SIZ 191*1024*1024\nint main() {void *p=malloc(SIZ);if(!p)return 1;memset(p,0,SIZ);return 0;}' \
| gcc -O2 -xc -
run the executable with:
./a.out
If the problem is system memory, the kernel should kill the executable (I'm not 100% sure; the kernel has policies about which process to kill when there's no memory, to be checked).
You should see the kill message, though, with dmesg perhaps.
(I think this shouldn't happen; instead the malloc should simply fail.)
If the malloc succeeds, the executable can allocate enough memory and it returns 0; if it fails, it returns 1.
To verify the return code, just use:
./a.out
echo $?
(don't execute any other command in between)
If the malloc fails, you probably just need to add physical or virtual memory to the system.
Keep in mind that ~190M is just the memory for one uncompressed image; depending on how php, libgd and libpng work, the memory for the whole process may be twice that (or more).
I did a simple test and profiled the memory usage; the peak with a 7600x2200 image (1.5M on disk) on my system is about ~342M. Here's the test:
<?php
$im = imagecreatefrompng("input.png");
$string = "Sinker sucker socks pants";
$orange = imagecolorallocate($im, 220, 210, 60);
$px = (imagesx($im) - 7.5 * strlen($string)) / 2;
imagestring($im, 3, $px, 9, $string, $orange);
imagepng($im);
imagedestroy($im);
I tried xhprof at first, but it returned values that were too low.
So I tried the simple memusg script:
memusg /usr/bin/php -f test.php >output.png
(the test works, btw)
I met this problem yesterday and fixed it today.
Yesterday's env:
php-7.0.12
libpng-1.6.26
libgd-2.1.1
This suite crashed when I resized a PNG image.
After checking memory usage, I thought it might be a bug in the latest php or libpng.
So I changed the env to this:
php-5.6.27
libpng-1.2.56
libgd-2.1.1
I changed php and libpng to mature versions that have been in use for a long time.
After recompiling and reinstalling, everything works well for both PNG and JPEG.
I currently use Imagick library on PHP and use the resizing functionality of Image Magick. I've just learned about decompression bombs and how ImageMagick is vulnerable to it.
I have checked how to ping the image and verify its dimensions without actually loading it into memory/disk. It's also safer to set ImageMagick's memory and disk limits so it won't just write a huge file to disk.
I've read and I can do that with setResourceLimit().
http://php.net/manual/en/imagick.setresourcelimit.php
IMagick::setResourceLimit(IMagick::RESOURCETYPE_MEMORY , 100);
IMagick::setResourceLimit(IMagick::RESOURCETYPE_DISK , 100);
$thumb = new Imagick('image.png');
$thumb->resizeImage(320,240,Imagick::FILTER_LANCZOS,1);
However, what happens is that after setting the disk and memory limits, if an image does hit a limit, all I get is a segmentation fault error, no exceptions are thrown. This makes it impossible for me to handle it properly.
Update:
Here are the package versions I'm using:
dpkg -l | grep magick
ii imagemagick-common 8:6.6.9.7-5ubuntu3.3 image manipulation programs -- infrastructure
ii libmagickcore4 8:6.6.9.7-5ubuntu3.3 low-level image manipulation library
ii libmagickwand4 8:6.6.9.7-5ubuntu3.3 image manipulation library
ii php5-imagick 3.1.0~rc1-1 ImageMagick module for php5
Setting the 'Resource Area' limit only sets the size at which images are no longer held in memory and are instead paged to disk. If you want to use that setting to actually restrict the maximum size of image that can be opened, you also need to set the 'Resource Disk' limit.
The code below correctly gives a memory allocation error for the image bombs taken from here.
try {
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_AREA, 2000 * 2000);
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_DISK, 2000 * 2000);
    $imagick = new Imagick("./picture-100M-6000x6000.png");
    $imagick->modulateImage(100, 50, 120);
    $imagick->writeImage("./output.png");
    echo "Complete";
}
catch(\Exception $e) {
    echo "Exception: ".$e->getMessage()."\n";
}
Output is:
Exception: Memory allocation failed `./picture-100M-6000x6000.png' # error/png.c/MagickPNGErrorHandler/1630
If you want to set the width and height resources, and you have a version of ImageMagick >= 6.9.0-1, you should be able to set them using the enum values directly: WidthResource = 9, HeightResource = 10.
//Set max image width of 2000
Imagick::setResourceLimit(9, 2000);
//Set max image height of 1000
Imagick::setResourceLimit(10, 1000);
These don't have to be set programmatically; you can set them through the policy.xml file installed with ImageMagick. ImageMagick reads that file and uses those settings if none are set in a program, which may be a more convenient way of setting them, as you can change them per machine. For example:
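An illustrative policy.xml fragment (the values are examples only, and the file's location varies by install, e.g. /etc/ImageMagick-6/policy.xml):
<policymap>
  <!-- refuse to decode anything wider or taller than 16000 pixels -->
  <policy domain="resource" name="width" value="16KP"/>
  <policy domain="resource" name="height" value="16KP"/>
  <policy domain="resource" name="memory" value="256MiB"/>
  <policy domain="resource" name="disk" value="1GiB"/>
</policymap>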
This makes it impossible for me to handle it properly.
It makes it impossible for you to handle it in the same process. You can handle it just fine by running the image processing in a background task.
Personally, I think anyone who uses Imagick in a server that is accessed directly by web browsers is nuts. It is far safer to run it as a background task (managed by something like http://supervisord.org/) and communicate with that background task via a queue of jobs that need to be processed.
Not only does that solve the 'bad images can bring down my website' problem, it also makes it far easier to monitor resource usage, or shift the image processing to a machine with a faster CPU than a web front-end server needs.
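A bare-bones sketch of that layout (the paths and job format are invented for illustration; run the worker under supervisord so a crash kills only the worker, never a web request):
// worker.php - consumes queued image jobs one at a time.
while (true) {
    foreach (glob('/var/spool/imgjobs/*.job') as $job) {
        $src = trim(file_get_contents($job)); // job file holds the image path
        rename($job, $job . '.working');      // claim the job
        $im = new Imagick($src);
        $im->thumbnailImage(320, 240, true);
        $im->writeImage($src . '.thumb.png');
        $im->clear();
        unlink($job . '.working');
    }
    sleep(1);
}
If Imagick segfaults on a bad image, the leftover .working file tells you exactly which job to quarantine after supervisord restarts the worker.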
Source - I'm the maintainer of the Imagick extension and I recently added this to the Imagick readme:
Security
The PHP extension Imagick works by calling the ImageMagick library.
Although the ImageMagick developers take good care in avoiding bugs it
is inevitable that some bugs will be present in the code. ImageMagick
also uses a lot of third party libraries to open, read and manipulate
files. The writers of these libraries also take care when writing
their code. However everyone makes mistakes and there will inevitably
be some bugs present.
Because ImageMagick is used to process images it is feasibly possible
for hackers to create images that contain invalid data to attempt to
exploit these bugs. Because of this we recommend the following:
1) Do not run Imagick in a server that is directly accessible from
outside your network. It is better to either use it as a background
task using something like SupervisorD or to run it in a separate
server that is not directly accessible on the internet.
Doing this will make it difficult for hackers to exploit a bug, even
if one should exist in the libraries that ImageMagick is using.
2) Run it as a very low privileged process. As much as possible the
files and system resources accessible to the PHP script that Imagick
is being called from should be locked down.
3) Check the result of the image processing is a valid image file
before displaying it to the user. In the extremely unlikely event that
a hacker is able to pipe arbitrary files to the output of Imagick,
checking that it is an image file, and not the source code of your
application that is being sent, is a sensible precaution.
Starting with ImageMagick-6.9.0-1, "width" and "height"
resource limits were added. From the command line, use "-limit width 32000", etc. ImageMagick's PNG decoder will bail out without attempting to decompress an image whose width or height exceeds the specified limits.
The "area" resource is available in earlier versions of ImageMagick (and Imagick); however the PNG decoder does not reject images based on the "area" limit (see Danack's comment).
In versions of ImageMagick earlier than 6.9.0, the width and height limits come from libpng and depend upon the libpng version. Current libpng versions (1.0.16 and later, 1.2.6 and later, 1.5.22 and later, and 1.6.17 and later) impose 1,000,000-row and 1,000,000-column limits. In versions 1.2.0 through 1.2.5, 1.5.0 through 1.5.21, and 1.6.0 through 1.6.16, the limits were 2.7 billion rows and columns by default.
Look for RESOURCETYPE_AREA in Imagick (I don't see _WIDTH or _HEIGHT in the manual that you referenced, so either Imagick or its manual needs to be updated). So try
IMagick::setResourceLimit(IMagick::RESOURCETYPE_AREA, 100 * 1000 * 1000); // 100 megapixels
to set a 100MegaPixel limit. Hopefully, some future version of Imagick will support RESOURCETYPE_WIDTH and RESOURCETYPE_HEIGHT, to provide a better solution to the decompression-bomb vulnerability. See Danack's answer about setting these with the current version of IMagick.
all I get is a segmentation fault error, no exceptions are thrown
I'm guessing your segmentation fault comes from resources being set too low for ImageMagick (and related delegates) to operate. The values of the resource limits are in bytes, not megabytes.
Imagick does throw an exception if a resource limit is reached. Usually something like...
"cache resource exhausted"
Decompression bombs, or zip bombs, are extremely difficult to identify. What you're doing by ping-ing the image and setting resource limits is the correct course of action. I would roughly outline a solution as...
// Define limits in application settings, or bootstrap (not dynamically!)
define('MY_MAGICK_MEMORY_LIMIT', 5e+8);
// Repeat for AREA, DISK, & etc.
// In application
$image = new Imagick(); // allocate IM structures
// Set limits on the instance
$image->setResourceLimit(Imagick::RESOURCETYPE_MEMORY, MY_MAGICK_MEMORY_LIMIT);
// Repeat for RESOURCETYPE_AREA, RESOURCETYPE_DISK, & etc.
$filename = 'input.png';
if ($image->pingImage($filename)) {
    // Validate that this image is what you're expecting
    // ...
    try {
        $image->readImage($filename); // <-- Bomb will explode here
        $image->resizeImage(320, 240, Imagick::FILTER_LANCZOS, 1);
    } catch (ImagickException $err) {
        // Handle error
    }
}
unset($image);
If you don't trust decompression, you can leverage Imagick::getImageCompression during ping validation to inspect which compression an image requires. The compression type is an integer that maps to the following enum (a short usage sketch follows it)...
typedef enum
{
    UndefinedCompression,
    B44ACompression,
    B44Compression,
    BZipCompression,
    DXT1Compression,
    DXT3Compression,
    DXT5Compression,
    FaxCompression,
    Group4Compression,
    JBIG1Compression,
    JBIG2Compression,
    JPEG2000Compression,
    JPEGCompression,
    LosslessJPEGCompression,
    LZMACompression,
    LZWCompression,
    NoCompression,
    PizCompression,
    Pxr24Compression,
    RLECompression,
    ZipCompression,
    ZipSCompression
} CompressionType;
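For example, after pinging you could whitelist only the codecs you expect for a given upload type (a sketch; the whitelist itself is application-specific):
$allowed = [Imagick::COMPRESSION_ZIP, Imagick::COMPRESSION_JPEG, Imagick::COMPRESSION_NO];
if (!in_array($image->getImageCompression(), $allowed, true)) {
    // Unexpected codec for this upload type: treat as suspicious and bail out.
}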
MagickStudio (written in Perl) offers a good starting point for default resource limits and how they are checked against uploaded images (search for Ping).
I have a background script which generates html files (each 100-500KB in size) as a by-product, and when it has accumulated 500 of them, it packs them up into a .tar.gz and archives them. It had been running non-stop for several weeks and had generated 131 .tar.gz files, until this morning when it threw the following exception:
Uncaught exception 'PharException' with message 'tar-based phar
"E:/xampp/.../archive/1394109645.tar" cannot be created, contents of file
"58836.html" could not be written' in E:/xampp/.../background.php:68
The code responsible for archiving
$name = $path_archive . $set . '.tar';
$archive = new PharData($name);
$archive->buildFromDirectory($path_input); // <--- line 68
$archive->compress(Phar::GZ);
unset($archive);
unlink($name);
array_map('unlink', glob($path_input . '*'));
What I've checked and made sure of so far:
I couldn't find anything irregular in the html file itself,
nothing else was touching this file during the process,
the script's timeout and memory were unlimited,
and there was enough spare memory and disk space.
What could be causing the exception and/or is there a way to get a more detailed message back from PharData::buildFromDirectory?
Env: Virtual XP (in VirtualBox) running portable XAMPP (1.8.2, PHP 5.4.25) in a shared folder of a Win7 host
I solved a similar problem after hours of bug-hunting today. It was caused by too little space on one partition of the disk. I had enough space on the partition where the tar.gz archive was created, but after removing some log files from another partition, everything worked again.
I think it's possible that the PharData object stores some temporary data somewhere, which would explain why this happens even when there is enough space on the disk where you create the tar.gz archive.
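If you want to rule that out, here is a quick sketch for checking both locations (it assumes PharData stages its temporary data in the system temp dir, which is a guess on my part):
$tmp = sys_get_temp_dir();
printf("Free in temp dir %s: %.1f MB\n", $tmp, disk_free_space($tmp) / 1048576);
printf("Free in archive dir %s: %.1f MB\n", $path_archive, disk_free_space($path_archive) / 1048576);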