gd-png cannot allocate image data - php

I have an old code base that makes extensive use of GD (no, switching to ImageMagick is not an option). It's been running for several years, through several PHP versions. However, when I run it in my current development environment, imagecreatefrompng() fails with a mysterious "gd-png error: cannot allocate image data". The PNG file I'm using is the same one I've always used, so I know the file itself is fine.
Current setup:
Ansible-provisioned vagrant box
Ubuntu 14.04
PHP 5.5.9
GD 2.1.1
libPNG 1.2.50
PHP's memory limit at the time the script is run is 650M, though it's ultimately the kernel itself that ends up killing the script, and changing PHP's memory limit doesn't seem to have an effect.
The image is 7200x6600 pixels and about 500KiB on disk. This hasn't been a problem in my other environments and is only newly occurring in my development environment. Unfortunately, I no longer have access to the other environments to do a comparison, though the setup was similar in the last working one -- Ubuntu 14.04, PHP 5.5, sufficient memory allocations.
What could be happening in this setup that wasn't happening in my previous setups? How can I fix this?

I was browsing a bit through the PHP 5.5.9 source to try to find your specific error string "cannot allocate image data", but the closest I could find is "gd-png error: cannot allocate gdImage struct". The bundled version of GD for PHP 5.5.9 (all the way up to 7.0.0) is version 2.0.35, so it seems you're looking at a custom build(?). Bugging the PHP guys might not help here.
Looking at the GD source (2.1.2, can't seem to find 2.1.1), the only place this error occurs is:
https://github.com/libgd/libgd/blob/master/src/gd_png.c#L435
image_data = (png_bytep) gdMalloc (rowbytes * height);
if (!image_data) {
    gd_error("gd-png error: cannot allocate image data\n");
    png_destroy_read_struct (&png_ptr, &info_ptr, NULL);
    if (im) {
        gdImageDestroy(im);
    }
    if (palette_allocated) {
        gdFree (palette);
    }
    return NULL;
}
Where gdMalloc is:
https://github.com/libgd/libgd/blob/GD-2.1/src/gdhelpers.c#L73
void *
gdMalloc (size_t size)
{
    return malloc (size);
}
I'm afraid this is as far as my detective work goes. As far as I can tell, the bundled 2.0.35 GD version in PHP also just uses malloc, so at first glance there's no real difference there. I also went looking for the equivalent piece of code in the bundled versions, and it seems to be here:
https://github.com/php/php-src/blob/PHP-5.5.9/ext/gd/libgd/gd_png.c#L323
image_data = (png_bytep) safe_emalloc(rowbytes, height, 0);
I can't seem to find safe_emalloc, but it seems the old PHP-bundled versions did use a different memory allocation here than the version your environment uses. Perhaps your best bet is to check with the GD devs. To avoid a wild goose chase, I think trying another environment is your best bet, after confirming your current one is indeed using a non-standard GD version.
After some more PHP source safari, it seems safe_emalloc is used throughout all extensions, so my guess (a guess, mind you) is that this is the preferred way to allocate memory for PHP extensions. If your environment is in fact using an unaltered GD 2.1.1, it is likely ignoring any PHP memory settings/limits. You might be able to find some other way of specifying a ('stand-alone') GD memory limit?
Edit: Looking at the official libgd FAQ, near the bottom it states that the PHP extension should indeed respect the memory limit, and it estimates that an 8000x8000 pixel image would take about 256MB, so your 650MB limit should really be enough (unless you're working with multiple copies in the same run?). The fact that it's not working corroborates that something fishy is going on with the memory allocation.
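As a quick sanity check on the memory-limit theory, you could estimate the decoded size before calling imagecreatefrompng(). A minimal sketch, assuming roughly 4 bytes per pixel for a truecolor image plus a 50% overhead factor (both assumptions, not exact GD figures):
<?php
// Pre-flight check: read only the PNG header, then compare the estimated
// decoded size against what's left under memory_limit.
list($width, $height) = getimagesize('input.png'); // header only, cheap
$estimate = $width * $height * 4 * 1.5; // ~4 bytes/pixel + 50% overhead (assumption)

$limit = ini_get('memory_limit');     // e.g. "650M"
$limitBytes = (int)$limit * 1048576;  // crude parse, assumes an "M" suffix

if ($estimate > $limitBytes - memory_get_usage()) {
    die("Decoded image (~" . round($estimate / 1048576) . "M) won't fit in memory_limit\n");
}
$im = imagecreatefrompng('input.png');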

If I were you, I would do the following:
1. Send messages to the php-devel list and see if it is a known issue. You'll also want to search their bug tracker: http://php.net/mailing-lists.php and https://bugs.php.net/
2. Use the GD command-line utilities to see if you can process the same image with the same GD version outside PHP. This is to determine whether it's a problem between GD and the image, or an issue with the PHP-GD binding, or some combination.
2a. Create a PHP CLI program to resize the image and see if it works (see the sketch at the end of this answer).
3. Figure out exactly what happens when you call the PHP function in the underlying (probably C) code. This will involve downloading the source of the php-gd module and looking through it.
4. Code a minimal C application that does the same thing as the underlying PHP-GD library and see if you can reproduce the error.
Something in there should tell you exactly what is going on, though it will take a bit of work. You should be able to get all the tools/sources for your environment. The beauty of open source, right?
The other option would be to try different versions of these applications in your environment and see if you find one that works. This is the "shotgun" approach and not the best, IMO.
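For step 2a, a minimal CLI reproduction might look something like this (a sketch; the filename comes from the command line, and the 650M limit mirrors the question's setup):
<?php
// minimal-gd-test.php -- run as: php minimal-gd-test.php big.png
ini_set('memory_limit', '650M');
$im = imagecreatefrompng($argv[1]);
if ($im === false) {
    fwrite(STDERR, "imagecreatefrompng() failed\n");
    exit(1);
}
printf("Loaded %dx%d, peak memory %.1f MB\n",
    imagesx($im), imagesy($im), memory_get_peak_usage(true) / 1048576);
imagedestroy($im);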

From my experience:
Try a file with a different row size. I had issues with another image library that could not read images wider than 3000px per row, though it was otherwise unrestricted in absolute size.
You have quite an image there: held in memory as RGBA, you end up with about 180M of uncompressed image data, about 140M as RGB, and still 45M at 8 bits per pixel. Are you sure your process limits will not be exceeded when this is loaded?

You used the same image with the same libraries, which means libgd and libpng have no restriction on the memory size (or at least none for the size you're using), so the problem may be PHP. But are you sure the memory limit really is 650M? (If your PHP installation uses Suhosin, you may want to check suhosin.memory_limit as well.)
You say the kernel is killing the script (though I can't tell what makes you say so -- is there a dmesg log entry about it?), but the kernel only kills a process when the system is out of memory.
One problem could be memory fragmentation / contiguous allocation. Nowadays it should be more of a kernel-space problem than a user-space one, but the size your script is allocating is not an everyday size.
Fragmentation could be a problem if your script runs on a server with long uptime and many processes allocating and freeing memory; if it is a virtual machine that you just started up and the script fails on its first launch, then the problem isn't fragmentation.
You can do a simple test with the command and code below. It uses the gcc compiler to create (in the directory where you launch the command) an a.out executable which allocates and zeroes a memory chunk of 191 x 1024 x 1024 bytes (about ~190M):
printf '#include <stdlib.h>\n#include <string.h>\n#define SIZ 191*1024*1024\nint main() {void *p=malloc(SIZ);if(p==NULL)return 1;memset(p,0,SIZ);return 0;}' \
| gcc -O2 -xc -
run the executable with:
./a.out
If the problem is the system memory, the kernel should kill the executable (I'm not 100% sure; the kernel has policies about which process to kill when there's no memory, to be checked).
You should see the kill message though, perhaps with dmesg.
(I think this shouldn't happen; instead, the malloc should simply fail.)
If the malloc succeeds, the executable can allocate enough memory and should return 0; if it fails, it should return 1.
To verify the return code, just run:
./a.out
echo $?
(don't execute any other command in between)
If the malloc fails, you probably just need to add physical or virtual memory to the system.
Keep in mind that ~190M is just the memory for one uncompressed image; depending on how PHP, libgd and libpng work, the memory for the whole process may be twice that (or more).
I did a simple test and profiled the memory usage; the peak with a 7600x2200 image (1.5M on disk) on my system is about ~342M. Here's the test:
<?php
// Load the PNG, stamp a string on it, and write the result to stdout.
$im = imagecreatefrompng("input.png");
$string = "Sinker sucker socks pants";
$orange = imagecolorallocate($im, 220, 210, 60);
$px = (imagesx($im) - 7.5 * strlen($string)) / 2; // roughly centre the text
imagestring($im, 3, $px, 9, $string, $orange);
imagepng($im); // writes the PNG to stdout
imagedestroy($im);
I tried xhprof at first, but it returned values that were too low.
So I tried a simple script, memusg:
memusg /usr/bin/php -f test.php >output.png
(the test works, btw)

I met this problem yesterday and fixed it today.
Yesterday's env:
php-7.0.12
libpng-1.6.26
libgd-2.1.1
This setup crashed when I resized a PNG image.
After checking memory usage, I thought it might be a bug in the latest PHP or libpng.
So I changed the env to this:
php-5.6.27
libpng-1.2.56
libgd-2.1.1
I changed PHP and libpng to mature versions which have been in long-term use.
After recompiling and re-installing these, it works well for PNG and JPEG.

Related

ImageMagick / GraphicsMagick / libvips Images randomly corrupted

We are using ImageMagick for resizing/thumbnailing JPGs to a specific size. The source file is loaded via HTTP. It's working as expected, but from time to time some images are partially broken.
We already tried different software like GraphicsMagick or VIPS, but the problem is still there. It also only seems to happen when there are parallel processes, so the whole script is locked via semaphores, but that does not help either.
We found multiple similar problems, but all without any solution: https://legacy.imagemagick.org/discourse-server/viewtopic.php?t=22506
We also wonder why the behaviour is the same in all of these tools. We also tried different PHP versions. It seems to happen more often on source images with huge dimensions/file sizes.
Any idea what to do here?
I would guess the source image has been truncated for some reason. Perhaps something timed out during the download?
libvips is normally permissive, meaning it will try to give you something even if the input is damaged. You can make it strict with the fail flag (i.e. fail on the first warning).
For example:
$ head -c 10000 shark.jpg > truncated.jpg
$ vipsthumbnail truncated.jpg
(vipsthumbnail:9391): VIPS-WARNING **: 11:24:50.439: read gave 2 warnings
(vipsthumbnail:9391): VIPS-WARNING **: 11:24:50.439: VipsJpeg: Premature end of JPEG file
$ echo $?
0
I made a truncated JPG file, then ran vipsthumbnail on it. It gave a warning but did not fail. If I run:
$ vipsthumbnail truncated.jpg[fail]
VipsJpeg: Premature end of input file
$ echo $?
1
Or in php:
$thumb = Vips\Image::thumbnail('truncated.jpg[fail]', 128);
Now there's no output, and there's an error code. I'm sure there's an imagemagick equivalent, though I don't know it.
There's a downside: thumbnailing will now fail if there's anything wrong with the image, and it might be something you don't care about, like invalid resolution.
After some additional investigation, we discovered that the source image was indeed already damaged. It was downloaded via a VPN connection which was not stable enough; sometimes the download stopped, so the JPG was only half written.

How to override "Suggested maximum size is 2500 pixels" on media upload page - wordpress

My client needs to upload high-res images for her online press kit.
She is getting this error: "Post-processing of the image failed likely because the server is busy or does not have enough resources. Uploading a smaller image may help. Suggested maximum size is 2500 pixels."
The images she wants to upload are about 2.5MB in size, and are 4272 x 2848 with 72dpi.
If I crop the images, to be 2500x1667 at 72dpi, they upload fine (meeting the 2500 pixel suggested max size.)
Is there a way to allow the larger pixel images as indicated above (4272 x 2848)?
I am not sure which PHP setting is the issue. I think it might be the memory limit, but if so, I am not sure where to change it or what amount to set it to in order to allow twice the current pixel max (going from 2500 to, say, 5000 pixels)... or whether that is even possible.
Any help would be appreciated.
Here are my system details:
WordPress Version: 5.5.1
MySQL Version: 5.6.41
BootStrap Version: 3.3.1
PHP version 7.3.22 (Supports 64bit values)
PHP max input variables 1000
PHP time limit 30
PHP memory limit 256M
Max input time 60
Upload max filesize 256M
PHP post max size 260M
Thanks!
I ran into this problem. Disabling big_image_size_threshold didn't fix it. I think my issue is that after upgrading to PHP 7.4, the version of ImageMagick running on my host for PHP 7.4 is bad or something. I fixed the issue by using GD instead of ImageMagick. Just add this to functions.php:
add_filter('wp_image_editors', function($editors) {
    return ['WP_Image_Editor_GD', 'WP_Image_Editor_Imagick'];
});
One thing to note: If you don't have GD installed, WP will default back to using Imagick. So there's little risk in making this change. Though if it doesn't resolve the problem, you might want to check that GD is actually installed.
You can use the big_image_size_threshold filter to change or disable this behavior.
https://developer.wordpress.org/reference/hooks/big_image_size_threshold/
If the original image width or height is above the threshold, it will be scaled down. The threshold is used as max width and max height. The scaled down image will be used as the largest available size, including the _wp_attached_file post meta value.
Returning false from the filter callback will disable the scaling.
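For example, in functions.php you could raise the threshold to cover the 4272px originals, or disable the scaling entirely (the 5000 value below is an arbitrary choice above the image's largest dimension):
// Raise the threshold so 4272x2848 originals are kept as-is.
add_filter('big_image_size_threshold', function ($threshold) {
    return 5000;
});

// ...or disable the scaling behaviour altogether:
add_filter('big_image_size_threshold', '__return_false');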
This verified answer didn't work for me
The apache2 errors should always be in
/var/log/apache2/error.log
so you can identify the problem easily.
I fixed it myself:
sudo apt-get install php-mbstring
sudo service apache2 restart
See this post; there are many solutions for this: Link
I encountered this same issue and struggled with it for nearly a whole day - trying answers from this page and from this WordPress topic also: https://wordpress.org/support/topic/post-processing-of-the-image-failed-error/
Eventually, what resolved the problem for me was going to the WordPress Updates page and simply re-installing WordPress -- everything has been perfectly fine since.
I am running a network on WP version 5.7 with php 7.3
Main cause: if you are using an image-compression plugin in WordPress, this can happen.
In my case I was using WP Compress, and I got this error mostly when uploading images. After deactivating the plugin, the problem was fixed.

My website keeps running out of memory (fatal error)

My website was constantly running out of memory in random spots of code when the memory limit was set to 256M, so I changed it to 1024M to see if it was an issue of space or some bad loop in the code... The website still ran out of memory after a while. What are some things I could do to keep memory from overflowing?
I saw things about limiting requests but I think this does not solve the root of the problem. I will do that if it's my last option but I want to know what the best ways of troubleshooting this are.
PHP Version: 7.2.30
Apache Version: 2.4.41
Wordpress Version: 5.4.1
This is an image of the error shown on the website when the memory overflows:
This is an example of the error (Keep in mind there are about 100 of these in the log file in one day and the location of the error varies (sometimes it's in a php file in the plugins folder, sometimes it's in the themes folder)):
[16-May-2020 19:16:22 UTC] PHP Fatal error: Out of memory (allocated 21233664) (tried to allocate 4718592 bytes) in /var/www/html/wp-content/plugins/woocommerce/includes/log-handlers/class-wc-log-handler-file.php on line 21
EDIT: The logs also said that I did not have the XML service installed. I installed it but am not sure if that is the root of the problem.
WordPress should never consume that much memory on a single page load. You say it's a fairly standard setup. Do you get the same levels of memory usage with a vanilla installation without themes and plugins? And then, do things spike after a given plugin is enabled? Build it up from scratch and see if you can find the culprit (e.g. a buggy plugin, a plugin conflict, or a faulty configuration).
If you want to dig in a bit deeper under the hood, short of using more involved PHP debugging tools like Xdebug, the built-in memory_get_usage() function is your friend. You can for example use a logging function like this (off the top of my head, briefly tested):
function log_mem($file, $line, $save = true) {
    static $iter = 0;
    $iter++;
    $usage = round(memory_get_usage() / 1048576, 2) . ' MB';
    $log = "[" . time() . "] {$iter}, {$file}#{$line}, {$usage}\n";
    $save && file_put_contents('tmp/php_memory.log', $log, FILE_APPEND);
    return $log;
}
log_mem(__FILE__, __LINE__); // to save into log file
echo log_mem(__FILE__, __LINE__, false); // to output only
Pop the log_mem() command into suspect locations. It will log the timestamp, the iteration number (ie. processing order), file name, line number and the current memory usage into a file. Like so:
[1589662734] 1, C:\server\home\more\dev\mem_log.php#14, 0.78 MB
[1589662734] 2, C:\server\home\more\dev\mem_log.php#18, 34.78 MB
[1589662734] 3, C:\server\home\more\dev\mem_log.php#22, 68.78 MB
You can then see where the spikes are triggered, and begin to fix your code.
If you don't care to add and remove the log commands over and over (they obviously cause some processing overhead with repeated filesystem access), you can make the logging conditional on a constant boolean switch, placed in any site-wide included file:
const MEM_LOG = true;
...
MEM_LOG && log_mem(__FILE__, __LINE__);
Remember to follow up and let us know what caused the memory leak. Good luck! ^_^
People think WordPress is easy. Even if you never touch the code, it is a very difficult system to manage and keep secure, and its complexity makes code customization very, very hard.
Your logs already show you where the system is running out of memory.
This is an image of the error shown on the website when the memory overflows
What you said here illustrates that you are very inexperienced in operating PHP websites. Your installation should not be writing errors to the browser. This makes me think that you are way out of your depth in trying to resolve this.
You have posted this on a programming forum, implying you may be writing custom code. In that case, the correct approach to resolving the error is to use a profiler to track where the memory is getting used up.
However, if in fact there is no custom code on the site, then you need to start by tracking memory usage (register a shutdown function for this; see the sketch below) and start disabling themes and plugins until you find the problem.
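A minimal sketch of that shutdown hook (the log destination and placement are assumptions; drop it into wp-config.php or a mu-plugin and adapt to your setup):
// Log peak memory per request via PHP's error log.
register_shutdown_function(function () {
    $peak = round(memory_get_peak_usage(true) / 1048576, 2);
    $uri  = $_SERVER['REQUEST_URI'] ?? '(cli)';
    error_log("Peak memory: {$peak} MB for {$uri}");
});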

How to prevent image bombs with ImageMagick?

I currently use Imagick library on PHP and use the resizing functionality of Image Magick. I've just learned about decompression bombs and how ImageMagick is vulnerable to it.
I have checked how we can ping the image and verify its dimensions without actually loading it into memory or onto disk. It's also safer to set ImageMagick's memory and disk limits so it won't just write a huge file to disk.
I've read that I can do that with setResourceLimit():
http://php.net/manual/en/imagick.setresourcelimit.php
IMagick::setResourceLimit(IMagick::RESOURCETYPE_MEMORY , 100);
IMagick::setResourceLimit(IMagick::RESOURCETYPE_DISK , 100);
$thumb = new Imagick('image.png');
$thumb->resizeImage(320,240,Imagick::FILTER_LANCZOS,1);
However, what happens is that after setting the disk and memory limits, if an image hits those limits, all I get is a segmentation fault; no exceptions are thrown. This makes it impossible for me to handle it properly.
Update:
Here are the package versions I'm using:
dpkg -l | grep magick
ii imagemagick-common 8:6.6.9.7-5ubuntu3.3 image manipulation programs -- infrastructure
ii libmagickcore4 8:6.6.9.7-5ubuntu3.3 low-level image manipulation library
ii libmagickwand4 8:6.6.9.7-5ubuntu3.3 image manipulation library
ii php5-imagick 3.1.0~rc1-1 ImageMagick module for php5
Setting the 'Resource Area' limit only sets the size at which images are no longer held in memory and are instead paged to disk. If you want to use that setting to actually restrict the maximum size of image that can be opened, you also need to set the 'Resource Disk' limit.
The code below correctly gives a memory allocation error for the image bombs taken from here.
try {
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_AREA, 2000 * 2000);
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_DISK, 2000 * 2000);
    $imagick = new Imagick("./picture-100M-6000x6000.png");
    $imagick->modulateImage(100, 50, 120);
    $imagick->writeImage("./output.png");
    echo "Complete";
}
catch(\Exception $e) {
    echo "Exception: ".$e->getMessage()."\n";
}
Output is:
Exception: Memory allocation failed `./picture-100M-6000x6000.png' # error/png.c/MagickPNGErrorHandler/1630
If you want to set the width and height resources, and have a version of ImageMagick >= 6.9.0-1, you should be able to set them directly using the values WidthResource = 9, HeightResource = 10:
//Set max image width of 2000
Imagick::setResourceLimit(9, 2000);
//Set max image height of 1000
Imagick::setResourceLimit(10, 1000);
These don't have to be set programmatically; you can set them through the policy.xml file installed with ImageMagick. ImageMagick reads that file and uses those settings if none are set in a program, which may be a more convenient way of setting them, as you can change them per machine.
This makes it impossible for me to handle it properly.
It makes it impossible for you to handle it in the same process. You can handle it just fine by running the image processing in a background task.
Personally I think anyone who uses Imagick in a server that is accessed directly by web browsers is nuts. It is far safer to run it as a background task (managed by something like http://supervisord.org/) and to communicate with that background task via a queue of jobs that need to be processed.
Not only does that solve the 'bad images can bring down my website' problem, it also makes it far easier to monitor resource usage, or shift the image processing to a machine with a faster CPU than a web-front end server needs.
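A rough sketch of that pattern (the directory-as-queue and all paths are placeholder assumptions; a real deployment would use Redis, beanstalkd, or similar, with the worker kept alive by supervisord):
<?php
// worker.php -- long-running image worker, supervised by supervisord.
while (true) {
    foreach (glob('/var/spool/image-jobs/*.json') as $jobFile) {
        $job = json_decode(file_get_contents($jobFile), true);
        try {
            $im = new Imagick($job['input']);
            $im->thumbnailImage(320, 240, true); // bestfit
            $im->writeImage($job['output']);
        } catch (ImagickException $e) {
            error_log("Job {$jobFile} failed: " . $e->getMessage());
        }
        unlink($jobFile); // consume the job, success or failure
    }
    sleep(1);
}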
Source - I'm the maintainer of the Imagick extension and I recently added this to the Imagick readme:
Security
The PHP extension Imagick works by calling the ImageMagick library.
Although the ImageMagick developers take good care in avoiding bugs it
is inevitable that some bugs will be present in the code. ImageMagick
also uses a lot of third party libraries to open, read and manipulate
files. The writers of these libraries also take care when writing
their code. However everyone makes mistakes and there will inevitably
be some bugs present.
Because ImageMagick is used to process images it is feasibly possible
for hackers to create images that contain invalid data to attempt to
exploit these bugs. Because of this we recommend the following:
1) Do not run Imagick in a server that is directly accessible from
outside your network. It is better to either use it as a background
task using something like SupervisorD, or to run it in a separate
server that is not directly accessible on the internet.
Doing this will make it difficult for hackers to exploit a bug, even
if one should exist in the libraries that ImageMagick is using.
2) Run it as a very low privileged process. As much as possible the
files and system resources accessible to the PHP script that Imagick
is being called from should be locked down.
3) Check the result of the image processing is a valid image file
before displaying it to the user. In the extremely unlikely event that
a hacker is able to pipe arbitrary files to the output of Imagick,
checking that it is an image file, and not the source code of your
application that is being sent, is a sensible precaution.
Starting with ImageMagick-6.9.0-1, "width" and "height" resource limits were added. From the command line, use "-limit width 32000", etc. ImageMagick's PNG decoder will bail out without decompressing the image if the width or height exceeds the specified limit.
The "area" resource is available in earlier versions of ImageMagick (and Imagick); however, the PNG decoder does not reject images based on the "area" limit (see Danack's comment).
In versions of ImageMagick earlier than 6.9.0, the width and height limits come from libpng and depend upon the libpng version. Current libpng versions (1.0.16 and later, 1.2.6 and later, 1.5.22 and later, and 1.6.17 and later) impose 1,000,000-row and -column limits. In versions 1.2.0 through 1.2.5, 1.5.0 through 1.5.21, and 1.6.0 through 1.6.16, the limits were 2.7 billion rows and columns by default.
Look for RESOURCETYPE_AREA in Imagick (I don't see _WIDTH or _HEIGHT in the manual that you referenced, so either Imagick or its manual needs to be updated). So try
IMagick::setResourceLimit(IMagick::RESOURCETYPE_AREA, 100000000); // 100 megapixels; PHP has no "100M" literal
to set a 100-megapixel limit. Hopefully, some future version of Imagick will support RESOURCETYPE_WIDTH and RESOURCETYPE_HEIGHT, to provide a better solution to the decompression-bomb vulnerability. See Danack's answer about setting these with the current version of Imagick.
all I get is a segmentation fault error, no exceptions are thrown
I'm guessing your segmentation fault comes from the resource limits being set too low for ImageMagick (and related delegates) to operate. The resource values are in bytes, not megabytes.
Imagick does throw an exception if a resource limit is reached. Usually something like...
"cache resource exhausted"
Decompression bombs, or zip bombs, are extremely difficult to identify. What you're doing by ping-ing the image and setting resource limits is the correct course of action. I would roughly outline a solution as...
// Define limits in application settings, or bootstrap (not dynamically!)
define('MY_MAGICK_MEMORY_LIMIT', 5e+8);
// Repeat for AREA, DISK, & etc.

// In application
$image = new Imagick(); // allocate IM structures
// Set limits on the instance
$image->setResourceLimit(Imagick::RESOURCETYPE_MEMORY, MY_MAGICK_MEMORY_LIMIT);
// Repeat for RESOURCETYPE_AREA, RESOURCETYPE_DISK, & etc.
$filename = 'input.png';
if ($image->pingImage($filename)) {
    // Validate that this image is what you're expecting
    // ...
    try {
        $image->readImage($filename); // <-- Bomb will explode here
        $image->resizeImage(320, 240, Imagick::FILTER_LANCZOS, 1);
    } catch (ImagickException $err) {
        // Handle error
    }
}
unset($image);
If you don't trust decompression, you can leverage Imagick::getImageCompression during ping validation to inspect which compression an image requires. The compression type is an integer that maps to the following enum (a sketch of such a check follows the enum):
typedef enum
{
    UndefinedCompression,
    B44ACompression,
    B44Compression,
    BZipCompression,
    DXT1Compression,
    DXT3Compression,
    DXT5Compression,
    FaxCompression,
    Group4Compression,
    JBIG1Compression,
    JBIG2Compression,
    JPEG2000Compression,
    JPEGCompression,
    LosslessJPEGCompression,
    LZMACompression,
    LZWCompression,
    NoCompression,
    PizCompression,
    Pxr24Compression,
    RLECompression,
    ZipCompression,
    ZipSCompression
} CompressionType;
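A sketch of that check during ping validation (the whitelist below is an assumption; tune it to the formats you actually accept):
<?php
$image = new Imagick();
$image->pingImage('input.png'); // reads metadata only, no decode

// PNG normally reports Zip compression; JPEG reports JPEG compression.
$allowed = [Imagick::COMPRESSION_ZIP, Imagick::COMPRESSION_JPEG];
if (!in_array($image->getImageCompression(), $allowed, true)) {
    throw new RuntimeException('Unexpected compression type, refusing to decode');
}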
MagickStudio (written in Perl) offers a good starting point for default resource limits and how they are checked against uploaded images (search for Ping).

PHP ZipArchive extractTo() stalls with large files

I am working on a script that uses the extractTo method of ZipArchive to unzip some pretty large files (some are over 10G). Everything works fine for small files, but I have been testing with files that are ~4G and noticed that the unzip works up to a point, then stops actually unzipping, while the PHP script appears to keep running. No errors or exceptions are thrown. From a terminal I sit in the folder and keep typing ls -la to watch the size of the extracted file grow. It does so for a while, then stops, while the script continues to run (watched via browser and via top). The script then runs for the specified timeout period (I set it to 3600) and throws a time-out error. The box is running CentOS 6.6 with 16G of RAM, plenty of processing power, and plenty of disk space. It seems to stop at 5011800064 bytes unzipped. Here are some select bits of my code:
set_time_limit(1200);
ini_set('memory_limit', '1024M');
$zip = new ZipArchive;
$res = $zip->open($zipPath);
if ($res === TRUE)
{
    $extres = $zip->extractTo(
        $unzipPath,
        $filesToExtract
    );
    $zip->close();
}
Any help would be greatly appreciated. I am also curious to know whether the extractTo() function tries to load the whole zip into memory. I have scoured the PHP documentation and cannot find anything relevant; all of the related posts either have no answers or were not specific in their explanation of the problem.
Edit: Just to confirm, I have over 20G free and setting different memory limits for the script doesn't change the number of bytes unzipped.
Update: I have scoured httpd.conf, php.ini, and cannot find any settings that are prohibiting the unzip operations from working.
A traditional .zip archive is limited in size to 4GB:
The maximum size for both the archive file and the individual files inside it is 4,294,967,295 bytes (2^32 − 1 bytes, or 4 GiB minus 1 byte) for standard .ZIP, and 18,446,744,073,709,551,615 bytes (2^64 − 1 bytes, or 16 EiB minus 1 byte) for ZIP64.
To decompress larger archives using ZIP64 in PHP, you'll need to use PHP 5.6, which bundles libzip 0.11.2.
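On older setups where upgrading isn't possible, a common workaround is shelling out to the Info-ZIP unzip binary, which supports ZIP64 (a sketch; the paths are placeholders):
<?php
// Fallback extraction via the system unzip(1), which handles ZIP64 archives.
$zip  = escapeshellarg('/path/to/archive.zip');
$dest = escapeshellarg('/path/to/extract/dir');
exec("unzip -o {$zip} -d {$dest} 2>&1", $output, $status);
if ($status !== 0) {
    throw new RuntimeException("unzip failed:\n" . implode("\n", $output));
}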
