GD fails to create JPG - php

I have an issue with GD not creating a new JPG file; it just fails, with no error messages and no indication of what is happening. This is not new code: it has been in place and working for the past six years, but it has suddenly started failing on larger images.
As background, this is running on an old server (to be switched off and moved to a new site on PHP 8 in a couple of months' time) that has PHP 5.3.3 with GD version 2.0.34.
The code creates thumbnails from the high-res image (around 24-30MB), outputting a series of thumbnails from 150px wide to 1024px wide. It fails on all of them. I have increased the PHP memory limit on the page to 512MB, and set the gd.jpeg_ignore_warning flag for corrupt JPGs.
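For reference, those two settings look roughly like this at the top of the script (a sketch of the configuration the question describes, not new advice):
// Raise the per-page memory limit and ignore recoverable libjpeg warnings.
ini_set('memory_limit', '512M');
ini_set('gd.jpeg_ignore_warning', 1);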
But every time with these files, this line:
$src_img = @imagecreatefromjpeg($file_path);
just returns FALSE, but never falls over with an error. The file is definitely there (run the same code with a file of the same name that is <20MB and it works fine), and there is plenty of disk space/memory available for a 60MB file to be processed in memory, so I don't see that that is the issue.
A search of Google, StackOverflow and several other sites has not produced any similar issues.
Can anyone offer any thoughts as to what the issue/solution is? I have been looking at this for two days now, and we need to resolve it - simply using smaller JPG files isn't an option for this.

Thanks to @Lessmore's answer above, GD was reporting an invalid JPG file, so a further search revealed this answer on StackOverflow, which solved the problem by reading the JPG from a string, rather than from a file:
Reading an invalid JPG with GD
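In code, that string-based workaround looks roughly like this (a sketch, assuming $file_path as in the question):
// Read the raw bytes and let GD detect the format itself; the linked
// answer reports this path tolerates files that imagecreatefromjpeg()
// rejects as invalid.
$data = file_get_contents($file_path);
$src_img = ($data !== false) ? imagecreatefromstring($data) : false;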
Thanks all - as ever!

Related

ImageMagick / GraphicsMagick / libvips Images randomly corrupted

We are using ImageMagick to resize/thumbnail JPGs to a specific size. The source file is loaded via HTTP. It works as expected, but from time to time some images come out partially broken.
We have already tried different software such as GraphicsMagick and VIPS, but the problem persists. It also only seems to happen when there are parallel processes, so we locked the whole script via semaphores, but that does not help either.
We found multiple similar problem reports, but all without any solution: https://legacy.imagemagick.org/discourse-server/viewtopic.php?t=22506
We also wonder why the behaviour is the same in all of these programs. We tried different PHP versions as well. It seems to happen more often on source images with large dimensions/filesize.
Any idea what to do here?
I would guess the source image has been truncated for some reason. Perhaps something timed out during the download?
libvips is normally permissive, meaning that it'll try to give you something, even if the input is damaged. You can make it strict with the fail flag (i.e. fail on the first warning).
For example:
$ head -c 10000 shark.jpg > truncated.jpg
$ vipsthumbnail truncated.jpg
(vipsthumbnail:9391): VIPS-WARNING **: 11:24:50.439: read gave 2 warnings
(vipsthumbnail:9391): VIPS-WARNING **: 11:24:50.439: VipsJpeg: Premature end of JPEG file
$ echo $?
0
I made a truncated JPG file, then ran vipsthumbnail on it. It gave a warning, but did not fail. If I run:
$ vipsthumbnail truncated.jpg[fail]
VipsJpeg: Premature end of input file
$ echo $?
1
Or in php:
$thumb = Vips\Image::thumbnail('truncated.jpg[fail]', 128);
Now there's no output, and there's an error code. I'm sure there's an imagemagick equivalent, though I don't know it.
There's a downside: thumbnailing will now fail if there's anything wrong with the image, and it might be something you don't care about, like invalid resolution.
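For ImageMagick, the -regard-warnings option looks like the closest equivalent, since it promotes warnings to hard errors; treat that as an assumption to verify on your version:
$ convert -regard-warnings truncated.jpg thumb.jpg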
After some additional investigation we discovered that the source image was indeed already damaged. It was downloaded via a VPN connection which was not stable enough. Sometimes the download stopped, so the JPG was only half written.

How to override "Suggested maximum size is 2500 pixels" on media upload page - wordpress

My client needs to upload high-res images for her online press kit.
She is getting this error: "Post-processing of the image failed likely because the server is busy or does not have enough resources. Uploading a smaller image may help. Suggested maximum size is 2500 pixels."
The images she wants to upload are about 2.5MB in size, and are 4272 x 2848 with 72dpi.
If I crop the images to 2500x1667 at 72dpi, they upload fine (meeting the 2500-pixel suggested max size).
Is there a way to allow the larger pixel images as indicated above (4272 x 2848)?
I am not sure which PHP setting is the issue. I think it might be memory size, but if it is, I am not sure where to change it or what amount to set it to in order to allow twice the pixel max size (going from 2500 to, say, 5000 pixels), or whether that is even allowed.
Any help would be appreciated.
Here are my system details:
WordPress Version: 5.5.1
MySQL Version: 5.6.41
BootStrap Version: 3.3.1
PHP version 7.3.22 (Supports 64bit values)
PHP max input variables 1000
PHP time limit 30
PHP memory limit 256M
Max input time 60
Upload max filesize 256M
PHP post max size 260M
Thanks!
I ran into this problem. Disabling big_image_size_threshold didn't fix it. I think my issue is that after upgrading to PHP 7.4, the version of ImageMagick running on my host for PHP 7.4 is bad or something. I fixed the issue by using GD instead of ImageMagick. Just add this to functions.php:
add_filter('wp_image_editors', function($editors) {
    return ['WP_Image_Editor_GD', 'WP_Image_Editor_Imagick'];
});
One thing to note: If you don't have GD installed, WP will default back to using Imagick. So there's little risk in making this change. Though if it doesn't resolve the problem, you might want to check that GD is actually installed.
You can use the big_image_size_threshold filter to change or disable this behavior.
https://developer.wordpress.org/reference/hooks/big_image_size_threshold/
If the original image width or height is above the threshold, it will be scaled down. The threshold is used as max width and max height. The scaled down image will be used as the largest available size, including the _wp_attached_file post meta value.
Returning false from the filter callback will disable the scaling.
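For example, in a theme's functions.php (a sketch; 5000 matches the size the OP wants to allow):
// Raise the threshold so images up to 5000px are kept as-is.
add_filter('big_image_size_threshold', function ($threshold) {
    return 5000;
});
// Or disable the scaling entirely:
add_filter('big_image_size_threshold', '__return_false');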
This verified answer didn't work for me
The apache2 errors should always be in
/var/log/apache2/error.log
so you can identify the problem easily.
I fixed it by myself:
sudo apt-get install php-mbstring
sudo service apache2 restart
See this post; there are many solutions for this: Link
I encountered this same issue and struggled with it for nearly a whole day, trying answers from this page and from this WordPress topic as well: https://wordpress.org/support/topic/post-processing-of-the-image-failed-error/
Eventually, what resolved the problem for me was going to the WordPress Updates page and simply re-installing WordPress; everything has been perfectly fine since.
I am running a network on WP version 5.7 with php 7.3
Main cause: if you are using an image-compression plugin in WordPress, this can happen.
In my case I was using WP Compress in WordPress, and I got this error mostly when uploading images. After I deactivated the plugin, the problem was fixed.

Replace image on server and show previous image until the new one is fully uploaded

I'm uploading an image taken from a Raspberry Pi camera every minute to a LiteSpeed server through ncftpput.
I want to be able to show the updated image and I know how I can force the browser to use the latest version instead of the cached image.
So everything works properly, except that if I refresh the image (e.g. with Shift-F5) during an image upload, the browser reports that the image contains errors (or shows it only partially).
Is there any way to ensure the image is fully uploaded before it is served in place of the old one?
I'm not sure if I should operate on ncftp or use PHP to make sure the swap happens only after the upload is complete.
The image is a progressive JPG, but that doesn't help...
Any suggestion?
Thanks
I ended up NOT using FTP because, as Viney mentioned, the webserver doesn't know when the upload is complete.
I'm now using curl, which has the advantage of being preinstalled on the Raspberry Pi distro, together with a PHP upload page.
PHP exposes the new image only once it is fully uploaded, which avoids the issue of the image being served while still partially uploaded.
So, to recap:
Raspberry Pi (webcam), after having taken the image:
curl -F"somepostparam=abcd" -F"operation=upload" -F"file=#filename.jpg" https://www.myserver.com/upload.php
PHP server code:
$uploadfile = '/home/domain/myserver.com/' . basename($_FILES['file']['name']);
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile);
$content = file_get_contents($uploadfile);
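This works because move_uploaded_file() only runs after PHP has received the entire request body, so the finished file replaces the old one in a single step. The same guarantee can be made explicit with a write-then-rename pattern (a sketch; the .part suffix is arbitrary):
// rename() within one filesystem is atomic on POSIX systems, so a
// reader sees either the old image or the new one, never a partial file.
$tmpfile = $uploadfile . '.part';
move_uploaded_file($_FILES['file']['tmp_name'], $tmpfile);
rename($tmpfile, $uploadfile);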
The problem is this: you open the browser (at 07:00:10 AM) and image.jpg gets rendered. Now say it's 07:01:00 and you hit refresh in the browser, but the Raspberry Pi has already started uploading image.jpg. Say it takes 3 seconds to complete the full upload; the server doesn't know about the FTP transfer and will read whatever bytes are currently in image.jpg and flush them to your browser. Had it been a baseline JPG it would have shown an image cropped in height, but since it's a progressive JPG it gets messed up. I am not aware of whether it's possible, but try looking into whether your FTP server supports file locking.
How to solve ?
The best way is to let your server know that the file it is accessing is being written. If your FTP server supports advisory locks, you can use them: when the server tries to access your file (via a system call), the kernel will tell it that the file is currently locked, and the server will wait until FTP releases that lock.
In vsftpd there is an option, lock_upload_files, in vsftpd.conf; setting it to YES enables this feature.
If you are unable to work out the above solution, you can use a trick: check the file's last-modified time, and if it is almost the same as the current time, make PHP wait for some guessed time that you think is the average upload time of your file. For this method you should use PHP to send the image to the browser instead of serving it statically, i.e. just change the src of your image from '/path/to/image.jpg' to 'gen-image.php'. This script will read the image and flush it to the browser.
gen-image.php
$guessedUploadTime = 3; // guessed time (in seconds) that ncftpput takes to finish
$currTime = time();
$modTime = filemtime('image.jpg');
if (($currTime - $modTime) < $guessedUploadTime) {
    sleep($guessedUploadTime);
}
$file = '/path/to/image.jpg';
$type = 'image/jpeg';
header('Content-Type: ' . $type);
readfile($file);
Note that the above solution is not ideal: if the file has just finished uploading, it won't be modified for another 57 seconds, yet a browser request at, say, 07:02:04 has to wait unnecessarily for 3 seconds, because mtime would be 07:02:03 and the browser would get the file only after 07:02:06. I would recommend searching for some way (probably command-line based) to make the server and FTP work hand in hand; one should know the status of the other, because that is the root cause of this problem.

How to prevent image bombs with ImageMagick?

I currently use the Imagick library on PHP and use the resizing functionality of ImageMagick. I've just learned about decompression bombs and how ImageMagick is vulnerable to them.
I have checked how we can ping the image and verify its dimensions without actually loading it into memory/disk. It's also safer to limit ImageMagick's memory and disk usage so it won't just write a huge file to disk.
I've read and I can do that with setResourceLimit().
http://php.net/manual/en/imagick.setresourcelimit.php
IMagick::setResourceLimit(IMagick::RESOURCETYPE_MEMORY , 100);
IMagick::setResourceLimit(IMagick::RESOURCETYPE_DISK , 100);
$thumb = new Imagick('image.png');
$thumb->resizeImage(320,240,Imagick::FILTER_LANCZOS,1);
However, what happens is that after setting the disk and memory limits, if an image hits these limits, all I get is a segmentation fault; no exceptions are thrown. This makes it impossible for me to handle the error properly.
Update:
Here are the package versions I'm using:
dpkg -l | grep magick
ii imagemagick-common 8:6.6.9.7-5ubuntu3.3 image manipulation programs -- infrastructure
ii libmagickcore4 8:6.6.9.7-5ubuntu3.3 low-level image manipulation library
ii libmagickwand4 8:6.6.9.7-5ubuntu3.3 image manipulation library
ii php5-imagick 3.1.0~rc1-1 ImageMagick module for php5
Setting the 'Resource Area' limit only sets the size at which images are no longer held in memory but are instead paged to disk. If you want to use that setting to actually restrict the maximum size of image that can be opened, you also need to set the 'Resource Disk' limit.
The code below correctly gives a memory allocation error for the image bombs taken from here.
try {
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_AREA, 2000 * 2000);
    Imagick::setResourceLimit(Imagick::RESOURCETYPE_DISK, 2000 * 2000);
    $imagick = new Imagick("./picture-100M-6000x6000.png");
    $imagick->modulateImage(100, 50, 120);
    $imagick->writeImage("./output.png");
    echo "Complete";
} catch (\Exception $e) {
    echo "Exception: " . $e->getMessage() . "\n";
}
Output is:
Exception: Memory allocation failed `./picture-100M-6000x6000.png' # error/png.c/MagickPNGErrorHandler/1630
If you want to set the width and height resources, and have a version of ImageMagick >= 6.9.0-1, you should be able to by using the values directly: WidthResource = 9, HeightResource = 10.
// Set max image width of 2000 (WidthResource = 9)
Imagick::setResourceLimit(9, 2000);
// Set max image height of 1000 (HeightResource = 10)
Imagick::setResourceLimit(10, 1000);
These don't have to be set programmatically; you can set them through the policy.xml file installed with ImageMagick. ImageMagick reads that file and uses those settings if none are set in a program, which may be a more convenient way of setting them, as you can change them per machine.
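For illustration, the relevant policy.xml entries might look like this (a sketch; the resource names are ImageMagick's, the values here are arbitrary examples):
<policymap>
  <policy domain="resource" name="width" value="8KP"/>
  <policy domain="resource" name="height" value="8KP"/>
  <policy domain="resource" name="area" value="100MP"/>
  <policy domain="resource" name="disk" value="1GiB"/>
</policymap>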
This makes it impossible for me to handle it properly.
It makes it impossible for you to handle it in the same process. You can handle it just fine by running the image processing in a background task.
Personally I think anyone who uses Imagick in a server that is accessed directly by web browsers is nuts. It is far safer to run it as a background task (managed by something like http://supervisord.org/) and communicate with that background task via a queue of jobs that need to be processed.
Not only does that solve the 'bad images can bring down my website' problem, it also makes it far easier to monitor resource usage, or shift the image processing to a machine with a faster CPU than a web-front end server needs.
Source - I'm the maintainer of the Imagick extension and I recently added this to the Imagick readme:
Security
The PHP extension Imagick works by calling the ImageMagick library.
Although the ImageMagick developers take good care in avoiding bugs it
is inevitable that some bugs will be present in the code. ImageMagick
also uses a lot of third party libraries to open, read and manipulate
files. The writers of these libraries also take care when writing
their code. However everyone makes mistakes and there will inevitably
be some bugs present.
Because ImageMagick is used to process images it is feasibly possible
for hackers to create images that contain invalid data to attempt to
exploit these bugs. Because of this we recommend the following:
1) Do not run Imagick in a server that is directly accessible from outside your network. It is better to either use it as a background task using something like SupervisorD, or to run it in a separate server that is not directly accessible on the internet.
Doing this will make it difficult for hackers to exploit a bug, even
if one should exist in the libraries that ImageMagick is using.
2) Run it as a very low privileged process. As much as possible the
files and system resources accessible to the PHP script that Imagick
is being called from should be locked down.
3) Check that the result of the image processing is a valid image file
before displaying it to the user. In the extremely unlikely event that
a hacker is able to pipe arbitrary files to the output of Imagick,
checking that it is an image file, and not the source code of your
application that is being sent, is a sensible precaution.
Starting with ImageMagick-6.9.0-1, "width" and "height"
resource limits were added. From the commandline, use "-limit width 32000", etc. ImageMagick's PNG decoder will bail out without decompressing the image if the width or height exceeds the specified limit.
The PNG decoder will not attempt to decompress images whose width or height exceeds the limits.
The "area" resource is available in earlier versions of ImageMagick (and Imagick); however the PNG decoder does not reject images based on the "area" limit (see Danack's comment).
In versions of ImageMagick earlier than 6.9.0, the width and height limits come from libpng, and depend upon the libpng version. Current libpng versions (1.0.16 and later, 1.2.6 and later, 1.5.22 and later, and 1.6.17 and later) impose 1,000,000-row and 1,000,000-column limits. In versions 1.2.0 through 1.2.5, 1.5.0 through 1.5.21, and 1.6.0 through 1.6.16, the limits were 2.7 billion rows and columns by default.
Look for RESOURCETYPE_AREA in Imagick (I don't see _WIDTH or _HEIGHT in the manual that you referenced, so either Imagick or its manual needs to be updated). So try
Imagick::setResourceLimit(Imagick::RESOURCETYPE_AREA, 100000000); // 100 megapixels
to set a 100-megapixel limit. Hopefully, some future version of Imagick will support RESOURCETYPE_WIDTH and RESOURCETYPE_HEIGHT, to provide a better solution to the decompression-bomb vulnerability. See Danack's answer about setting these with the current version of Imagick.
all I get is a segmentation fault error, no exceptions are thrown
I'm guessing your segmentation fault is from resources being set too low for ImageMagick (and related delegates) to operate. The values of the resource limits are in bytes, not megabytes.
Imagick does throw an exception if a resource is reached. Usually something like...
"cache resource exhausted"
Decompression bombs, or zip bombs, are extremely difficult to identify. What you're doing by pinging the image and setting resource limits is the correct course of action. I would roughly outline a solution as...
// Define limits in application settings, or bootstrap (not dynamically!)
define('MY_MAGICK_MEMORY_LIMIT', 5e+8);
// Repeat for AREA, DISK, & etc.

// In application
$image = new Imagick(); // allocate IM structures
// Set limits on instance
$image->setResourceLimit(Imagick::RESOURCETYPE_MEMORY, MY_MAGICK_MEMORY_LIMIT);
// Repeat for RESOURCETYPE_AREA, RESOURCETYPE_DISK, & etc.
$filename = 'input.png';
if ($image->pingImage($filename)) {
    // Validate that this image is what you're expecting
    // ...
    try {
        $image->readImage($filename); // <-- Bomb will explode here
        $image->resizeImage(320, 240, Imagick::FILTER_LANCZOS, 1);
    } catch (ImagickException $err) {
        // Handle error
    }
}
unset($image);
If you don't trust the decompression, you can use Imagick::getImageCompression during ping validation to inspect which compression an image requires. The compression type is an integer that maps to the following enum...
typedef enum
{
UndefinedCompression,
B44ACompression,
B44Compression,
BZipCompression,
DXT1Compression,
DXT3Compression,
DXT5Compression,
FaxCompression,
Group4Compression,
JBIG1Compression,
JBIG2Compression,
JPEG2000Compression,
JPEGCompression,
LosslessJPEGCompression,
LZMACompression,
LZWCompression,
NoCompression,
PizCompression,
Pxr24Compression,
RLECompression,
ZipCompression,
ZipSCompression
} CompressionType;
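For example, a rough sketch of that check during the ping stage (the filename and the chosen constant are illustrative):
$image = new Imagick();
$image->pingImage('input.png'); // reads metadata only, not the pixel data
if ($image->getImageCompression() === Imagick::COMPRESSION_ZIP) {
    // Deflate-compressed input (e.g. PNG) can expand massively when
    // decoded, so apply strict resource limits before a full read.
}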
MagickStudio (written in Perl) offers a good starting point for default resource limits and how they are checked against uploaded images (search for Ping).

PHP File Upload corrupted JPEGS

We have a web app using Andrew Valums' AJAX file uploader. If we kick off 5-10 image uploads at once, more often than not at least 2 or 3 will result in the same GD error, "Corrupt JPEG data":
Warning: imagecreatefromjpeg() [function.imagecreatefromjpeg]:
gd-jpeg, libjpeg: recoverable error: Corrupt JPEG data:
47 extraneous bytes before marker 0xd9 in ....
However, this did not happen on our old test server or on local development boxes; it only happens on our new production server.
The file size on the server is the same as the original on my local machine, so the upload completes, but I think the data is being corrupted by the server.
I can "fix" the broken files by deleting them and uploading again, or by manually uploading via FTP.
We had a shared host on GoDaddy and have just started to have this issue on a new box (that I set up, so that probably explains a lot :) CentOS 5.5+, Apache 2.2.3, PHP 5.2.10.
You can see some example good and bad picture here. http://174.127.115.220/temp/pics.zip
When I binary-diffed them I saw a consistent pattern: the corruption always occurs in 64-byte blocks, and while the distance between corrupted blocks is not constant, the number 4356 comes up a lot.
I really think we can rule out the Internet, as error checking and retransmission with TCP is pretty reliable; further, there seems to be no difference between browser versions, or whether I turn anti-virus and firewalls off.
So I'm suspecting the configuration of Apache/PHP?
Some cameras will append some data inside the file that will get interpreted incorrectly (most likely due to character encoding within the headers).
A solution I found was to read the file in binary mode, like so:
$fh = fopen('test.jpg', 'rb');
$str = '';
if ($fh !== false) {
    while (!feof($fh)) {
        $str .= fread($fh, 1024);
    }
    fclose($fh);
}
$test = @imagecreatefromstring($str);
imagepng($test, 'save.png');
Well, I think the problem is the JPEG header data, and as far as I know there is nothing PHP can do about it. I think the problem is your file uploader; maybe there is some configuration for it that you are missing.
Hmm, a 64-byte corruption? ...or did you mean 64-bit?
I'm going to suggest that the issue is in fact a result of the PHP script. The problem that regularly comes up here is that the script inserts CRLFs into the data stream being uploaded, caused by differences between the Windows/*nix line-ending standards.
The solution is to force the PHP script to handle the upload in binary mode (use the 'b' flag in ALL fopen() calls in the PHP upload). It is safe to open a text file in binary mode, as at least you can still see the data.
Read here for more information on this issue:
http://us2.php.net/manual/en/function.fopen.php
This can be solved with:
ini_set ('gd.jpeg_ignore_warning', 1);
I had this problem with GoDaddy hosting.
I had created the database on GoDaddy using their cPanel interface. It was created as "latin collation" (or something like that). The database on the development server was UTF8. I've tried all solutions on this page, to no avail. Then I converted the database to UTF8, and it worked.
Database encoding shouldn't affect BLOB data (or so I would think); BLOB stands for Binary Large OBject, to my knowledge!
Also, strangely, the data that was copied from the dev to the production server while the database was still "latin" was not corrupted at all; it's only when inserting new images that the problem appeared. So I guess the image data was being fed to MySQL as text data. I think there is a way (when using SQL) of inserting binary data, and I did not follow it.
Edit: just took a look at the MySQL export script, here it is:
INSERT INTO ... VALUES (..., _binary 0xFFD8FF ...
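The _binary introducer marks the literal as binary data. From PHP, binding the value as a LOB parameter achieves the same thing; a minimal sketch with PDO (table and column names are hypothetical):
$pdo = new PDO('mysql:host=localhost;dbname=mydb', $user, $pass);
$stmt = $pdo->prepare('INSERT INTO images (data) VALUES (?)');
$stmt->bindValue(1, file_get_contents('photo.jpg'), PDO::PARAM_LOB);
$stmt->execute();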
Anyway, hope this will help someone. The OP did not indicate what solved his problem...
