I'm having a hard time working with an image in PHP.
The image is, well... big. It's a high-definition picture (36 MP) with a file size of 23 MB.
I'm doing various things to this picture, like resizing it or converting it to greyscale.
The problem is, when I watch memory usage with htop in a terminal, I can see that Apache is using memory (a little more than 140 MB) but is not releasing it when the image processing is over.
I removed each image-processing function one by one and now have only these three simple lines, but the leak is still there:
$image = imagecreatefromstring( file_get_contents($imageFullPath) );
imagedestroy($image);
unset($image);
Does anyone have an idea why?
Thanks!
Jim
This is a common memory problem in PHP. unset($image) does not delete the image data from memory; it only removes the reference to it. Try emptying the image variable like this:
$image = NULL;
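To confirm whether PHP itself is holding on to the memory, it can help to measure before and after with memory_get_usage(). This is only a minimal sketch of such a check; the path is a placeholder:

$imageFullPath = '/tmp/huge-photo.jpg';   // hypothetical path

echo 'Before: ', memory_get_usage(true), " bytes\n";

$image = imagecreatefromstring(file_get_contents($imageFullPath));
echo 'Loaded: ', memory_get_usage(true), " bytes\n";

imagedestroy($image);   // frees the pixel data held by GD
$image = null;          // drop the reference as well
unset($image);

echo 'After:  ', memory_get_usage(true), " bytes\n";
echo 'Peak:   ', memory_get_peak_usage(true), " bytes\n";

If the 'After' figure drops back close to 'Before', PHP has released the memory internally; the resident size that htop reports for the Apache worker can stay high regardless, because the process does not necessarily return freed pages to the operating system.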
I have a PHP script that creates a very tall image and draws a lot of lines on it (an organizational web kind of look). For the tallest images I've tried creating, the line drawing just stops abruptly toward the middle to bottom of the image: http://i.imgur.com/4Plgr.png
I ran into this problem using imagecreate(), then I found out that imagecreatetruecolor() can handle larger images, so I switched to that. I'm still having the same problem, but the script can now handle somewhat larger images. I think it should be drawing about 1200 lines. The script doesn't take more than 3 seconds to execute. Here's an image that executed completely: http://i.imgur.com/PaXrs.png
I adjusted the memory limit with ini_set('memory_limit', '1000M'), but my scripts never come anywhere near that limit.
How do I force the script to keep drawing until it finishes? Or how can I use PHP to create an image using less memory (which I think is the problem)?
if(sizeof($array[0])<300)
    $image=imagecreate($width,$height);
else
    $image=imagecreatetruecolor($width,$height);

imagefill($image,0,0,imagecolorallocate($image,255,255,255));

for($p=0; $p<sizeof($linepoints); $p++){
    $posx1=77+177*$linepoints[$p][0];
    $posy1=-4+46*$linepoints[$p][1];
    $posx2=77+177*$linepoints[$p][2];
    $posy2=-4+46*$linepoints[$p][3];
    $image=draw_trail($image,$posx1,$posy1,$posx2,$posy2);
}

imagepng($image,"images/table_backgrounds/table_background".$tsn.".png",9);
imagedestroy($image);

function draw_trail($image,$posx1,$posy1,$posx2,$posy2){
    $black=imagecolorallocate($image,0,0,0);
    if($posy1==$posy2)
        imageline($image,$posx1,$posy1,$posx2,$posy2,$black);
    else{
        imageline($image,$posx1,$posy1,$posx1+89,$posy1,$black);
        imageline($image,$posx1+89,$posy1,$posx1+89,$posy2,$black);
        imageline($image,$posx1+89,$posy2,$posx2,$posy2,$black);
    }
    return $image;
}
I'm going to take a guess that you have created a memory leak, and as you do more operations on a larger image you are eventually hitting PHP's memory limit. Rather than raise the limit, it would be better to find the leak.
Try changing your code so it explicitly deallocates the color you are creating in draw_trail. Also, there is no reason to return $image since you are passing a resource around.
if(sizeof($array[0])<300)
    $image=imagecreate($width,$height);
else
    $image=imagecreatetruecolor($width,$height);

imagefill($image,0,0,imagecolorallocate($image,255,255,255));

for($p=0; $p<sizeof($linepoints); $p++)
{
    $posx1=77+177*$linepoints[$p][0];
    $posy1=-4+46*$linepoints[$p][1];
    $posx2=77+177*$linepoints[$p][2];
    $posy2=-4+46*$linepoints[$p][3];
    draw_trail($image,$posx1,$posy1,$posx2,$posy2);
}

imagepng($image,"images/table_backgrounds/table_background".$tsn.".png",9);
imagedestroy($image);

function draw_trail($image,$posx1,$posy1,$posx2,$posy2)
{
    $black=imagecolorallocate($image,0,0,0);
    if($posy1==$posy2)
        imageline($image,$posx1,$posy1,$posx2,$posy2,$black);
    else
    {
        imageline($image,$posx1,$posy1,$posx1+89,$posy1,$black);
        imageline($image,$posx1+89,$posy1,$posx1+89,$posy2,$black);
        imageline($image,$posx1+89,$posy2,$posx2,$posy2,$black);
    }
    // imagecolordeallocate() takes the image as its first argument
    imagecolordeallocate($image,$black);
}
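Going one step further than the answer above (this is only a suggested variation, not part of the original code), the colour could be allocated a single time before the loop and passed into draw_trail(), so nothing is allocated or deallocated per line at all:

// Sketch: allocate the line colour once and reuse it for every trail.
$black = imagecolorallocate($image, 0, 0, 0);

for ($p = 0; $p < sizeof($linepoints); $p++) {
    $posx1 = 77 + 177 * $linepoints[$p][0];
    $posy1 = -4 + 46 * $linepoints[$p][1];
    $posx2 = 77 + 177 * $linepoints[$p][2];
    $posy2 = -4 + 46 * $linepoints[$p][3];
    draw_trail($image, $black, $posx1, $posy1, $posx2, $posy2);
}

function draw_trail($image, $color, $posx1, $posy1, $posx2, $posy2)
{
    if ($posy1 == $posy2) {
        imageline($image, $posx1, $posy1, $posx2, $posy2, $color);
    } else {
        imageline($image, $posx1, $posy1, $posx1 + 89, $posy1, $color);
        imageline($image, $posx1 + 89, $posy1, $posx1 + 89, $posy2, $color);
        imageline($image, $posx1 + 89, $posy2, $posx2, $posy2, $color);
    }
}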
OP here. I figured out, or at least got around, my problem. I was drawing a lot of lines on the image, and the way my code gathered those points created a lot of duplicates, so I still suspect it was a memory problem. I consolidated all the points, removing the duplicates, and now it works just fine. Thanks to anyone who tried to help.
I'm searching for a PHP function which extracts the histogram of an image and saves it to a PNG file. This PNG file will be located in a different folder than the actual image, and the function must handle large images (over 3 MB). I did find a function close to what I need, but it cannot handle large images, and it didn't show the histogram or the image as shown on their website (it showed only a blank window with a border).
I hope that you guys can help me with this.
Thanks in advance.
We've been using this one for our projects:
http://www.histogramgenerator.com/
We did not experience issues with large images. It's not free, but we definitely feel it's worth the money we paid for it. The class also offers many additional interesting features.
Regards
Here is a script to draw a simple histogram like Photoshop does (only similar, because I suspect Photoshop scales both axes with a sigmoid function, or something like that).
I wrote a scale() function whose last boolean argument lets you switch between a linear histogram and a square-root scale that boosts low values.
<?php
// Just in case GD needs more memory
ini_set('memory_limit', '64M');

$filename = 'image1.png';

// Attempt to open
[$width, $height, $type] = getimagesize($filename);
if ($type == IMAGETYPE_PNG) {
    $img = imagecreatefrompng($filename);
}

// Histogram initialization
$hist = array(
    'red'   => array_fill(0, 256, 0),
    'green' => array_fill(0, 256, 0),
    'blue'  => array_fill(0, 256, 0)
);

// Counting colors
for ($x = 0; $x < $width; ++$x) {
    for ($y = 0; $y < $height; ++$y) {
        $bytes  = imagecolorat($img, $x, $y);
        $colors = imagecolorsforindex($img, $bytes);
        ++$hist['red'][$colors['red']];
        ++$hist['green'][$colors['green']];
        ++$hist['blue'][$colors['blue']];
    }
}

// Drawing histogram as a 256x128px image
$width  = 256;
$height = 128;
$newimg = imagecreatetruecolor($width, $height);

// Max frequency per channel for normalization
$maxr = max($hist['red']);
$maxg = max($hist['green']);
$maxb = max($hist['blue']);
$max  = max($maxr, $maxg, $maxb);

function scale($value, $max, $height, $scale = FALSE)
{
    $result = $value / $max;                       // normalization: value between 0 and 1
    $result = $scale ? $result ** 0.5 : $result;   // sqrt scale
    $result = $height - round($result * $height);  // scaling to image height
    return $result;
}

$top = 220; // 255 seems too bright to me
for ($x = 0; $x < $width; ++$x) {
    for ($y = 0; $y < $height; ++$y) {
        $r = ($y > scale($hist['red'][$x],   $maxr, $height, TRUE)) ? $top : 0;
        $g = ($y > scale($hist['green'][$x], $maxg, $height, TRUE)) ? $top : 0;
        $b = ($y > scale($hist['blue'][$x],  $maxb, $height, TRUE)) ? $top : 0;
        $colors = imagecolorallocate($newimg, $r, $g, $b);
        imagesetpixel($newimg, $x, $y, $colors);
    }
}

// Saving the histogram as you need
imagepng($newimg, './subfolder/histogram.png');
// Use the next lines, and remove the previous one, to show the histogram image instead
//header('Content-Type: image/png');
//imagepng($newimg);
exit();
?>
Note that I'm not checking whether the file exists, nor whether getimagesize() or imagecreatefrompng() failed.
I tested this with a 2 MB (5800 x 5800) PNG image. Basically, the imagecreatefrompng() call consumes a lot of memory.
So before making the call, I increased the memory limit all the way up to 512M and set the execution time to 5 minutes:
ini_set('memory_limit', '512M');
set_time_limit(5*60);
After the image is created, restore the memory limit:
$im = ImageCreateFromPng($source_file);
ini_restore('memory_limit');
Reference: http://www.php.net/manual/en/function.imagecreatefrompng.php#73546
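The referenced manual comment goes a step further and sizes the limit from the image itself instead of hard-coding 512M. Here is a rough sketch of that idea; the 1.7 overhead factor is only an assumption based on figures commonly quoted in the manual comments, not a guaranteed value:

// Sketch: raise memory_limit according to the image dimensions, then restore it.
$source_file = 'image1.png';                     // placeholder path
[$width, $height] = getimagesize($source_file);

$channels = 4;                                   // RGBA for a truecolor image
$fudge    = 1.7;                                 // assumed overhead factor
$needed   = memory_get_usage(true) + (int)ceil($width * $height * $channels * $fudge);

ini_set('memory_limit', (string)$needed);        // memory_limit accepts a plain byte count
$im = imagecreatefrompng($source_file);
ini_restore('memory_limit');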
I have a simple image upload script that uses SimpleImage.php
(http://www.white-hat-web-design.co.uk/blog/resizing-images-with-php/)
to resize and save 2 copies of an uploaded image.
There isn't a massive amount of validation, just checking that the file exists and that the extension is fine, plus a call to exif_imagetype().
This has worked with no problems so far until I tried to upload a seemingly normal jpeg which turned out to be invisibly (and untestably?) corrupt. There was something not right about it, but I know very little about image corruption - it looked fine and opened no problem on anything, but when I tried to save a scaled copy in my script I got a white page.
The problem is definitely that specific image; I've tested exhaustively with other images both from my local stock and from stock image sites, and only that one image breaks it.
I resized a copy using Photoshop (the predicted-file-size thingy gave me some weird numbers: 45 MB for a top-quality JPEG) and that uploaded with no issues.
So my question is, how do I test for this?
The image in question is here: http://chinawin.co.uk/broken.jpg //beware, 700k
notes: I've tested with similar resolutions, image sizes and names, everything else worked apart from this image.
UPDATE:
Through trial and error I've narrowed down where the script breaks to the line where I load the image into a var for SimpleImage. Strangely this is the second line that does so (the first being to create the large copy, this one to create a thumbnail).
Commenting it out means the rest works ok... perhaps some refactoring will avoid this problem.
2nd Update:
Here's a snippet of code and some context from the line that fails:
// check if our image is OK
if ($image && $imageThumb)
{
    // check if image is a jpeg
    if (exif_imagetype($_FILES[$k]['tmp_name']) == IMAGETYPE_JPEG)
    {
        list($width, $height, $type, $attr) = getimagesize($_FILES[$k]['tmp_name']);
        //echo 1;
        $image = new SimpleImage();
        //echo 2;
        $image->load($_FILES[$k]['tmp_name']);
        //echo 3;
        $imageThumb = new SimpleImage();
        //echo 4;
        // this next line topples my script, but only for that one image - why?:
        $imageThumb->load($_FILES[$k]['tmp_name']);
        //echo '5<br/><br/>-------<br/>';
        // do stuff, save & update db, etc
    }
}
Final edit:
Turns out my script was running out of memory, and with good reason: a 4900x3900 image at 240 ppi turns out to be around 48 MB when loaded into memory, twice, so I was probably using > 90 MB of RAM per image.
Hats off to @Pekka for spotting this.
Refactoring the script to load the image only once, and then using that variable instead of its sibling, fixed my script. I'm still having (different) issues with uploading larger (2.5 MB) images, but that is for another question.
This is most likely a memory issue: Your JPG is very large (more than 4000 x 4000 pixels) and, uncompressed, will indeed eat up around 48 Megabytes of RAM.
Activate error reporting to make sure. If it's the reason, see e.g. here on what to do: Uploading images with PHP and hitting the script memory limit
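For completeness, switching on error reporting for a quick test only takes a couple of lines at the top of the script (a sketch; on a live site you would log errors rather than display them):

// Show everything while debugging the upload script.
error_reporting(E_ALL);
ini_set('display_errors', '1');

// Optional: check how much headroom the script actually has.
var_dump(ini_get('memory_limit'), memory_get_peak_usage(true));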
Before attempting to resize an image in PHP using libGD, I'd like to check whether there's enough memory available to do the operation, because an "out of memory" completely kills the PHP process and can't be caught.
My idea was that I'd need 4 bytes of memory for each pixel (RGBA) in both the original and the new image:
// check available memory
if(!is_mem_available(($from_w * $from_h * 4) + ($to_w * $to_h * 4))){
return false;
}
Tests showed that this is much more memory than the library really seems to use. Can anyone suggest a better method?
You should check this comment out, and also this one.
I imagine it must be possible to find out GD's peak memory usage by analyzing imagecopyresampled's source code, but this may be hard, require extended profiling, vary from version to version, and be generally unreliable.
Depending on your situation, a different approach comes to mind: When resizing an image, call another PHP script on the same server, but using http:
$file = urlencode("/path/to/file");
$result = file_get_contents("http://example.com/dir/canary.php?file=$file&width=1000&height=2000");
(sanitizing the file parameter, obviously)
If that script fails with an "out of memory" error, you'll know the image is too large.
If it successfully manages to resize the image, it could return the path to a temporary file containing the resize result. Things would go ahead normally from there.
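A minimal sketch of what such a canary script could look like follows. The name canary.php, the parameter names, and the temp-file handling are all made up for illustration, and the sanitizing is reduced to a bare whitelist check against an assumed upload directory:

<?php
// canary.php: attempt the risky resize in a separate HTTP request.
// If it dies with an out-of-memory fatal error, only this request is lost.

$file   = $_GET['file']   ?? '';
$width  = (int)($_GET['width']  ?? 0);
$height = (int)($_GET['height'] ?? 0);

// Crude sanitizing: only allow files below a fixed upload directory (assumed path).
$base = realpath('/var/www/uploads');
$path = realpath($file);
if ($path === false || strpos($path, $base) !== 0 || $width < 1 || $height < 1) {
    http_response_code(400);
    exit;
}

$src = imagecreatefromjpeg($path);              // may fatal on huge images
$dst = imagecreatetruecolor($width, $height);
imagecopyresampled($dst, $src, 0, 0, 0, 0, $width, $height, imagesx($src), imagesy($src));

$tmp = tempnam(sys_get_temp_dir(), 'resized_') . '.jpg';
imagejpeg($dst, $tmp);

// On success, report where the caller can pick up the result.
echo $tmp;

The calling script would then treat an empty or failed response as "image too large" and a non-empty one as the path of the finished resize.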
Like so many before me, I'm writing a PHP script to do some image thumbnailing. The script has earned WOMM (works on my machine) certification, but when I move it to my host (1&1 Basic), there's an issue: images above a certain filesize cannot be processed. I've moved all operations to the filesystem, to be certain it's not some latent POST issue. Here's the relevant code:
function cropAndResizeImage( $imageLocation )
{
    //
    // Just to be certain
    //
    ini_set('display_errors','on');
    error_reporting(E_ALL);
    ini_set('memory_limit','128M');
    ini_set('max_execution_time','300');

    $image_info   = getimagesize($imageLocation);
    $image_width  = $image_info[0];
    $image_height = $image_info[1];
    $image_type   = $image_info[2];

    switch ( $image_type )
    {
        // snip...
        case IMAGETYPE_JPEG:
            $image = imagecreatefromjpeg($imageLocation);
            break;
        default:
            break;
    }
    // snip...
}
Using my mystical powers of println debugging, I've been able to determine that imagecreatefromjpeg is not returning; in fact, the script halts completely when it gets to it. Some facts:
This is correlated to filesize. Images under 1MB appear to be fine (spot-checked), but images around 3MB barf. No clue what the precise cutoff is, though.
This is not due to server timeouts; wget returns in <1s on 3MB images, significantly longer on "appropriately small" images (indicating no processing of large images).
Prefixing the function call with @ to suppress errors has no effect. This matches well with the fact that the script is not throwing an error; it's simply terminating silently upon this function call.
If I had to guess, there may be some GD parameter that I don't know about (or have access to) that limits input file sizes on 1&1's servers — the config variable guess is due to the fact that it barfs immediately, and doesn't appear (heuristically) to do any actual loading of or computations on the image.
Any suggestions? Thanks for the help.
Update (courtesy #Darryl's comments): calls to phpinfo indicate that PHP is updating the max_execution_time and memory_limit variables correctly. This doesn't necessarily mean that these resources are being allocated, simply that they appear to be functioning as expected.
Update 2: following some references from The Google, I tried optimizing the JPEG (reduced in quality from 3MB to 200KB) with no luck, so it's not an image filesize issue. I then tried reducing the number of pixels of the original 3888x2592 image, and the first successful size is 1400x2592 (1401x and 1402x both result in half-parses and errors indicating "malformed JPEG", which doesn't make much sense unless the entire image isn't being loaded). By reducing further to 1300x2592, I can instantiate the 400x300 thumbnail image that I'm actually looking for; at 1400x2592, the imagecreatetruecolor call I'm using to take care of that task fails silently in the same manner as imagecreatefromjpeg.
As to why this is, I'm a little uncertain. 1400 * 2592 is about 3.6 million pixels, which gives nothing particularly meaningful by itself, but I have to imagine this is a limit on the number of pixels GD + PHP will process.
Please see this note regarding memory usage on the php website.
*"The memory required to load an image using imagecreatefromjpeg() is a function of the image's dimensions and the images's bit depth, multipled by an overhead.
It can calculated from this formula:
Num bytes = Width * Height * Bytes per pixel * Overhead fudge factor"*
I'm guessing 1&1 doesn't allow you to change the script's memory_limit or max_execution_time, so it's probably running out of memory. Have you tried running phpinfo() to see what the limits are?
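That formula is easy to turn into a pre-flight check along the lines of the is_mem_available() idea from the earlier question. Here is a sketch; the overhead factor of roughly 1.7 is an assumption taken from the manual comments, not a documented constant:

// Rough check: will decoding this image fit inside the current memory_limit?
function can_probably_load($imageLocation, $fudge = 1.7)
{
    [$width, $height] = getimagesize($imageLocation);
    $needed = (int)($width * $height * 4 * $fudge);   // 4 bytes per pixel (RGBA)

    // Convert memory_limit shorthand ("128M", "1G", "-1") into bytes.
    $limit = ini_get('memory_limit');
    if ((int)$limit <= 0) {
        return true;                                  // -1 means "no limit"
    }
    $units = ['K' => 1024, 'M' => 1024 ** 2, 'G' => 1024 ** 3];
    $unit  = strtoupper(substr($limit, -1));
    $bytes = isset($units[$unit]) ? (int)$limit * $units[$unit] : (int)$limit;

    return memory_get_usage(true) + $needed < $bytes;
}

// Usage (hypothetical upload field):
// if (!can_probably_load($_FILES['photo']['tmp_name'])) { /* reject or downscale first */ }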