Today I added a new feature to the content management system I'm building. Depending on where an image is being uploaded to, PHP resizes it to fit the designated location. It works quite well, but when I try to upload a larger image, around 3MB, I get a fatal error:
Fatal error: Allowed memory size of 134217728 bytes exhausted
(tried to allocate 42520 bytes) in...
I'd think 128MB of memory is quite a lot, considering I don't run that much... at least I don't think so. The script tried to allocate another 42520 bytes for the resizing process, but failed.
My question is: should I (A) increase the limit, or (B) re-evaluate why I'm using so much memory in the first place? Is 128MB a reasonable number, or is it too large or too small?
Thanks,
Ryan
RESOLUTION
I concluded that 128MB really is too much for resizing an image, and I was so focused on looking at other options, like exec() options, that I never took a closer look at my "sample" data. It turns out that even though my large image was only 2.83MB, it was OVER 10,000px wide. That's a problem. :)
GD stores images as uncompressed bitmaps in memory, so I wouldn't completely rule out the possibility that with some JPEGs (for example, high-resolution, highly compressed ones) the bitmap version could be quite large.
Call memory_get_usage() right before you open and start resizing the image, and see whether that amount of memory is already too big.
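For example, a minimal sketch of that instrumentation ($uploadedFile is a placeholder, not a variable from the question):
echo 'Before load: ' . memory_get_usage() . " bytes\n";
$img = imagecreatefromjpeg($uploadedFile); // the decoded bitmap lands in memory here
echo 'After load: ' . memory_get_usage() . " bytes\n";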
Also just FYI, when the script says "(tried to allocate 42520 bytes)", that doesn't mean it just needed 42520 more bytes to run successfully. It just means at that moment it needed 42520 more bytes. A bit later it might have tried to allocate more memory. So, just adding 42520 more bytes to the memory total likely wouldn't have fixed anything.
How are you resizing the image? I hope you're using a library function like imagecopyresampled()? If so, you shouldn't need 128MB of RAM to resize a 3MB image; that points to something being done incorrectly or inefficiently.
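For reference, a typical imagecopyresampled() resize looks something like the sketch below (the resizeJpeg name, the size cap, and the quality value of 85 are my own choices, not taken from the question):
function resizeJpeg(string $src, string $dst, int $maxW, int $maxH): void
{
    [$w, $h] = getimagesize($src);
    $scale = min($maxW / $w, $maxH / $h, 1); // never upscale
    $newW = (int) round($w * $scale);
    $newH = (int) round($h * $scale);

    $srcImg = imagecreatefromjpeg($src);          // decoded source bitmap
    $dstImg = imagecreatetruecolor($newW, $newH); // blank destination bitmap
    imagecopyresampled($dstImg, $srcImg, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dstImg, $dst, 85);

    // Free both bitmaps as soon as possible; this is where the memory goes.
    imagedestroy($srcImg);
    imagedestroy($dstImg);
}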
You shouldn't ever need that much memory allocated. Re-evaluate your code to reduce its consumption and make sure you aren't storing data superfluously.
Rough estimate of how much memory is consumed by a TrueColor image:
width x height x 4 bytes
If your users are informed about the maximum dimensions (in pixels) of an uploaded image, then you can determine the minimum RAM you have to allocate.
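As a sketch of that pre-flight check (the 1.8 fudge factor, the $file variable, and the hard-coded 128MB ceiling are assumptions for illustration):
[$width, $height] = getimagesize($file); // reads the header only, cheap
$estimated = $width * $height * 4 * 1.8; // bitmap size plus working overhead
if ($estimated > 128 * 1024 * 1024) {
    exit('Image dimensions are too large to process within the memory limit.');
}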
BTW, also account for the temporary copies created by operations like format conversion, copying, and so on.
With 64MB you have a lot to work with. If you need more than that, you need something that cleans up memory as it goes. 128MB should be the most a script would ever need, and that would have to be a very big script indeed.
Related
I have a question about PHP memory size.
In my php.ini file, the maximum amount of memory a script may consume is 128MB (memory_limit = 128M).
But I got this error message: Fatal error: Allowed memory size of 157286400 bytes exhausted. Which means that I'm consuming 1GB of memory!
And when I added the line ini_set('memory_limit','200M'); it worked. I don't understand how that can be possible!
Because of math. 157286400 bytes is 150MB (150 × 1024 × 1024 bytes), nowhere near a GB and well within the 200M limit you set.
The answer to the old riddle "How do you get four elephants into a Mini?" is "two in the front and two in the back."
My point, though, is that it's clearly ridiculous to be putting elephants in a Mini, as a Mini, or any other car, is not designed to carry elephants.
The same applies to the browser - treat it like a vehicle that will perform nicely when you give it a light load, but will struggle or break when you overload it.
I have no idea what your page is doing, but there are always ways to reduce the load - is it too much data, too many images, too much HTML that is causing you to up the memory limit?
It is possible to increase the memory limit, but that's just bad practice. Think about what you are doing and how you can make the page more efficient. If it's a list of 5,000 database items, for example, you could paginate the data, displaying only 50 per page (see the sketch below). If each item shows an avatar or some kind of image, memory usage climbs quickly.
It's just common sense, really: make pages that load quickly and work effortlessly.
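As a sketch of the pagination idea (PDO and an items table are assumed here, purely for illustration):
$page = max(1, (int) ($_GET['page'] ?? 1));
$perPage = 50;
$offset = ($page - 1) * $perPage;

$stmt = $pdo->prepare('SELECT id, title FROM items ORDER BY id LIMIT :limit OFFSET :offset');
$stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(); // only 50 rows in memory, not 5,000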
I'm using XAMPP for my project. I'm trying to upload really big images and I've noticed that it doesn't work with all images.
After trying it a few times, I came to the conclusion that images wider than roughly 6500px do not upload.
I've also found that the file size doesn't seem to matter, since a 1.4MB image more than 6500px wide won't upload, while another at 4.8MB but with a smaller resolution uploads without any problem.
So the reason the image is not being uploaded seems to be the resolution, not the file size.
The only code I have to show is the upload itself. There's nothing special about it; as mentioned, other images upload perfectly fine, only the ones with too high a resolution don't.
PHP code:
move_uploaded_file($imageUploadFile, $target_original);
php.ini
post_max_size=10000M
upload_max_filesize=10000M
Is there any solution to this problem? Do I need to specify somewhere that I want to upload high resolution images?
This is really important, since I want to be able to upload 8K to 16K images. At the moment this doesn't work: even when the file size should be small enough, the image won't upload for some reason.
I wouldn't be looking in the upload-size department but in the (allowed) memory-size department (e.g. memory_limit). I bet you're using ImageMagick or something similar to actually do something with the image.
Also see here and here. Just make sure you read the documentation because the values are supposed to be specified in bytes, not megabytes (also see the comments on those answers).
I would try something like:
$limit = 2 * (1024 * 1024 * 1024); // 2GB
// set memory limit
ini_set('memory_limit', $limit); // for testing purposes you could try -1 (unlimited) instead of $limit
// pixel cache max size
Imagick::setResourceLimit(Imagick::RESOURCETYPE_MEMORY, $limit);
// maximum amount of memory map to allocate for the pixel cache
Imagick::setResourceLimit(Imagick::RESOURCETYPE_MAP, $limit);
What the actual limit should be will, I guess, have to be found by trial and error, and will also depend on the amount of memory available, of course. If you're on shared hosting then this might (or, most likely, will) be a problem.
I had a similar case in the past. Quite a strange solution, but it worked for me.
Try to specify the size with MB, not M
upload_max_filesize = 256MB
post_max_size = 256MB
It should work. If not, try to increase memory_limit.
I hope it helps.
Some Updates:
I've looked into my JavaScript code a little and found a few interesting non-working implementations.
It seems that this was all a client-side problem... or at least I think it is. For some reason my onprogress function doesn't work correctly. I've tried to upload images with a bigger delay, and sometimes that worked out... other times it didn't, though.
I'm not really sure the client-side problem is causing all of this. I'll probably just have to fix the front-end issue and hope the back-end issue resolves itself.
Either way I'm going to update this question as soon as I've tried to fix everything.
There are several places where this can fail:
the size of the POST allowed by the web server (if the upload is base64-encoded, it is larger than the raw file size)
the time limit allowed by the webserver for a client to make a request
the max upload size allowed by PHP
the memory available to PHP to load and process the image (assuming you do anything other than move_uploaded_file())
Except for the last of these, none of them has anything to do with the dimensions of the image.
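For reference, the PHP-side directives behind those limits look roughly like this (example values only; the web server's own request timeout, e.g. Apache's Timeout, is configured separately):
php.ini
post_max_size = 64M        ; total POST body; base64 encoding makes it larger than the file
upload_max_filesize = 48M  ; per-file limit; must not exceed post_max_size
max_execution_time = 120   ; seconds a request may run
memory_limit = 256M        ; only matters if you decode or process the image in PHP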
Has anyone noticed that core/modules/rest/src/Plugin/rest/resource/EntityResource.php will try to allocate a huge amount of memory (over 200MB) at the end of post() for a large variety of (>1MB) JPEG images?
It raises a memory exception at $response = new ResourceResponse(NULL, 201); to be exact. It does not throw the EntityStorageException.
Why might the post() in EntityResource.php try to allocate so much memory when posting an image which, by the way, DOES get saved into files/private? What might validate() be doing?
Looks like it was a beta problem. Gone in RC2.
I'm stuck on this problem. I'm merging multiple PDF files into one, depending on which PDFs the client chooses.
If I choose the smallest PDFs and merge them, it works fine, but as soon as they're a little bigger, around 1MB, I get Fatal error: Allowed memory size of xxxxxx bytes exhausted (tried to allocate xxxx).
I know it's a php.ini problem; I'd just set the limit higher, but I can't change it unless I pay for a business account...
Is there any workaround, like lowering the PDF quality/size and then raising it back up? I really don't know what to do :S
You can try it yourself here: pdf.devharis.com
Choose the two cheapest and order them..., then try some bigger ones; it crashes...
I'm not going to try that site because I'm not going to give you my information and I don't read whatever language that is, but I will try to answer your question.
Altering PDFs is an intensive task; if you are using cheap hosting, it may be time to upgrade. Also, knowing how much memory you're allowed would be helpful.
Is there a way to prevent the PHP GD image library from running out of memory? If too large an image is uploaded, GD tends to run out of memory, terminating the script. I'd like it to throw a catchable exception or something to that effect, but alas it doesn't.
Right now I'm using a cobbled-together script that first issues ini_set('memory_limit', '128M'); if that works, I'm usually all set. Depending on the server configuration, though, that may not be possible, so I fall back on an algorithm that tries to estimate the amount of memory needed (taking resolution, color depth, channels, and a fudge factor into account), then compares it to memory_get_usage() if the function exists, or otherwise makes a rough estimate.
The whole thing works so far, but it's far from elegant and will fail in some edge cases, I'm sure. Is there any better way to do this, i.e. have GD fail gracefully if it has to, instead of grinding everything to a halt?
Buy more memory! :-P
Seriously though, it is impossible to handle being out of memory because any action you take would require more memory.
Your best bet is to limit the size of image being uploaded based on the current memory settings.
After you create an image, output it and free it:
imagepng($image);
imagedestroy($image);
This will release the memory the image was using.
There is another way to do it, but it can be time-consuming, since certain parts of the image-editing process get repeated. Set the memory limit to your estimated value, then try to process the image; if it fails, catch the error, increase the memory limit, and process the image again, repeating until you either succeed or hit a hard ceiling, at which point you'd show the user an error explaining that their image is too big to be used.
Edit: To catch the out-of-memory error, you could use this solution: http://au2.php.net/set_error_handler#35622
To catch PHP's fatal errors, like "Out of memory" or "PHP Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to allocate … bytes) in", see here : http://php.net/manual/en/function.set-error-handler.php#88401
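A minimal sketch of that shutdown-handler approach (the message prefix check is a heuristic, not an official API):
register_shutdown_function(function () {
    $error = error_get_last();
    if ($error !== null && strpos($error['message'], 'Allowed memory size') === 0) {
        // The script cannot recover here, but it can still respond gracefully.
        http_response_code(500);
        echo 'The uploaded image was too large to process.';
    }
});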
Do some tests to check how much memory each GD function needs.
imagecreatetruecolor() seems to need about width x height x 5 bytes.
imagepng() seems to need about width x height x 4 bytes.
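One way to run such a test yourself (a sketch; the 2000 x 1500 dimensions are arbitrary):
$before = memory_get_usage(true);
$img = imagecreatetruecolor(2000, 1500);
$after = memory_get_usage(true);
echo ($after - $before) / (2000 * 1500) . " bytes per pixel\n";
imagedestroy($img);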
Your best bet is to stop trying to figure out how much RAM it will need and just max it out at the outset: if you have 4GB available, tell the image script it can use between 2 and 4GB or so, and when the script ends the limit goes back to normal. That covers all potentially fatal situations. It's the only "fail-safe" approach I can think of, anyway...