I have a question about php memory size.
In my php.ini file, the maximum amount of memory a script may consume is 128MB (memory_limit = 128M).
But I got this error message: Fatal error: Allowed memory size of 157286400 bytes exhausted. Which means that I'm consuming 1GB of memory!
And when I added the line ini_set('memory_limit', '200M'); it worked. I don't understand how that can be possible!
Because of math. 157286400 bytes is 150 MiB (157286400 / 1024 / 1024), nowhere near a GB and well within the 200M limit you set. Note, too, that the error reports a 150M limit in force, not the 128M from your php.ini, so something else had already raised it.
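You can check it for yourself with a quick sketch (run after the ini_set above):

echo ini_get('memory_limit');            // "200M" once your override is in place
echo 157286400 / 1024 / 1024, ' MiB';    // 150, the limit in force when the error fired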
The answer to the old riddle "How do you fit four elephants in a Mini?" is "two in the front and two in the back."
My point, though, is that it's clearly ridiculous to put elephants in a Mini, as a Mini, or any other car, is not designed to carry elephants.
The same applies to the browser - treat it like a vehicle that will perform nicely when you give it a light load, but will struggle or break when you overload it.
I have no idea what your page is doing, but there are always ways to reduce the load - is it too much data, too many images, too much HTML that is causing you to up the memory limit?
It is possible to increase the memory, but that's just bad practice - think about what you are doing, and how you can make the page more efficient. If it's a list of 5,000 database items, for example, you could paginate the data, displaying only 50 per page (see the sketch below). If each item shows an avatar, or some kind of image, the memory usage goes up quickly.
It's just common sense really: make pages that load quickly and work effortlessly.
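If it helps, here is a minimal pagination sketch - the $pdo connection and the items table are invented for illustration:

$perPage = 50;
$page    = isset($_GET['p']) ? max(1, (int) $_GET['p']) : 1;
$offset  = ($page - 1) * $perPage;

// Fetch only one page's worth of rows instead of all 5,000.
$stmt = $pdo->prepare('SELECT id, title FROM items ORDER BY id LIMIT :lim OFFSET :off');
$stmt->bindValue(':lim', $perPage, PDO::PARAM_INT);
$stmt->bindValue(':off', $offset, PDO::PARAM_INT);
$stmt->execute();

foreach ($stmt as $row) {
    echo htmlspecialchars($row['title']), "<br>\n";
}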
Related
Has anyone noticed that core/modules/rest/src/Plugin/rest/resource/EntityResource.php will try to allocate a huge amount of memory (over 200MB) at the end of post() for a large variety of (>1MB) JPEG images?
It hits the memory error at "$response = new ResourceResponse(NULL, 201);" to be exact. It does not throw the EntityStorageException.
Why might the post in EntityResource.php try to allocate so much memory when trying to post an image, which, by the way, DOES get saved into files/private? What may validate() be doing?
Looks like it was a beta problem. Gone in RC2.
I've stumbled on this problem. I'm merging multiple PDF files into one, depending on which PDFs the client chooses.
If I choose the smallest PDFs and merge them, it works fine, but as soon as they're a little bigger, around 1MB, I get Fatal error: Allowed memory size of xxxxxx bytes exhausted (tried to allocate xxxx).
I know it's a php.ini problem - just raise the limit - but I can't change it unless I pay for a business account...
Is there any workaround, like lowering the PDF quality/size and then raising it again? I really don't know what to do :S
You can try it yourself here: pdf.devharis.com
Choose the cheapest two and order them, then try some bigger ones - it crashes...
I'm not going to try that site because I'm not going to give you my information and I don't read whatever language that is, but I will try to answer your question.
Altering PDFs is an intensive task; if you are using cheap hosting, then it's time to upgrade. Also, knowing how much memory you're allowed would be beneficial.
I am using the PHPExcel framework to try to write out a very large excel document from a mysql query.
Everything works fine until I hit the 5000 row mark (or thereabouts), where the page gets flagged with the error:
Fatal error: Allowed memory size of xxx bytes exhausted (tried to allocate yyy bytes) in zzz on line aaa
I know this is documented, and I have adjusted the memory allocation on the server but even so I am still hitting the ceiling. I have also tried to turn off formatting, but in reality I will need it on.
So is there a way to write in small chunks, or append to the Excel document, so I don't exhaust the memory allocation? I am thinking along the lines of the page writing, say, 1000 lines, then redirecting to itself and processing the next 1000, using a GET parameter to keep track. For example:
index.php?p=0
then redirect to
index.php?p=1000
But I can't find a way to append to an existing document without opening up the whole thing.
There is no way of writing in chunks, although a common mistake is for people to load their MySQL data into an array, then loop through the array setting the Excel cell data. It's more memory-efficient to set the cell data as you loop through the MySQL result set.
If you need to keep memory usage to a minimum, what cell caching method are you using? Cell caching is slower, but can save significant amounts of memory.
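To illustrate both points - streaming from the result set and enabling cell caching - here is a rough sketch. PHPExcel must be installed; the $mysqli connection and the report table are invented for the example:

require_once 'PHPExcel.php';

// Enable cell caching BEFORE the workbook is created: cells are held
// gzipped in memory instead of as full objects.
PHPExcel_Settings::setCacheStorageMethod(
    PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip
);

$workbook = new PHPExcel();
$sheet    = $workbook->getActiveSheet();

// Stream rows straight from the (unbuffered) result set into cells,
// never building an intermediate array.
$result = $mysqli->query('SELECT col_a, col_b, col_c FROM report', MYSQLI_USE_RESULT);
$rowNum = 1;
while ($row = $result->fetch_row()) {
    foreach ($row as $colNum => $value) {
        $sheet->setCellValueByColumnAndRow($colNum, $rowNum, $value);
    }
    $rowNum++;
}

PHPExcel_IOFactory::createWriter($workbook, 'Excel2007')->save('report.xlsx');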
Today I added a new feature in the content management system I'm building. Depending on where you are uploading an image to, PHP will resize the image to fit the designated location. It works quite well, but when I try to upload a larger image, as in a 3MB image, I'm getting a fatal error:
Fatal error: Allowed memory size of 134217728 bytes exhausted
(tried to allocate 42520 bytes) in...
I'm thinking 128MB of memory is quite a lot, considering I don't run that much... at least I don't think so. It tried to allocate another 42520 bytes for the resizing process, but failed.
My question is should I (A) increase the limit or (B) re-evaluate why I'm using so much memory in the first place? Is 128MB a good number or is it too large/too little?
Thanks,
Ryan
RESOLUTION
I concluded that 128MB is really too much for resizing an image, and I was so focused on looking at other options... like exec() options, that I never took a closer look at my "sample" data. Turns out that even though my large image was only 2.83MB, it was OVER 10000px wide. That's a problem. :)
GD stores images as bitmaps in memory, so I wouldn't completely rule out the possibility that with some JPG images (for example, high resolution, highly compressed), the bitmap version could be quite large.
Call memory_get_usage() right before you open & start resizing the image & see if that amount of memory is way too big.
Also just FYI, when the script says "(tried to allocate 42520 bytes)", that doesn't mean it just needed 42520 more bytes to run successfully. It just means at that moment it needed 42520 more bytes. A bit later it might have tried to allocate more memory. So, just adding 42520 more bytes to the memory total likely wouldn't have fixed anything.
How are you resizing the image? I hope you're using a library function for it like imagecopyresampled()? If so, you shouldn't need 128M of RAM to resize a 3M image. It points to you doing something incorrectly or inefficiently.
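For reference, a minimal resize sketch using GD; the file names and the 800px target width are just assumptions:

$src  = imagecreatefromjpeg('upload.jpg');   // decodes the full bitmap into RAM
$w    = imagesx($src);
$h    = imagesy($src);
$newW = 800;
$newH = (int) ($h * $newW / $w);             // keep the aspect ratio
$dst  = imagecreatetruecolor($newW, $newH);
imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
imagejpeg($dst, 'resized.jpg', 85);
imagedestroy($src);                          // free both bitmaps promptly
imagedestroy($dst);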
You shouldn't ever need that much memory allocated. Re-evaluate your code to reduce its consumption and make sure you aren't storing data superfluously.
Rough estimate of how much memory is consumed by a TrueColor image:
width x height x 4
If your users are informed about the maximum dimensions (in pixels) of an uploaded image, then you can determine the minimum RAM you have to allocate.
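To make that concrete with the figures from the resolution above: if the 10000px-wide image were, say, 10000 x 6000 pixels (the height is an assumption for illustration), the decoded truecolor bitmap alone would need 10000 x 6000 x 4 = 240,000,000 bytes, roughly 229MB - past a 128M limit before a resize buffer is even allocated.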
BTW, consider the variables being used in operations like format conversion, copying, etc.
With 64MB you have a lot to work with. If you need more, you need to start cleaning up memory you've already used. 128MB should be the maximum a script would ever need, and that would be a very large script indeed.
Is there a way to prevent the PHP GD image library from running out of memory? If too large an image is uploaded, GD tends to run out of memory, terminating the script. I'd like it to throw a catchable exception or something to that extent, but alas it doesn't.
Right now I'm using a cobbled-together script that first issues an ini_set('memory_limit', '128M'), if that works I'm usually all set. Depending on the server configuration though that may not be possible, so I'm falling back on an algorithm that tries to estimate the amount of memory needed (taking resolution, color depth, channels and a fudge factor into account), then compares it to memory_get_usage() if the function exists, otherwise does a rough estimate.
The whole thing works so far, but it's far from elegant and will fail in some edge cases, I'm sure. Is there any better way to do this, i.e. have GD fail gracefully if it has to, instead of grinding everything to a halt?
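For what it's worth, here is a sketch of that estimation approach; the 1.65 fudge factor and the crude memory_limit parsing are assumptions (it only understands plain "128M"-style values):

// getimagesize() reads only the header, so this check is cheap.
function canLoadImage($path, $fudge = 1.65) {
    $info = getimagesize($path);
    if ($info === false) {
        return false;                                    // not a readable image
    }
    $bits     = isset($info['bits']) ? $info['bits'] : 8;
    $channels = isset($info['channels']) ? $info['channels'] : 4;
    $needed   = $info[0] * $info[1] * ($bits / 8) * $channels * $fudge;

    $limit = ini_get('memory_limit');
    if ((int) $limit === -1) {
        return true;                                     // -1 means no limit
    }
    return memory_get_usage() + $needed < (int) $limit * 1048576;
}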
Buy more memory! :-P
Seriously though, it is impossible to handle being out of memory because any action you take would require more memory.
Your best bet is to limit the size of image being uploaded based on the current memory settings.
After you create an image, output it and destroy it straight away:
imagepng($image);
imagedestroy($image);
Freeing each image with imagedestroy() as soon as you're done with it releases its memory and keeps it from accumulating.
There is another way to do it, but it can be time-consuming, as certain parts of the image editing process would be repeated a number of times. Set the memory limit to your estimated value, then try to process the image; if it fails, catch the error, increase the memory limit, and process the image again, repeating until you succeed or hit a certain ceiling, at which point you'd show the user an error explaining that their image is too big to be used.
Edit: To catch the out-of-memory error, you could use this solution: http://au2.php.net/set_error_handler#35622
To catch PHP's fatal errors, like "Out of memory" or "PHP Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to allocate … bytes) in", see here : http://php.net/manual/en/function.set-error-handler.php#88401
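Here is a sketch of the shutdown-function technique those notes describe; you can't resume the script after the fatal error, but you can at least report it cleanly (the 160M bump and the message are placeholders):

register_shutdown_function(function () {
    $e = error_get_last();
    if ($e !== null && $e['type'] === E_ERROR
            && strpos($e['message'], 'Allowed memory size') === 0) {
        ini_set('memory_limit', '160M');   // headroom so we can still render output
        echo 'Sorry, that image is too large to process.';
    }
});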
Do some tests to check how much memory each GD function needs.
imagecreatetruecolor seems to need width*height*5 bytes.
imagepng seems to need width*height*4 bytes.
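Those figures are easy to reproduce yourself: memory_get_usage() before and after a call gives the per-pixel cost (exact numbers vary by PHP and GD build):

$before = memory_get_usage();
$img = imagecreatetruecolor(1000, 1000);
printf("imagecreatetruecolor: %.1f bytes/pixel\n",
        (memory_get_usage() - $before) / (1000 * 1000));
imagedestroy($img);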
Your best bet is to stop trying to figure out how much RAM it will need and just max it out at the outset: if you have 4GB available, tell the image script to use between 2 and 4GB or so, and when the script ends the limit goes back to normal. That will cover all potentially fatal situations. It's the only "fail-safe" way I can think of, anyway...