I have the following line in my trace file:
0.5927 12212144 2780040.00 -> require_once(E:\web\lib\nusoap\nusoap.php) E:\web\some_path\file.php:28
I know that requiring this file will cost 2.7MB of memory. Is it normal that simply requiring the file will cost that much? What impacts the memory cost when requiring a file?
I have another 13 lines that are requires and that cost at least 350,000 bytes (about 350 KB) of memory each. I have two more lines that cost 1MB each. Again, is this sort of thing normal?
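For what it's worth, I can cross-check the trace numbers directly in code; a quick sketch (using the same path as in the trace):

<?php
// Measure what a single require costs, independent of the trace file.
$before = memory_get_usage();
require_once 'E:\web\lib\nusoap\nusoap.php';
echo (memory_get_usage() - $before) . " bytes\n";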
Edit #1:
I started looking into this because of a memory leak. We have a script whose memory usage spikes, and when it comes back down, it settles about 10MB (ish) higher than before.
At one point, when Apache reaches 450,000 KB (roughly 450 MB) used, we start getting out-of-memory errors like this:
PHP Fatal error: Out of memory (allocated x) (tried to allocate y bytes) in /path_to/file.php(1758) on line z
Yes. This is quite normal. The nusoap library is quite large, and internally PHP stores it as a blown-up binary representation. You need to realize that it isn't the require itself taking up the space, but rather the included file.
I don't quite understand where your ".00" at the end comes from though. I've just checked the code and it does not create a floating point number.
cheers,
Derick
Again, is this sort of thing normal?
Yes, that is normal. If you want to understand the delta, look into the Xdebug source code; it explains it fairly well. Also read the Xdebug documentation first; IIRC the website tells you not to take these numbers at face value (and it looks like in your question you do).
Also take care: Xdebug is not for production use. If you need production memory usage, you need other tools.
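As a minimal starting point without Xdebug, PHP's own counters give rough production-side numbers (a sketch; where you log them is up to you):

<?php
// Log the request's peak memory at shutdown, without any profiler.
register_shutdown_function(function () {
    error_log('peak memory: ' . memory_get_peak_usage(true) . ' bytes');
});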
Related
We use Apache (32-bit) on Windows in combination with PHP 7.2.9. It works most of the time, but after a lot of refreshing ("a lot" being a random number of times that changes each time Apache is restarted) we get this error: Fatal error: Out of memory (allocated 27262976) tried to allocate 4096 bytes in [random file, always a different one] on line x.
Weird thing is that it keeps giving the exact same error until we restart Apache, then it works for a couple of hours.
Also weird is that we set 512M as the memory limit in php.ini, but the error says "allocated 27262976", which is exactly 26MB. We have 2GB+ of RAM free, so that isn't the problem.
It would be great if anyone knows how to solve this.
Thanks,
Lars
Most probably the memory just gets fragmented. (I had similar issues before.) You have to let garbage collection work more while your code is running.
One way to do that
You have to identify the part of the whole process where you create the biggest arrays or objects, and split it into multiple smaller steps. I don't know what your code does, but the important part is that PHP does garbage collection at certain steps, for example when a function returns and frees up its own environment. So if you, let's say, process 10000 files in a loop, it would be helpful to implement a queue system where you put in 100 files, call a function to deal with them, then go on processing the queue. Sounds silly, I know, but it makes sense if you think about it; a rough sketch follows below.
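A rough sketch of that queue idea (the file paths and the batch size of 100 are made up):

<?php
// Process a big file list in fixed-size batches inside a function call,
// so PHP can free each batch's memory when the function returns.
function processBatch(array $files) {
    foreach ($files as $file) {
        // ... handle one file ...
    }
    // Local variables die here; their memory goes back to the allocator.
}

$allFiles = glob('/path/to/data/*.csv');   // hypothetical input
foreach (array_chunk($allFiles, 100) as $batch) {
    processBatch($batch);
}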
Another way
You can allocate same-size structures for variable-length data, like 50-100k bricks that you only partially use; this way the memory won't get as fragmented. But garbage collection is a lot better at this, and it would typically be its job.
Last resort
When your memory is about halfway exhausted - which you can check by calling memory_get_usage(true) - serialize the big structure you're using, unset the variable, then unserialize it back. This should sort out the allocation problems.
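Roughly like this ($bigStructure stands for whatever large array you are holding; the 256M limit is assumed):

<?php
// Round-trip the structure through serialize() so it gets re-allocated
// in fresh, contiguous blocks instead of the fragmented old ones.
if (memory_get_usage(true) > 128 * 1024 * 1024) {   // halfway to an assumed 256M limit
    $packed = serialize($bigStructure);
    unset($bigStructure);          // free the fragmented copy
    $bigStructure = unserialize($packed);
    unset($packed);
}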
Hope some of the above helps.
My PHP application on Windows+Apache has stopped with showing “Out of memory (allocated 422313984) (tried to allocate 45792935 bytes)”.
I can’t understand why it stopped, because my machine has 4GB of physical memory and I’ve set the memory_limit directive to -1 in the PHP.ini file. I’ve also restarted Apache.
I think 4GB is enough to allocate more than 422313984+45792935 bytes of memory.
Is there another setting to use memory for PHP or Apache?
I also checked the performance counters. They show the machine’s peak memory usage was 2GB in total, with the httpd process using 1.3GB.
I can’t show the code, but essentially it fetches 30,000 rows, 199 bytes each, from the DBMS and parses each into XML using simplexml_load_string() in a loop.
The code finishes normally if the data set is small or the loop is shortened, say from 30,000 iterations down to 1,000.
Also, the first run after starting Apache always succeeds.
I think some memory leak happen.
Actually, I echoed PHP_INT_SIZE and PHP printed 4, so my PHP is presumably a 32-bit build.
If the memory problem comes from this version of PHP, as Álvaro G. Vicario points out below, can I fix it by switching to a 64-bit version of PHP? And where can I get a 64-bit version of PHP for Windows? I can’t find one on http://windows.php.net
«Out of memory» messages (not to be confused with «Allowed memory size exhausted» ones) always indicate that the PHP interpreter literally ran out of memory. There's no PHP or Apache setting you can tweak; the computer is simply not able to feed PHP any more RAM. Common causes include:
Scripts that use too much memory.
Memory leaks or bugs in the PHP interpreter.
SimpleXML is by no means a lightweight extension. On the contrary, its ease of use and handy features come at a cost: high resource consumption. Even without seeing a single line of code, I can assure you that SimpleXML is totally unsuitable for creating an XML file with 30,000 items. A PHP script that uses 2GB of RAM can do little else than take the whole server down.
Nobody likes changing a base library in the middle of a project, but you'll eventually need to do so. PHP provides a streaming writer called XMLWriter. It's really not much harder to use, and it provides clear benefits:
It's way less resource intensive, since it doesn't create the complex object that SimpleXML uses.
You can flush partial results to file.
It can even write to the file directly.
With it, I'm sure your 2 GB script can run with a few MB.
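To illustrate (this is not the asker's actual code; the row data here is a made-up stand-in for the 30,000 DB rows), a minimal XMLWriter loop that streams everything straight to a file could look like this:

<?php
// Stream rows to an XML file with XMLWriter; only one row is in memory at a time.
$rows = array(array('id' => 1, 'name' => 'example'));   // hypothetical sample data

$writer = new XMLWriter();
$writer->openUri('output.xml');        // write directly to the file
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('rows');

foreach ($rows as $row) {
    $writer->startElement('row');
    $writer->writeElement('id', (string) $row['id']);
    $writer->writeElement('name', $row['name']);
    $writer->endElement();             // </row>
    $writer->flush();                  // push buffered output to disk
}

$writer->endElement();                 // </rows>
$writer->endDocument();
$writer->flush();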
I decided to take a look at how much memory was being allocated to a few of my PHP scripts, and found it to be peaking at about 130KiB. Not bad, I thought, considering what was going on in the script.
Then, I decided to see what the script started at. I expected something around 32KiB.
I got 121952 bytes instead. After that, I tried testing a completely empty script:
<?php
echo memory_get_usage();
It also started with the same amount of memory allocated.
Now, obviously, PHP is going to allocate some memory to the script before it is run, but this seems a bit excessive.
However, it doesn't seem to be dynamic at all with respect to how much memory is available to the system at the time. I tried consuming more system memory by opening other processes, but the pre-allocated amount stayed at exactly the same number of bytes.
Is this at all configurable on a per script basis, and how does PHP determine how much it will allocate to each script?
Using PHP Version 5.4.7
Thanks.
The memory_get_usage function directly queries PHP's memory allocator to get this information. It reports how much memory is used by PHP itself, not how much the whole process or even the system as a whole is using.
If you do not pass in an additional true argument what you get back is the exact amount of memory that your code uses; this will never be more than what memory_get_usage(true) reports.
If you do call memory_get_usage(true) you will get back the size of the heap the allocator has reserved from the system, which includes memory that has not been actually used by your code but is directly available to your code.
When your script needs more memory than what is already available to the allocator, the latter will reserve another big chunk from the OS and you will see memory_get_usage(true) jump up sharply while memory_get_usage() might only increase by a few bytes.
The exact strategy the allocator uses to decide when and how much memory to allocate is baked into PHP at compilation time; you would have to edit the source and compile PHP to change this behavior.
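A quick way to watch the two figures diverge (a sketch; the 5 MB allocation is arbitrary):

<?php
// Compare used vs. reserved memory around a large allocation.
printf("used: %d, reserved: %d\n", memory_get_usage(), memory_get_usage(true));

$big = str_repeat('x', 5 * 1024 * 1024);   // allocate roughly 5 MB

// used grows by about 5 MB; reserved jumps in whole allocator chunks,
// so it may grow by more than what the script actually uses.
printf("used: %d, reserved: %d\n", memory_get_usage(), memory_get_usage(true));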
I'm trying to troubleshoot a memory issue I've run into with Wordpress and rather than bore you with the whole problem I was hoping to get a nice compact answer to three parts of my problem:
Normal Memory Footprint. I know there is no real "normal" Wordpress script, and yet I think it would be quite useful to hear from people what a typical Wordpress script's memory footprint is. Let's define "normal", for the sake of argument, as an installation with very few plugins, a basic theme like Twenty Twelve, and a script that does some DB retrieval but nothing monumental... maybe a typical blog-roll page or something. What I'm trying to understand is the baseline memory footprint (a range, not a discrete number) that a more complicated script would be starting from.
Memory Ceiling Versus memory_get_usage(). I have been putting lots of logging in my scripts to pull out the memory usage via PHP's memory_get_usage(true) call. This seems like one of the few troubleshooting techniques that can determine where the memory is being used. What perplexes me is that I see memory usage ranging from 15M to 45M at the script level (note this is with the "true" parameter, so it includes the overhead of the memory manager), and yet in many instances I'll see a 27M script suddenly fall over with the message "Allowed memory size of 268435456 bytes exhausted". It is possible that one very large memory request takes place after the logging, but I'm interested to hear whether other people have found differences between the memory limit and the memory reported by memory_get_usage().
New Memory Ceiling Ignored. In a desperate attempt to get the site working again, and to buy time to troubleshoot, I thought I'd just raise the memory limit in the php.ini file to 512M, but doing so seems to have had no impact: the fatal error continues to mention the old 256M limit.
Any help would be appreciated. Thanks in advance.
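For reference, the checkpoint logging mentioned in part 2 is nothing fancier than this (the checkpoint label is made up):

<?php
// Dropped in after suspect sections to see where memory climbs.
error_log(sprintf(
    '%s: used %.1f MB, reserved %.1f MB',
    'after-main-query',                  // hypothetical checkpoint label
    memory_get_usage() / 1048576,
    memory_get_usage(true) / 1048576
));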
Hopefully someone can answer your question in that much detail. For my part:
Q: What is a normal amount of memory for a Wordpress script to use?
A1.- Since WP is a plugin-driven CMS, memory usage depends on those plugins. As you probably know, some are very badly coded. But an out-of-the-box WP performs very well.
A2.- To help you find bottlenecks, I recommend using BlackBox (a WordPress Debug Bar plugin):
... As for information you will find in profiler, these are: time passed since profiler was started and total memory WordPress was using when checkpoint was reached ...
I just found this interesting article:
WordPress Memory Usage & Website Outage Issues Resolved.
I ran a test of Wordpress 4.4 with a clean install on a Windows 7 PC (a local install).
Memory Used / Allocated:
9.37 MB / 9.5 MB
Total Files: 89
Total File Size: 2923.38 KB
Ran in 1.27507 seconds
This was all done in the index file, timing before anything is called and memory / file usage after everything is 100% finished.
I tried a few pages (category, archive, single post, etc..) and all were very similar (within 1% difference) in files and memory usage.
I think it stands to reason that this is about the best possible performance, so adding plugins/content will only bump these numbers up. A caching plugin might offer a little better performance, though.
All,
I am working on a Zend Framework based web application. We keep encountering out of memory errors on our dev server:
Allowed memory size of XXXX bytes exhausted (tried YYYY...
We keep increasing memory_limit in php.ini, but it is now up over 1000 MB. What is a normal memory_limit value? What are the usual suspects in PHP/Zend for running out of memory? We are using the Propel ORM.
Thanks for all of the help!
Update
I cannot reproduce this error in my windows environment. If I set memory_limit low (say 16M), I get the same error, but the "tried to allocate" amount is always something reasonable. For example:
(tried to allocate 13344 bytes)
If I set the memory very low on the (Fedora 9) server (such as 16M), I get the same thing: consistent, reasonable out-of-memory errors. However, even when the memory limit is set very high on our server (128M, for example), maybe once a week I will get a crazy huge memory error: (tried to allocate 1846026201 bytes). I don't know if that sheds any more light on what is going on. We are using Propel 1.5. It sounds like the actual release is coming out later this month, but it doesn't look like anyone else is having this problem with it anyway, so I don't know that Propel is the problem. We are using Zend Server with PHP 5.2 on the Linux box, and 5.3 locally.
Any more ideas? I have a ticket out to get Xdebug installed on the Linux box.
Thanks,
-rep
Generally speaking, with PHP 5.2 and/or PHP 5.3, I tend to consider that more than 32M for memory_limit is "too much":
Using Frameworks / ORM and stuff like this, 16M is often not enough
Using 32M is generally enough for the kind of web-applications I'm working on (typical websites)
Using more than 64M means the server will not be able to handle as many users as we'd like.
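For completeness, the limit can also be raised for one heavy script at runtime instead of globally in php.ini (a sketch):

<?php
// Raise the limit for this script only, leaving the global default alone.
ini_set('memory_limit', '64M');
echo ini_get('memory_limit');   // confirm the effective value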
When it comes to a script reaching memory_limit, the usual problem is trying to load too much data into memory; a couple of examples:
Loading a big file in memory, with functions such as file or file_get_contents, or XML-related functions/classes
Creating a too big array of data
Creating too many objects
Considering you are using an ORM, you might be in a situation where:
You are doing some SQL query that returns a lot of rows
Your ORM is converting each row into an object, putting all of those in an array
In which case a solution would be to load less data
using pagination, for instance (see the sketch after this list)
or trying to load data as arrays instead of objects (I don't know if this is possible with Propel, but it is with Doctrine; so maybe Propel has some way of doing that too?)
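For illustration, here is a generic pagination loop using plain PDO, since I don't want to guess at Propel's exact API; the connection details and the "items" table are made up:

<?php
// Process rows one page at a time so only one page is ever in memory.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pageSize = 500;
$offset = 0;

do {
    $stmt = $pdo->prepare('SELECT id, payload FROM items LIMIT :lim OFFSET :off');
    $stmt->bindValue(':lim', $pageSize, PDO::PARAM_INT);
    $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);   // plain arrays, no object hydration
    $count = count($rows);

    foreach ($rows as $row) {
        // ... process one row ...
    }

    unset($rows);            // drop this page before fetching the next
    $offset += $pageSize;
} while ($count === $pageSize);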
What exactly is your application doing at the time it runs out of memory? There can be a lot of causes for this. I'd say the most common would be allocating too much data to an array. Is your application doing anything along those lines?
You have one of two things happening, perhaps both:
You have a runaway process somewhere that isn't ending when it should be.
You have algorithms that throw lots of data around, such as huge strings or arrays or objects, and are making needless copies instead of processing just what they need and discarding what they don't.
I think this has something to do with deployment from CruiseControl. I only get the very high (on the order of gigs) memory error when someone is deploying new code (or just after new code has been deployed). This makes a little bit of sense too, since the error always points to a line that is a "require_once". Each time, I get an error like:
Fatal error: Out of memory (allocated 4456448) (tried to allocate 3949907977 bytes) in /directory/file.php on line 2
I have replaced the "require_once" line with:
class_exists('Ingrain_Security_Auth') || require('Ingrain/Security/Auth.php');
I have replaced that line in 3 files so far and have not had any more memory issues. Can anyone shed some light on what might be going on? I am using CruiseControl to deploy.