PHP memory_get_usage() on empty PHP script

I decided to take a look at how much memory was being allocated to a few of my PHP scripts, and found it to be peaking at about 130KiB. Not bad, I thought, considering what was going on in the script.
Then, I decided to see what the script started at. I expected something around 32KiB.
I got 121952 bytes instead. After that, I tried testing a completely empty script:
<?php
echo memory_get_usage();
It also started with the same amount of memory allocated.
Now, obviously, PHP is going to allocate some memory to the script before it is run, but this seems a bit excessive.
However, it doesn't seem to be dynamic based on how much memory is available to the system at the time. I tried consuming more system memory by opening other processes, but the pre-allocated amount stayed at exactly the same number of bytes.
Is this at all configurable on a per-script basis, and how does PHP determine how much it will allocate to each script?
Using PHP Version 5.4.7
Thanks.

The memory_get_usage function directly queries PHP's memory allocator to get this information. It reports how much memory is used by PHP itself, not how much the whole process or even the system as a whole is using.
If you do not pass in an additional true argument, what you get back is the exact amount of memory your code uses; this will never be more than what memory_get_usage(true) reports.
If you do call memory_get_usage(true), you will get back the size of the heap the allocator has reserved from the system, which includes memory that has not actually been used by your code but is directly available to it.
When your script needs more memory than what is already available to the allocator, the latter will reserve another big chunk from the OS and you will see memory_get_usage(true) jump up sharply while memory_get_usage() might only increase by a few bytes.
The exact strategy the allocator uses to decide when and how much memory to allocate is baked into PHP at compilation time; you would have to edit the source and compile PHP to change this behavior.
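To see the two figures diverge, here is a minimal sketch (the exact numbers will vary by PHP version and build):
<?php
// Compare PHP's actual usage with the heap it has reserved from the OS.
printf("used: %d, reserved: %d\n", memory_get_usage(), memory_get_usage(true));

$data = array();
for ($i = 0; $i < 100000; $i++) {
    $data[] = str_repeat('x', 100); // grow real usage step by step
}

// used grows smoothly; reserved jumps in large chunks whenever the
// allocator requests a new block from the OS.
printf("used: %d, reserved: %d\n", memory_get_usage(), memory_get_usage(true));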

Related

PHP 7.2.9 out of memory at random times

We use Apache (32-bit) on Windows in combination with PHP 7.2.9. It works most of the time, but after a lot of refreshing (where "a lot" is a different random number each time Apache gets restarted) we get this error: Fatal error: Out of memory (allocated 27262976) tried to allocate 4096 bytes in [random file, always a different one] on line x.
The weird thing is that it keeps giving the exact same error until we restart Apache; then it works for a couple of hours.
Also weird is that we set 512M as the memory limit in php.ini, but the error says "allocated 27262976", which is exactly 26MB. We have 2GB+ of RAM free, so that isn't the problem.
It would be great if anyone knows how to solve this.
Thanks,
Lars
Most probably the memory just gets fragmented. (I had similar issues before.) You have to let garbage collection work more while your code is running.
One way to do that
You have to identify the part of the whole process where you create the biggest arrays or objects, and split it into multiple smaller steps. I don't know what your code does, but the important point is that PHP performs garbage collection at certain steps, for example when a function returns and frees up its own environment. So if you, say, process 10000 files in a loop, it would help to implement a queue system where you put in 100 files, call a function to deal with them, then go on processing the queue. It sounds silly, I know, but it makes sense if you think about it.
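A rough sketch of that queue idea (processBatch() and the file location are made up for illustration):
<?php
// Process files in batches so each batch's memory is freed when the
// function returns and its locals go out of scope.
function processBatch(array $files)
{
    foreach ($files as $file) {
        $data = file_get_contents($file);
        // ... process $data ...
    }
}

$queue = glob('/path/to/files/*'); // hypothetical location of the 10000 files
foreach (array_chunk($queue, 100) as $batch) {
    processBatch($batch);
}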
Another way
You can allocate same-size structures for variable-length data, like 50-100k bricks that you only partially use; this way the memory won't get as fragmented. But garbage collection handles this much better, and it should normally be its job.
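For illustration only, the brick idea might look like this (the 64KB size is arbitrary):
<?php
// Reserve a fixed-size 64KB string and fill it only partially, so freed
// blocks are all the same size and can be reused without fragmenting.
$brick = str_repeat("\0", 64 * 1024);
$payload = 'variable-length data';
$brick = substr_replace($brick, $payload, 0, strlen($payload));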
Last resort
When your memory is about halfway exhausted - which you can check by calling memory_get_usage(true) - serialize the big structure you're using, unset the variable, then unserialize it back. This should sort out the allocation problems.
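As a sketch (the 512M limit and the $bigStructure variable are assumptions, not from the question):
<?php
// Repack a large structure into contiguous memory to fight fragmentation.
if (memory_get_usage(true) > 256 * 1024 * 1024) { // about half of a 512M limit
    $packed = serialize($bigStructure);  // flatten into one string
    unset($bigStructure);                // release the fragmented blocks
    $bigStructure = unserialize($packed);
    unset($packed);
}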
Hope some of the above helps.

Memory leak? memory_get_usage() vs actual process memory

I have a problem with a possible memory leak in my console PHP application. It is created with the Laravel framework, using artisan. Running PHP 5.6.2. It's a huge loop that will gather data from a webservice and insert it into my own database, probably around 300k rows.
For each loop I print out memory usage in the console. The weird thing is that memory_get_usage() and memory_get_usage(true) reports that it uses roughly 13MB memory. But the php process keeps using more and more memory. If I let it run for a few hours it uses almost 1GB memory, and the loop keeps going slower and slower.
It does not get terminated by the PHP memory limit, even though the process exceeds it by far.
I am trying to figure out why this happens and how this actually works. As I understand it, memory_get_usage reports the memory used by MY script, what I have written. So unsetting variables, cleaning up etc. should not be the problem, right? I also tried forcing garbage collection every ~300 entries, with no luck.
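For reference, the forcing looks roughly like this ($rows and insertRow() stand in for the webservice and database work):
<?php
// Collect cycles every ~300 iterations; refcounting alone misses
// circular references.
gc_enable();
$i = 0;
foreach ($rows as $row) {
    insertRow($row);
    if (++$i % 300 === 0) {
        gc_collect_cycles();
        printf("used: %d, real: %d\n", memory_get_usage(), memory_get_usage(true));
    }
}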
Does anyone have some general tips on how I can troubleshoot this? And maybe explain why the memory used by the process isn't what the memory_get_usage function reports :-)
Any help is greatly appreciated.

About PHP’s memory usage

My PHP application on Windows+Apache has stopped with showing “Out of memory (allocated 422313984) (tried to allocate 45792935 bytes)”.
I can't understand why it stopped, because my machine has 4GB of physical memory and I've set the memory_limit directive to -1 in php.ini. I've also restarted Apache.
I think 4GB is enough to allocate more than 422313984+45792935 bytes of memory.
Is there another setting that limits memory for PHP or Apache?
I also checked the performance counters. They show the machine's maximum total memory usage was 2GB, and the httpd process used 1.3GB.
I can't show the code, but essentially it fetches 30000 rows, 199 bytes each, from the DBMS and parses each into XML using simplexml_load_string() in a loop.
The code finishes normally if the data is small or if I shorten the loop, e.g. from 30000 iterations to 1000.
Also, the first run after starting Apache always succeeds.
I think some memory leak is happening.
Actually, I echoed PHP_INT_SIZE and PHP printed 4, so my PHP is probably a 32-bit build.
If the memory usage problem comes from this version of PHP, as Álvaro G. Vicario points out below, can I fix it by switching to a 64-bit version of PHP? And where can I get a 64-bit version of PHP for Windows? I can't find it on http://windows.php.net
«Out of memory» messages (not to be confused with «Allowed memory size exhausted» ones) always indicate that the PHP interpreter literally ran out of memory. There's no PHP or Apache setting you can tweak: the computer is simply not able to feed PHP with more RAM. Common causes include:
Scripts that use too much memory.
Memory leaks or bugs in the PHP interpreter.
SimpleXML is by no means a lightweight extension. On the contrary, its ease of use and handy features come at a cost: high resource consumption. Even without seeing a single line of code, I can assure you that SimpleXML is totally unsuitable for creating an XML file with 30,000 items. A PHP script that uses 2GB of RAM can easily take down the whole server.
Nobody likes changing a base library in the middle of a project, but you'll eventually need to do so. PHP provides a pull parser called XMLWriter. It's really not much harder to use, and it provides two benefits:
It's way less resource intensive, since it doesn't create the complex object that SimpleXML uses.
You can flush partial results to file, and it can even write to a file directly.
With it, I'm sure your 2 GB script can run with a few MB.
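For what it's worth, a minimal XMLWriter sketch (the element names and $rows are placeholders, not from the question):
<?php
// Stream rows straight to disk; memory stays flat regardless of row count.
$writer = new XMLWriter();
$writer->openUri('output.xml');          // write directly to a file
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('rows');
foreach ($rows as $row) {                // $rows: the 30000 DB rows
    $writer->startElement('row');
    $writer->writeElement('value', $row['value']);
    $writer->endElement();
    $writer->flush();                    // push buffered output to the file
}
$writer->endElement();
$writer->endDocument();
$writer->flush();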

Understanding Xdebug memory delta increase

I have the following line in my trace file:
0.5927 12212144 2780040.00 -> require_once(E:\web\lib\nusoap\nusoap.php) E:\web\some_path\file.php:28
I know that requiring this file will cost 2.7MB of memory. Is it normal that simply requiring the file will cost that much? What impacts the memory cost when requiring a file?
I have another 13 lines that are requires and that cost at least 350 000 bytes of memory each. I have two more lines that cost 1MB each. Again, is this sort of thing normal?
Edit #1:
I started to look into this due to a memory leak. We have a script that will have the memory usage spike but when it comes down, there will be an increase of 10MB+ (ish) of RAM.
At one point, when Apache reaches 450 000 KB (about 450MB) used, we start getting out of memory errors like these:
PHP Fatal error: Out of memory (allocated x) (tried to allocate y bytes) in/path_to/file.php(1758) on line z
Yes, this is quite normal. The nusoap library is quite large, and internally PHP stores it as a blown-up binary representation (the compiled form of the code). You need to realize that the require itself isn't taking up the space, but rather the contents of the included file.
I don't quite understand where your ".00" at the end comes from though. I've just checked the code and it does not create a floating point number.
cheers,
Derick
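A quick way to confirm this for any include (a hedged sketch; the nusoap path is taken from the question's trace line):
<?php
// Measure roughly what a single require adds to PHP's own usage.
$before = memory_get_usage();
require_once 'E:\web\lib\nusoap\nusoap.php';
$after = memory_get_usage();
printf("require cost: %d bytes\n", $after - $before);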
Again, is this sort of thing normal?
Yes, that is normal. If you want to understand the delta, look into the Xdebug source code; it explains it fairly well. Also read the Xdebug documentation first; IIRC the website tells you that you should not take these numbers as exact (and it looks like in your question you do).
Also take care that xdebug is not for production use. If you need production memory usage, you need other tools.

Is the memory allocated by PHP in a single request always released at the end?

I'm a bit confused about the memory leaks in PHP.
I've read that PHP automatically releases the memory used in each request, thanks to the Zend Memory Manager:
http://www.webreference.com/programming/php_mem/2.html
But I see a lot of people and topics (even here in SO) concerned about PHP and memory leaks.
So I feel that I'm losing something.
Is it possible to have memory leaks in PHP between different requests?
It is not possible to have memory leaks from PHP scripts between different requests (when using the default Apache configuration), as the variables and code used in one request are released at the end of that request and PHP's memory allocator starts afresh for the next request. Bugs in the PHP interpreter or extensions could leak memory separately, however.
A much greater problem is that Apache child processes have PHP's memory space inside them. They swell to allocate the peak memory usage of a PHP script and then maintain this memory allocation until the child process is killed (once a process has asked the kernel to allocate a portion of memory, that memory won't be released until the process dies). For a more detailed explanation of why this is a problem and how to combat it, see my answer on Server Fault.
Memory leaks in a script, where variables are not unset and the PHP garbage collector fails, are very rare - most PHP scripts run for a few hundred milliseconds, and this is not generally enough time for even a serious memory leak to manifest.
You can monitor how much memory your PHP script is using with memory_get_usage() and memory_get_peak_usage() - there is also a good explanation on memory usage and how to program defensively in the PHP manual.
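A quick illustration of the difference between the two (a sketch, not from the original question):
<?php
// Peak usage stays high even after the memory itself has been freed.
$big = range(1, 1000000);
unset($big);
printf("current: %d, peak: %d\n", memory_get_usage(), memory_get_peak_usage());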
PHP's memory management is explained in detail in this article.
edit: You can list the modules compiled into Apache with httpd -l - defaults vary by OS distribution and repository configuration. There are plenty of ways to interface PHP with Apache - most are detailed here.
