I have a problem with a possible memory leak in my console PHP application. It is built with the Laravel framework, using artisan, and runs on PHP 5.6.2. It's a huge loop that gathers data from a webservice and inserts it into my own database, probably around 300k rows.
On each iteration I print the memory usage to the console. The weird thing is that memory_get_usage() and memory_get_usage(true) report roughly 13MB in use, but the PHP process keeps consuming more and more memory. If I let it run for a few hours it uses almost 1GB, and the loop keeps getting slower and slower.
The script does not terminate due to the PHP memory limit, even though the process exceeds it by far.
I am trying to figure out why this happens and how this actually works. As I understand it, memory_get_usage reports the memory used by MY script, i.e. what I have written, so unsetting variables, cleaning up etc. should not be the problem, right? I also tried forcing garbage collection every ~300 entries, with no luck.
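For reference, the body of my loop looks roughly like this (a simplified sketch; fetchBatch() and insertRow() are stand-ins for my actual webservice and database code):

<?php
// Simplified sketch of the import loop; fetchBatch() and insertRow()
// are hypothetical stand-ins for the real webservice/database calls.
$count = 0;
while ($rows = fetchBatch()) {
    foreach ($rows as $row) {
        insertRow($row);
        unset($row);

        if (++$count % 300 === 0) {
            gc_collect_cycles(); // force collection of circular references
            echo memory_get_usage() . ' / ' . memory_get_usage(true) . PHP_EOL;
        }
    }
    unset($rows);
}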
Does anyone have some general tips on how I can troubleshoot this? And maybe explain why the memory used by the process is so much more than what the memory_get_usage function shows :-)
Any help is greatly appreciated.
Related
I have a ZF2 PHP application which is executed by a bash script every minute. This is running inside an EC2 instance.
Here's my code:
while :                                     # loop forever
do
    php public/index.php start-processor &  # run one pass in the background
    wait                                    # block until it exits
    sleep 60                                # then pause for a minute
done
Metrics Reading
Based on the metrics, memory usage keeps climbing until it reaches 100%, then drops. Is this normal, or is there really a leak in my application?
I've also checked with htop, and the process looks fine; it does not eat that much memory.
Hope someone could explain what is happening here. Should I worry about this?
Thanks and more power.
It does not look like a memory leak to me; with a leak, the used amount would just rise and never go back down, eventually crashing your app.
This graph looks very similar to garbage collection as it happens in the JVM. Does your PHP use such a thing under the hood? I searched the web, and it looks like PHP 5.3+ has GC built in: https://secure.php.net/manual/en/features.gc.php
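If you want to confirm that it is the collector you are seeing, PHP lets you inspect and trigger it directly. A minimal sketch:

<?php
// Check and trigger PHP's cycle collector (available since PHP 5.3).
var_dump(gc_enabled());        // true if zend.enable_gc is on
$freed = gc_collect_cycles();  // force a collection run; returns cycles freed
echo "Freed {$freed} cycles, usage now " . memory_get_usage(true) . " bytes\n";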
We use Apache (32-bit) on Windows in combination with PHP 7.2.9. It works most of the time, but after a lot of refreshing (how much is random, and the count resets whenever Apache is restarted) we get this error: Fatal error: Out of memory (allocated 27262976) tried to allocate 4096 bytes in [random file, always a different one] on line x.
The weird thing is that it keeps giving the exact same error until we restart Apache; then it works for a couple of hours again.
Also weird: we set the memory limit to 512M in php.ini, but the error says allocated 27262976, which is exactly 26MB. We have 2GB+ of RAM free, so that isn't the problem.
It would be great if anyone knows how to solve this.
Thanks,
Lars
Most probably the memory just gets fragmented (I had similar issues before). You have to let garbage collection do more of its work while your code is running.
One way to do that
You have to identify the part of the whole process where you create the biggest arrays or objects, and split it into multiple smaller steps. I don't know what your code does, but the important point is that PHP does garbage collection at certain moments, for example when a function returns and frees up its own environment. So if you, let's say, process 10,000 files in a loop, it would help to implement a queue system where you put in 100 files, call a function to deal with them, then go on processing the queue. Sounds silly, I know, but it makes sense if you think about it.
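A rough illustration of the idea (processChunk() and the glob pattern are hypothetical):

<?php
// Hypothetical sketch: handle files 100 at a time so that each chunk's
// working set can be reclaimed when processChunk() returns.
function processChunk(array $files)
{
    foreach ($files as $file) {
        // ... do the real work on $file here ...
    }
    // Locals go out of scope on return, so their memory can be reused.
}

foreach (array_chunk(glob('/data/*.csv'), 100) as $chunk) {
    processChunk($chunk);
    gc_collect_cycles(); // optionally collect cycles between chunks
}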
Another way
You can allocate same-size structures for variable-length data, like 50-100k bricks that you only partially use; this way the memory won't get as fragmented. But garbage collection handles this a lot better, and it would typically be its job.
Last resort
When your memory is about halfway exhausted, which you can check by calling memory_get_usage(true), serialize the big structure you're using, unset the variable, then unserialize it back. This should sort out the allocation problems.
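Something along these lines ($big stands in for whatever large structure you keep around; the 512M limit is an assumption):

<?php
// Last-resort defragmentation; $big is a stand-in for your large structure.
$threshold = 256 * 1024 * 1024;  // about half of an assumed 512M memory_limit
if (memory_get_usage(true) > $threshold) {
    $blob = serialize($big);     // flatten into one contiguous string
    unset($big);                 // release the fragmented original
    $big = unserialize($blob);   // rebuild it in fresh memory
    unset($blob);
}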
Hope some of the above helps.
I have a PHP program that runs forever (not a webpage; a socket server). After processing over 1000 requests, the program eventually crashes with an out-of-memory error.
Here is a link to my project.
Here is a link to my program.
I am not sure why this happens. I have tried using garbage collection functions in the function that processes requests (onMessage), but it does not result in any change. Any suggestions would be appreciated.
Investing huge amounts of effort, you may be able to mitigate this for a while. But in the end you will have trouble running a non-terminating PHP application.
Check out PHP is meant to die. This article discusses PHP's memory handling (among other things) and specifically focuses on why all long-running PHP processes eventually fail. Some excerpts:
There’s several issues that just make PHP the wrong tool for this. Remember, PHP will die, no matter how hard you try. First and foremost, there’s the issue of memory leaks. PHP never cared to free memory once it’s not used anymore, because everything will be freed at the end — by dying. In a continually-running process, that will slowly keep increasing the allocated memory (which is, in fact, wasted memory), until reaching PHP’s memory_limit value and killing your process without a warning. You did nothing wrong, except expecting the process to live forever. Under load, replace the “slowly” part for “pretty quickly”.
There’s been improvements in the “don’t waste memory” front. Sadly, they’re not enough. As things get complex or the load increases, it’ll crash.
I decided to take a look at how much memory was being allocated to a few of my PHP scripts, and found it to be peaking at about 130KiB. Not bad, I thought, considering what was going on in the script.
Then, I decided to see what the script started at. I expected something around 32KiB.
I got 121952 bytes instead. After that, I tried testing a completely empty script:
<?php
echo memory_get_usage();
It also started with the same amount of memory allocated.
Now, obviously, PHP is going to allocate some memory to the script before it is run, but this seems a bit excessive.
However, the amount doesn't seem to be dynamic at all based on how much memory is available to the system at the time. I tried consuming more system memory by opening other processes, but the pre-allocated amount stayed exactly the same number of bytes.
Is this configurable on a per-script basis at all, and how does PHP determine how much it will allocate to each script?
Using PHP Version 5.4.7
Thanks.
The memory_get_usage function directly queries PHP's memory allocator to get this information. It reports how much memory is used by PHP itself, not how much the whole process or even the system as a whole is using.
If you do not pass in the additional true argument, what you get back is the exact amount of memory that your code uses; this will never be more than what memory_get_usage(true) reports.
If you do call memory_get_usage(true) you will get back the size of the heap the allocator has reserved from the system, which includes memory that has not been actually used by your code but is directly available to your code.
When your script needs more memory than what is already available to the allocator, the latter will reserve another big chunk from the OS and you will see memory_get_usage(true) jump up sharply while memory_get_usage() might only increase by a few bytes.
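You can watch this happen with a small test script:

<?php
// Exact usage grows by ~1MB per step, while the reserved heap (true)
// jumps in larger increments only when the allocator runs out of room.
$data = [];
for ($i = 0; $i < 10; $i++) {
    $data[] = str_repeat('x', 1024 * 1024); // allocate 1MB
    printf("exact: %d, reserved: %d\n",
        memory_get_usage(), memory_get_usage(true));
}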
The exact strategy the allocator uses to decide when and how much memory to allocate is baked into PHP at compilation time; you would have to edit the source and compile PHP to change this behavior.
I'm trying to track down a memory leak in a PHP Program (Magento, if it matters). The basic problem seems to be a leak in some object/class that's growing over time. That is, the more information that gets logged to the database, the more memory certain application processes end up using. Magento's a highly abstract system, so it's not always clear what code is being run that's consuming so much memory. That's what I'm trying to track down.
I've been using memory_get_peak_usage at the end of the program bootstrap file to benchmark performance, and have seen steady growth from 250MB of peak use to 310MB of peak use in about a week. I would like to use memory_get_peak_usage intermittently throughout the execution cycle to ask:
What was the peak usage prior to this call? [later in the cycle] What was the peak usage prior to this new call?
The problem I'm running into is that once I call memory_get_peak_usage, any future call returns the same value as the first, even when I know the peak usage has changed. This leads me to believe that after memory_get_peak_usage is called once, PHP caches the result. I would like to uncache it to perform the testing outlined above.
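In code, the pattern I'm attempting looks roughly like this (simplified):

<?php
$first = memory_get_peak_usage(true);
// ... Magento does a large amount of work here ...
$second = memory_get_peak_usage(true);
// I expected $second to be larger after heavy allocations,
// but both calls keep returning the same number for me.
var_dump($first, $second);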
Can I call memory_get_peak_usage multiple times?
Are there alternatives for profiling the scenario I've described above? Some feature of Xdebug, maybe?
Can I call memory_get_peak_usage multiple times?
Not sure on that one.
Are there alternatives for profiling the scenario I've described above? Some feature of Xdebug, maybe?
Have a look at the Xdebug profiling page. It's been a while since I profiled an app, but when I did, I followed the write-up and it worked great.
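One note for memory specifically: Xdebug's function traces (as opposed to its cachegrind profiles) can record a per-call memory column. In Xdebug 2, the php.ini switches were roughly:

xdebug.auto_trace = 1        ; trace every request automatically
xdebug.show_mem_delta = 1    ; add a memory-delta column to each call
xdebug.trace_output_dir = /tmp

The setting names changed in Xdebug 3 (xdebug.mode = trace and xdebug.output_dir), so check the docs for your version.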