Memory usage of php process - php

SUMMARY
Short recommendations (for more detailed information, see the answers below).
To avoid memory leaks you can:
unset variables as soon as they become useless (see the sketch after this list)
use Xdebug for a detailed report of memory consumption per function, and to find memory leaks
set memory_limit (for example, to 5 MB) to stop runaway memory allocation early
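A quick sketch of the first recommendation (a minimal example; the file name is made up): freeing a large variable as soon as it is no longer needed lets PHP's allocator reuse that memory.
$data = file_get_contents('big-file.dat'); // large buffer now on the PHP heap
// ... work with $data ...
unset($data); // drop the last reference so the buffer can be freed
echo memory_get_usage(), " bytes\n"; // usage drops back down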
QUESTION
What can PHP use memory for, apart from libraries and variables?
I monitor the memory used by variables, and it's ~3 MB with this code:
$vars = array_keys(get_defined_vars());
$cnt_vars = count($vars);
$allsize = 0;
for ($j = 0; $j < $cnt_vars; $j++) {
    try {
        // serialize() throws for unserializable values (e.g. closures);
        // @ suppresses warnings for things like resources
        $size = @serialize(${$vars[$j]});
        $size = strlen($size);
    } catch (Exception $e) {
        $str = json_encode(${$vars[$j]});
        $str = str_replace(array('{"', '"}', '":"', '":'), '', $str);
        $size = strlen($str);
    }
    $vars[$j] = array(
        'size' => $size,
        'name' => $vars[$j]
    );
    $allsize += $size;
}
and the libraries take ~18 MB (libcurl, etc.)
So in total that's 21 MB, but
pmap -x (process)
shows that total memory consumption is kB: 314028 RSS: 74704 Dirty: 59672
so the real total consumption is ~74 MB.
I also see some large blocks with [anon] mapping in my pmap.
What does PHP use these blocks for?
PHP version: 5.5.9-1ubuntu4.14
PHP extensions:
root@webdep:~# php -m
[PHP Modules]
bcmath
bz2
calendar
Core
ctype
curl
date
dba
dom
ereg
exif
fileinfo
filter
ftp
gd
gettext
hash
iconv
json
libxml
mbstring
mcrypt
mhash
openssl
pcntl
pcre
PDO
pdo_pgsql
pgsql
Phar
posix
readline
Reflection
session
shmop
SimpleXML
soap
sockets
SPL
standard
sysvmsg
sysvsem
sysvshm
tokenizer
wddx
xml
xmlreader
xmlwriter
Zend OPcache
zip
zlib
[Zend Modules]
Zend OPcache

PHP is not the same as C or C++ code that compiles to a single binary. All your scripts are executed inside the Zend Virtual Machine, and most of the memory is consumed by the VM itself. That includes the memory used by loaded extensions, the shared libraries (.so files) used by the PHP process, and any other shared resources.
I don't remember the exact source, but somewhere I read that nearly 70% of total CPU cycles are consumed by PHP internals and only 30% go to your code (please correct me if I am wrong here). This is not directly related to memory consumption, but it should give an idea of how PHP works.
About the anon blocks, I found some details in another SO answer. The answer is about Java, but the same should apply to PHP as well.
Anon blocks are "large" blocks allocated via malloc or mmap -- see the
manpages. As such, they have nothing to do with the Java heap (other
than the fact that the entire heap should be stored in just such a
block).
Here is the actual link to that SO answer:
https://stackoverflow.com/a/1483482/1012809
Check this article for more details on anonymous memory pages (anon):
https://techtalk.intersec.com/2013/07/memory-part-2-understanding-process-memory/
Also check this slide deck for more details on PHP memory management:
http://www.slideshare.net/jpauli/understanding-php-memory
I would recommend disabling some of the unused extensions; that should save you some memory.

NOTE: this is not exactly an answer, but information requested by the OP that is too long for a comment. These are more tools for debugging this kind of problem.
Xdebug's docs are pretty comprehensive; they explain how to use it far better than I could by copying their docs here. The script you gave is a bit fuzzy, so I did not do the trace myself, but it would give you line-by-line diffs of memory usage.
Basically, set xdebug.show_mem_delta to 1 with Xdebug enabled to generate a function trace, which you can then open in a text editor to see which part exactly is leaking memory.
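For example, a minimal sketch (Xdebug 2.x; the trace path and run_suspect_code() are made-up placeholders):
// php.ini: xdebug.show_mem_delta = 1 adds a memory-delta column to traces
xdebug_start_trace('/tmp/leak-hunt'); // writes /tmp/leak-hunt.xt
run_suspect_code();                   // hypothetical code under investigation
xdebug_stop_trace();
// open /tmp/leak-hunt.xt and scan the delta column for large positive jumps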
Then you can compare the initial (or any mid-trace) total memory to see how much it differs from the real memory usage you are seeing.
TRACE START [2007-05-06 14:37:26]
0.0003 114112 +114112 -> {main}() ../trace.php:0
Here the total memory would be 114112 bytes.
If the difference is really big, you may want to use something like shell_exec() to get the real memory usage between lines, output it, and then compare that output to Xdebug's memory output to see where the difference arises.
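For instance, a minimal sketch that reads the process RSS directly (Linux-only, via /proc; it serves the same purpose as the shell_exec() idea):
// Resident set size of the current PHP process, in kB.
function real_rss_kb() {
    $status = file_get_contents('/proc/' . getmypid() . '/status');
    return preg_match('/VmRSS:\s*(\d+)/', $status, $m) ? (int)$m[1] : -1;
}
echo 'heap: ', memory_get_usage(), " bytes, RSS: ", real_rss_kb(), " kB\n";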
If the difference is there from the very first line of the script, the culprit could be a PHP extension. Check php -m for any fishy extensions.

First of all, make an array and investigate the memory it takes:
$startMemory = memory_get_usage();
$array = range(1, 100000);
echo memory_get_usage() - $startMemory, ' bytes';
One integer is 8 bytes (on a 64-bit Unix machine, using the long type), and here we have 100000 integers, so you would expect to need 800000 bytes. That's something like 0.76 MB.
This array actually takes 14649024 bytes. That's 13.97 MB - eighteen times more than estimated.
Here is a quick summary of the memory usage of the different components involved:
                             |   64 bit  |  32 bit
--------------------------------------------------
zval                         |  24 bytes | 16 bytes
+ cyclic GC info             |   8 bytes |  4 bytes
+ allocation header          |  16 bytes |  8 bytes
==================================================
zval (value) total           |  48 bytes | 28 bytes
==================================================
bucket                       |  72 bytes | 36 bytes
+ allocation header          |  16 bytes |  8 bytes
+ pointer                    |   8 bytes |  4 bytes
==================================================
bucket (array element) total |  96 bytes | 48 bytes
==================================================
total total                  | 144 bytes | 76 bytes
Again, for large, static arrays, if I call something like:
$startMemory = memory_get_usage();
$array = new SplFixedArray(100000);
for ($i = 0; $i < 100000; ++$i) {
    $array[$i] = $i;
}
echo memory_get_usage() - $startMemory, ' bytes';
This results in 5600640 bytes.
That's 56 bytes per element and thus much less than the 144 bytes per element a normal array uses. This is because a fixed array doesn't need the bucket structure, so it only requires one zval (48 bytes) and one pointer (8 bytes) per element, giving the observed 56 bytes.
Hope this will be helpful.

Nothing is wrong with the numbers you see, and you shouldn't add them up. This is just "tripling": you are seeing the different sections (read-only, executable, writable) of each library listed separately in pmap. Your number is correct.

Related

problems with lz4 between php and golang

I try to compress data with lz4_compress in PHP and decompress it with https://github.com/pierrec/lz4 in Go,
but it fails.
It seems that the lz4_compress output is missing the LZ4 frame header, and the block data is slightly different.
Please help me solve the problem.
<?php
echo base64_encode(lz4_compress("Hello World!"));
?>
output:
DAAAAMBIZWxsbyBXb3JsZCE=
package main

import (
    "bytes"
    "encoding/base64"
    "fmt"

    "github.com/pierrec/lz4"
)

func main() {
    a, _ := base64.StdEncoding.DecodeString("DAAAAMBIZWxsbyBXb3JsZCE=")
    fmt.Printf("%b\n", a)

    buf := new(bytes.Buffer)
    w := lz4.NewWriter(buf)
    b := bytes.NewReader([]byte("Hello World!"))
    w.ReadFrom(b)
    fmt.Printf("%b\n", buf.Bytes())
}
output:
[1100 0 0 0 11000000 1001000 1100101 1101100 1101100 1101111 100000 1010111 1101111 1110010 1101100 1100100 100001]
[100 100010 1001101 11000 1100100 1110000 10111001 1100 0 0 10000000 1001000 1100101 1101100 1101100 1101111 100000 1010111 1101111 1110010 1101100 1100100 100001]
lz4.h explicitly says
lz4.h provides block compression functions. It gives full buffer control to user.
Decompressing an lz4-compressed block also requires metadata (such as compressed size). Each application is free to encode such metadata in whichever way it wants.
An additional format, called the LZ4 frame specification (doc/lz4_Frame_format.md),
takes care of encoding standard metadata alongside LZ4-compressed blocks. If your application requires interoperability, it's recommended to use it. A library is provided to take care of it; see lz4frame.h.
The PHP extension doesn't do that; it produces bare compressed blocks.
http://lz4.github.io/lz4/ explicitly lists the PHP extension as not interoperable (in the "Customs LZ4 ports and bindings" section).
Sounds good! Now try
echo -n DAAAAMBIZWxsbyBXb3JsZCE= | base64 -d
In the first 4 bytes I got 0C 00 00 00 - that is the length of the string - and the rest is Hello World!. Therefore I think that if PHP realizes that compressing such a short input is not possible, it writes the input as-is (try echo -n "Hello World!" | lz4c). But the problem is that it does not allow you to recognize such a thing, or am I wrong?
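A minimal sketch of reading that layout in PHP (assuming, per the observation above, that the extension emits a 4-byte little-endian uncompressed length followed by a bare LZ4 block):
$raw = base64_decode('DAAAAMBIZWxsbyBXb3JsZCE=');
$len = unpack('V', substr($raw, 0, 4)); // 'V' = unsigned 32-bit little-endian
$block = substr($raw, 4);               // bare LZ4 block, no frame header
echo 'uncompressed length: ', $len[1], "\n"; // prints 12
// On the Go side, decode $block with a block-level API such as
// lz4.UncompressBlock instead of the frame-level lz4.NewReader.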

PHP XML Memory Leak?

We have a severe memory leak in one of our regularly run scripts that quickly wipes out the free memory on the server. Despite many hours of research and experiments, though, I've been unable to even make a dent in it.
Here is the code:
echo '1:'.memory_get_usage()."\n";
ini_set('memory_limit', '1G');
echo '2:'.memory_get_usage()."\n";
$oXML = new DOMDocument();
echo '3:'.memory_get_usage()."\n";
$oXML->load('feed.xml'); # 556 MB file
echo '4:'.memory_get_usage()."\n";
$xpath = new DOMXPath($oXML);
echo '5:'.memory_get_usage()."\n";
$oNodes = $xpath->query('//feed/item'); # 270,401 items
echo '6:'.memory_get_usage()."\n";
unset($xpath);
echo '7:'.memory_get_usage()."\n";
unset($oNodes);
echo '8:'.memory_get_usage()."\n";
unset($oXML);
echo '9:'.memory_get_usage()."\n";
And here is the output:
1:679016
2:679320
3:680128
4:680568
5:681304
6:150852408
7:150851840
8:34169968
9:34169448
As you can see, when we use XPath to load the nodes into an object, memory usage jumps from 681,304 to 150,852,408. I'm not terribly concerned about that.
My problem is that even after destroying the $oNodes object, we're still stuck at a memory usage of 34,169,968.
But the real problem is that the memory usage PHP reports is a tiny fraction of the total memory eaten by the script. Using free -m directly on the server's command line, we go from 3,295 MB used to 5,226 MB -- and it never goes back down. We're losing 2 GB of memory every time this script runs, and I am at a complete loss as to why, or how to fix it.
I tried using SimpleXML instead, but the results were basically identical. I also studied these three threads but didn't find anything in them that helped:
XML xpath search and array looping with php, memory issue
DOMDocument / Xpath leaking memory during long command line process - any way to deconstruct this class
DOMDocument PHP Memory Leak
I'm hoping this is something easy that I'm just overlooking.
UPDATE 11/10: It does appear that memory is eventually freed up. I noticed that after a little more than 30 minutes, suddenly a big block came free again. Obviously, though, that hasn't been nearly fast enough recently to keep the server from running out of memory and locking up.
And for what it's worth, we're running PHP 5.3.15 with Apache 2.2.3 on Red Hat 5.11. We're working to update to the latest versions of all of those, so somewhere along that upgrade path, we might find this fixed. It would be great to do it before then, though.
We recently experienced an issue just like yours. We needed to extract data from a 3 GB XML file and also noticed that server memory was reaching its limits. There are several ways to decrease the memory usage:
instead of using XPath, which accounts for the great amount of memory usage, use (for example) file_get_contents and then a regular expression search to find the desired data
split the XML into smaller pieces. Basically this means rebuilding the XML file, but you can then cap the maximum size of the files (and thus the memory); a streaming sketch follows below
You mentioned that after 30 minutes some memory was released. Reading a 500 MB XML file over 30 minutes is way too slow. The solution we used was splitting the 3 GB XML file into several pieces (approx. 200). Our script writes the required data (around 700k records) to our database in less than 5 minutes.
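As an alternative to splitting the file, here is a minimal streaming sketch with XMLReader (the element name item is assumed from the question's //feed/item XPath; adjust it to the real feed). Only one node subtree is held in memory at a time:
$reader = new XMLReader();
$reader->open('feed.xml');
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        $item = new SimpleXMLElement($reader->readOuterXml());
        // process $item here; it goes out of scope on the next iteration
    }
}
$reader->close();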
We just experienced a similar issue with PHPDocxPro (which uses DomDocument) and submitted a patch to them that at least improves upon the problem. The memory usage reported by memory_get_usage() never increased, as though PHP was not aware of the allocation at all. The memory reported while watching execution via top or ps is what we were more concerned about.
// ps reports X memory usage
$foo = (new DomDocument())->loadXML(getSomeXML());
// ps reports X + Y memory usage
$foo = (new DomDocument())->loadXML(getSomeXML());
// ps reports X + ~2Y memory usage
$foo = (new DomDocument())->loadXML(getSomeXML());
// ps reports X + ~3Y memory usage
Adding an unset() before each subsequent call...
// ps reports X memory usage
$foo = (new DomDocument())->loadXML(getSomeXML());
// ps reports X + Y memory usage
unset($foo);
$foo = (new DomDocument())->loadXML(getSomeXML());
// ps reports X + ~Y memory usage
unset($foo);
$foo = (new DomDocument())->loadXML(getSomeXML());
// ps reports X + ~Y memory usage
I haven't dug into the extension code to understand what's going on, but my guess is that it allocates memory without using PHP's allocator, and as such it's not counted as part of the heap that memory_get_usage() considers. Despite this, there does appear to be some reference counting that determines whether memory can be freed. The unset($foo) before each subsequent call makes sure the extension can reuse some resources; without it, memory usage increases every time the code is run.

PHP Internal Memory Bloat

I need to process some large files, say 50 MB each. I have found that PHP functions use up large portions of memory. In the example below, the memory used by PHP's functions ends up being four (4) times the file size. I can understand a transient usage of twice the file size, but not four times. In the end PHP blows out the memory_limit. While I can increase the PHP memory_limit, it is not a good long-term solution, as I may have to process larger files, and having PHP gobble up 400 MB per process in production is not desirable.
Code:
$buf = '';
report_memory(__LINE__);
$buf = file_get_contents('./20MB.pdf');
report_memory(__LINE__);
base64_encode($buf);
report_memory(__LINE__);
urlencode($buf);
report_memory(__LINE__);
function report_memory($line = 0) {
    echo 'Line: ' . str_pad($line, 3) . ' ';
    echo 'Mem: ' . str_pad(intval(memory_get_usage() / 1024) . 'K', 8) . ' ';
    echo 'Peak: ' . str_pad(intval(memory_get_peak_usage() / 1024) . 'K', 8) . ' ';
    echo "\n";
}
Output:
Line: 4 Mem: 622K Peak: 627K
Line: 7 Mem: 21056K Peak: 21074K
Line: 10 Mem: 21056K Peak: 48302K
Line: 13 Mem: 21056K Peak: 82358K
One can see that for a 20 MB file the current memory usage hovers at 21 MB, while the peak memory usage jumps to an insane 82 MB.
The PHP functions used in the example are arbitrary; I can just as easily swap in str_replace, is_string, gettype, etc., with the same results.
The question is: how can I keep PHP from doing this?
The environment is CentOS 6.6 running a stock PHP 5.3.3.
Thanks for any insight.
You're URL-encoding. Given that your PDF is basically "random" binary garbage, MANY of the bytes in there are non-printable. That means you're going from a one-byte "binary" character to a 3+ byte URL-encoded string. Given you've got a 20 MB PDF, it's no surprise that tripling the amount of text in there bloats your memory. Remember that PHP has to keep TWO copies of your PDF while it's working: the original "raw" version, and the working copy of whatever transform you're applying to it.
Assuming a worst case where every single character gets encoded, your 20 MB PDF converts to a 60 MB URL-encoded string, causing a 20 + 60 = 80 MB peak usage, even though that 60 MB encoded version is immediately thrown away.
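If the encoded output is headed for a file or socket anyway, a streaming approach avoids ever holding both copies at once. A minimal sketch using PHP's built-in base64 stream filter (the file names are made up; urlencode has no equivalent filter, so base64 stands in here):
$in  = fopen('./20MB.pdf', 'rb');
$out = fopen('./20MB.pdf.b64', 'wb');
stream_filter_append($out, 'convert.base64-encode'); // encodes chunk by chunk
stream_copy_to_stream($in, $out); // only a small buffer stays resident
fclose($in);
fclose($out);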

PhpExcel stops working after setting 20 cell types

I have a script that generates a little xls table (~25x15). It contains percentages and strings, and I have an if statement that sets the current cell to percentage format with this code:
$this->objPHPExcel->getActiveSheet()->getStyle($coords)->getNumberFormat()->setFormatCode('0.00%');
But when I export and look at the file, I see it only managed to set the type and style of about 20 cells; all the rest have default settings. I debugged it and realized the problem isn't in my logic. I read about increasing PHP's cache memory and tried it, but it didn't work. Please help, because I need to export a table at least 15 times larger. Thanks in advance!
PHPExcel allocates quite a lot of memory
While PHPExcel is a beautiful library, using it may require huge amounts of memory allocated to PHP.
According to this thread, just 5 cells may produce 6 MB of memory usage:
<?php
require_once 'php/PHPExcelSVN/PHPExcel/IOFactory.php';
$objPHPExcel = PHPExcel_IOFactory::load("php/tinytest.xlsx");
$objPHPExcel->setActiveSheetIndex(0);
$objPHPExcel->getActiveSheet()->setCellValue('D2', 50);
echo $objPHPExcel->getActiveSheet()->getCell('D8')->getCalculatedValue() . "\n";
echo date('H:i:s') . " Peak memory usage: " . (memory_get_peak_usage(true) / 1024 / 1024) . " MB\r\n";
?>
I get 6 MB of memory usage.
Another user even failed with a 256 MB memory setting.
While PHPExcel provides ways to reduce its memory footprint, all reductions turned out to be too small in my case. This page on GitHub provides details of PHPExcel's cache management options. For example, this setting serializes and then gzips the cell structure of a worksheet:
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);
PHPExcel's FAQ explains this:
Fatal error: Allowed memory size of xxx bytes exhausted (tried to allocate yyy bytes) in zzz on line aaa
PHPExcel holds an "in memory" representation of a spreadsheet, so it
is susceptible to PHP's memory limitations. The memory made available
to PHP can be increased by editing the value of the memory_limit
directive in your php.ini file, or by using ini_set('memory_limit',
'128M') within your code (ISP permitting);
Some Readers and Writers are faster than others, and they also use
differing amounts of memory. You can find some indication of the
relative performance and memory usage for the different Readers and
Writers, over the different versions of PHPExcel, here
http://phpexcel.codeplex.com/Thread/View.aspx?ThreadId=234150
If you've already increased memory to a maximum, or can't change your
memory limit, then this discussion on the board describes some of the
methods that can be applied to reduce the memory usage of your scripts
using PHPExcel
http://phpexcel.codeplex.com/Thread/View.aspx?ThreadId=242712
Measurement results for PHPExcel
I instrumented the PHPExcel example file 01simple.php and did some quick testing.
This consumes 92 KB:
for ($n = 0; $n < 200; $n++) {
    $objPHPExcel->setActiveSheetIndex(0)->setCellValue('A' . $n, 'Miscellaneous glyphs');
}
This consumes 4164 KB:
for ($n = 0; $n < 200; $n++) {
    $objPHPExcel->setActiveSheetIndex(0)->setCellValue('A' . $n, 'Miscellaneous glyphs');
    $objPHPExcel->getActiveSheet()->getStyle('A' . $n)->getAlignment()->setWrapText(true);
}
If one executes this fragment several times unchanged, each execution consumes around 4 MB.
Checking your app is logically correct
To ensure that your app is logically correct, I'd propose increasing PHP memory first:
ini_set('memory_limit', '32M');
In my case, I had to export the result data of an online assessment application. While there are fewer than 100 cells horizontally, I needed to export up to several tens of thousands of rows. While the number of cells was big, each of my cells holds a number or a string of 3 characters - no formulas, no styles.
In case of strong memory restrictions or large spreadsheets
In my case, none of the cache options reduced memory usage as much as required, and the runtime of the application grew enormously.
Finally I had to switch over to old-fashioned CSV data file exports.
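A minimal sketch of such a CSV fallback (the column names and the fetch_result_rows() helper are hypothetical): fputcsv() writes one row at a time, so memory stays flat no matter how many rows you export.
$fh = fopen('export.csv', 'w');
fputcsv($fh, array('participant_id', 'question', 'answer')); // hypothetical columns
foreach (fetch_result_rows() as $row) { // hypothetical row source, e.g. a DB cursor
    fputcsv($fh, $row); // one row in memory at a time
}
fclose($fh);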
I ran your code locally and found that all 26 cells you set to this percentage format had the right format and a % sign. I had to uncomment lines 136-137 first, of course.
This must be related to your setup. I cannot imagine you'd have too little memory for a spreadsheet of this size.
For your information, I confirmed it worked on PHP 5.4.16 with PHPExcel version 1.7.6 (2011-02-27). I opened the spreadsheet with MS Excel 2007.
<?php
$file = 'output_log.txt';

function get_owner($file)
{
    $stat = stat($file);
    $user = posix_getpwuid($stat['uid']);
    return $user['name'];
}

$format = "UID # %s: %s\n";
printf($format, date('r'), get_owner($file));
chown($file, 'ross');
printf($format, date('r'), get_owner($file));
clearstatcache();
printf($format, date('r'), get_owner($file));
?>
clearstatcache() can be useful. Call this function at the start of the PHP page.

PHP opcode memory hog during include?

While optimizing a site for memory, I noticed a leap in memory consumption while including a large number of PHP class files (600+) for a specific purpose. Taking things apart, I noticed that including a PHP file (and thus presumably compiling it to opcodes) takes about 50 times more memory than the file size on disk.
In my case the files on disk are together around 800 kB in size (with indentation and comments; pure class declarations, not many strings), yet after including them all, memory consumption was around 40 MB higher.
I measured like this (PHP 5.3.6):
echo memory_get_usage(), "<br>\n";
include($file);
echo memory_get_usage(), "<br>\n";
Within a loop over the 600 files I can watch memory consumption grow from basically zero to 40 MB. (There is no autoloader loading additional classes, nor any global or constructor code that is executed immediately; it is really the pure include only.)
Is this normal behaviour? I assumed opcodes are more compact than pure source code (stripping out all spaces and comments, or having, for example, just one or two instruction bytes instead of a "foreach" string, etc.).
If this is normal, is there a way to optimize it? (I assume an opcode cache would just spare me the compile time, not the actual memory consumption?)
Apparently that's just the way it is.
I've retested this from the ground up:
Including an empty, zero-length file: 784 bytes of memory consumption increase
Including an empty class X { } definition: 2128 bytes
Including a class with one empty method: 2816 bytes
Including a class with two empty methods: 3504 bytes
The file size of the included file is under 150 bytes in all of these tests.
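A minimal harness for reproducing these measurements (the temp-file approach is my illustration, not the answer's; exact figures vary by PHP version and build):
$tmp = tempnam(sys_get_temp_dir(), 'inc');
file_put_contents($tmp, '<?php class X { }'); // vary the contents per test
$before = memory_get_usage();
include $tmp;
echo memory_get_usage() - $before, " bytes\n";
unlink($tmp);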
