Free PDO statement result from memory in loop - php

I am using the following code in an application based on ZF1:
$select = $db->select()->from('table', array('id', 'int', 'float'))->limit(10000, (($i - 1) * 10000));
$data = $select->query();
while ($row = $data->fetch()) {
    # ...
}
This operation is happening in a foreach loop for some 800 times. I output the memory usage for each pass and can see it increasing by about 5MB per pass. I suppose that is because Zend apparently does not free the result from the query once the pass is complete. A simple unset didn't solve the issue. Using fetchAll also did not improve (or change) the situation.
Is there any way to free the result from a Zend_Db_Statement_PDO thus freeing the memory used by it? Or do you suspect another reason?

I believe you want to do this:
$sql = "SELECT something FROM random-table-with-an-obscene-large-amount-of-entries";
$res = $db->query($sql);
while ($row = $res->fetch(Zend_Db::FETCH_NUM)) {
    // do something with the data returned in $row
}
Zend_Db::FETCH_NUM returns each row as an array indexed by integers, corresponding to the position of the respective field in the select list of the query.
Since you overwrite $row on each loop iteration, the memory should be reclaimed. If you are paranoid, you can unset($row) at the bottom of the loop. I haven't tested this recently, but I ran into a similar batch problem about a year ago, and I seem to recall using this solution.
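If the statement itself is what holds the memory, it may also be worth closing the cursor and dropping the references after each batch. A minimal sketch, assuming ZF1's Zend_Db_Statement_Pdo proxies the underlying PDOStatement (PDOStatement::closeCursor() is standard PDO):
$stmt = $select->query();
while ($row = $stmt->fetch()) {
    // process $row ...
}
$stmt->closeCursor(); // free the driver-side result set
unset($stmt, $row);   // drop the PHP-side references as well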

Actually the problem was hidden somewhere else:
Inside the loop, some integer results were stored in an array for modification at a later stage in the workflow.
While one might expect PHP arrays to be small, that is not the case: arrays grow big really fast, and a PHP array is on average 18 times larger than you would expect. Watch out when working with arrays, even if you only store integers in them!
In case the linked article disappears sometime:
In this post I want to investigate the memory usage of PHP arrays (and values in general) using the following script as an example, which creates 100000 unique integer array elements and measures the resulting memory usage:
$startMemory = memory_get_usage();
$array = range(1, 100000);
echo memory_get_usage() - $startMemory, ' bytes';
How much would you expect it to be? Simple: one integer is 8 bytes (on a 64-bit Unix machine, using the long type) and you have 100000 integers, so obviously you will need 800000 bytes. That's something like 0.76 MB.
Now try and run the above code. This gives me 14649024 bytes. Yes, you heard right, that's 13.97 MB - eighteen times more than we estimated.
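To make the overhead factor explicit, here is the arithmetic from the numbers above as a tiny script:
// Derive the per-element overhead from the measurement above.
$perElement = 14649024 / 100000; // ~146 bytes per element
$expected   = 8;                 // one 64-bit integer
printf("overhead factor: ~%.1fx\n", $perElement / $expected); // ~18.3x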

Related

Is there a way to estimate how much memory a PHP array will consume?

Suppose that you have a query and you know exactly how many results it will return and the length of every column, e.g. 1M records, int `id`, char(100) `uid`.
Is there a way to know in advance how much memory PHP will need to allocate for this associative array?
Just don't allocate 1M records as an associative array. Select only the data you need.
Best way to find out would be to do something like:
$numItems = 1000000;
$bytes = memory_get_usage();
$arr = array();                                           // cost of the empty array itself
$arrBytes = memory_get_usage() - $bytes;
$arr[] = array('id' => 1, 'uid' => str_repeat("a", 100)); // one representative row
$itemBytes = memory_get_usage() - $bytes - $arrBytes;
echo "Total memory usage estimate: " . ($arrBytes + ($itemBytes * $numItems));
Disclaimer: I'm not completely sure how accurate this will be, but it should give a ballpark figure. It will also vary on different systems (e.g. 32-bit versus 64-bit integers) and with different encodings (e.g. single-byte versus multi-byte).
Having said all this (and as @YourCommonSense noted), the answer to "how much memory will a million-item array consume" is "too much". Consider paginating or the like to reduce the number of items you need.
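If you do need a number anyway, a slightly more robust variant of the snippet above (a sketch; the same accuracy caveats apply) averages over a sample of rows instead of extrapolating from a single one, which smooths out PHP's chunked allocations:
// Estimate the per-row cost by averaging over a sample of rows.
$sample = 1000;
$before = memory_get_usage();
$arr = array();
for ($i = 0; $i < $sample; $i++) {
    $arr[] = array('id' => $i, 'uid' => str_repeat('a', 100));
}
$perItem = (memory_get_usage() - $before) / $sample;
echo "Estimate for 1M rows: " . round($perItem * 1000000) . " bytes\n";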

Amount of data stored in a PHP array

I'm looking for a way to measure the amount of data stored in a PHP array. I'm not talking about the number of elements in the array (which you can figure out with count($array, COUNT_RECURSIVE)), but the cumulative amount of data from all the keys and their corresponding values. For instance:
array('abc'=>123); // size = 6
array('a'=>1,'b'=>2); // size = 4
As what I'm interested in is order of magnitude rather than the exact amount (I want to compare the processing memory and time usage versus the size of the arrays) I thought about using the following trick:
strlen(print_r($array,true));
However, the amount of overhead coming from print_r varies depending on the structure of the array, which doesn't give me consistent results:
echo strlen(print_r(array('abc'=>123),true)); // 27
echo strlen(print_r(array('a'=>1,'b'=>2),true)); // 35
Is there a way (ideally a one-liner, and without impacting performance too much, as I need to execute this at run-time in production) to measure the amount of data stored in an array in PHP?
Does this do the trick:
<?php
$arr = array('abc'=>123);
echo strlen(implode('',array_keys($arr)).implode('',$arr));
?>
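Note that this only works one level deep: implode() would render any nested array as the literal string "Array". If you need nesting, a recursive variant could look like this (dataSize() is a hypothetical helper, not a built-in):
// Sums the string length of all keys and scalar values,
// descending into nested arrays.
function dataSize(array $a) {
    $size = 0;
    foreach ($a as $k => $v) {
        $size += strlen((string)$k);
        $size += is_array($v) ? dataSize($v) : strlen((string)$v);
    }
    return $size;
}
echo dataSize(array('abc' => 123));       // 6
echo dataSize(array('a' => 1, 'b' => 2)); // 4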
Short answer: mission impossible
You could try something like:
strlen(serialize($myArray)) // either this
strlen(json_encode($myArray)) // or this
But to approximate the true memory footprint of an array, you will have to do a lot more than that. If you're looking for a ballpark estimate, arrays take 3-8x more memory than their serialized version, depending on what you store in them and how many elements you have. The footprint increases gradually, in bigger and bigger chunks as your array grows. To give you an idea of what's happening, here's an array-estimation function I came up with after many hours of trying, for one-level arrays only:
function estimateArrayFootprint($a) { // copied from one of my failed quests :(
    $size = 0;
    foreach ($a as $k => $v) {
        foreach ([$k, $v] as $x) {
            $n = strlen($x);
            do {
                if ($n > 8192) { $n = (1 + ($n >> 12)) << 12; break; }
                if ($n > 1024) { $n = (1 + ($n >>  9)) <<  9; break; }
                if ($n >  512) { $n = (1 + ($n >>  8)) <<  8; break; }
                if ($n >    0) { $n = (1 + ($n >>  5)) <<  5; break; }
            } while (0);
            $size += $n + 96;
        }
    }
    return $size;
}
So that's how easy it is... not. And again, it's not a reliable estimation; it probably depends on the PHP memory limit, the architecture, the PHP version and a lot more. The question is how accurately you need this value.
Also, let's not forget that these values came from memory_get_usage(1), which is not very exact either. PHP allocates memory in big blocks in order to avoid noticeable overhead as your string/array/whatever grows, like in a for(...) $x .= "yada" situation.
I wish I could say anything more useful.
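For what it's worth, here is a quick way to sanity-check such an estimate against reality (a sketch, not part of the original quest):
// Compare the estimate against an actual memory_get_usage() delta.
$before = memory_get_usage();
$a = array_fill(0, 1000, str_repeat('x', 50));
$measured = memory_get_usage() - $before;
echo estimateArrayFootprint($a), ' estimated vs ', $measured, " measured\n";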

Memory optimization in PHP array

I'm working with a large array which is a height map, 1024x1024, and of course I'm stuck at the memory limit. On my test machine I can increase the memory limit to 1 GB if I want, but on my tiny VPS with only 256 MB of RAM it's not an option.
I've been searching on Stack Overflow and Google and found several answers along the lines of "well, you're not using PHP for its memory efficiency; ditch it and rewrite it in C++", and honestly, that's OK, and I recognize PHP loves memory.
But when digging deeper into PHP memory management, I did not find how much memory each data type consumes, or whether casting to another data type reduces memory consumption.
The only "optimization" technique I found was to unset variables and arrays; that's it.
Would converting the code to C++ using some PHP parser solve the problem?
Thanks!
If you want a real indexed array, use SplFixedArray. It uses less memory. Also, PHP 5.3 has a much better garbage collector.
Other than that, well, PHP will use more memory than a more carefully written C/C++ equivalent.
Memory usage for a 1024x1024 integer array, as measured by memory_get_peak_usage():
Standard array: 218,756,848 bytes
SplFixedArray: 92,914,208 bytes
$array = new SplFixedArray(1024 * 1024); // or: $array = array();
for ($i = 0; $i < 1024 * 1024; ++$i) {
    $array[$i] = 0;
}
echo memory_get_peak_usage();
Note that the same array in C using 64-bit integers would be 8M.
As others have suggested, you could pack the data into a string. This is slower but much more memory efficient. If you are using 8-bit values it's super easy:
$x = str_repeat(chr(0), 1024*1024);
$x[$i] = chr($v & 0xff); // store value $v into $x[$i]
$v = ord($x[$i]); // get value $v from $x[$i]
Here the memory will only be about 1.5MB (that is, when considering the entire overhead of PHP with just this integer string array).
For the fun of it, I created a simple benchmark of creating 1024x1024 8-bit integers and then looping through them once. The packed versions all used ArrayAccess so that the user code looked the same.
                  mem     write   read
array             218M    0.589s  0.176s
packed array      32.7M   1.85s   1.13s
packed spl array  13.8M   1.91s   1.18s
packed string     1.72M   1.11s   1.08s
The packed arrays used native 64-bit integers (only packing 7 bytes to avoid dealing with signed data) and the packed string used ord and chr. Obviously implementation details and computer specs will affect things a bit, but I would expect you to get similar results.
So while the array was 6x faster, it also used 125x the memory of the next best alternative: packed strings. Obviously speed is irrelevant if you are running out of memory. (When I used packed strings directly, without an ArrayAccess class, they were only 3x slower than native arrays.)
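For reference, the ArrayAccess wrapper in such a benchmark would look something like this (a reconstruction, not the original benchmark code; on PHP 8.1+ these methods also want return type declarations):
// ArrayAccess wrapper over a packed string of 8-bit values.
class PackedByteArray implements ArrayAccess
{
    private $data;

    public function __construct($size)
    {
        $this->data = str_repeat(chr(0), $size);
    }

    public function offsetExists($i)  { return $i >= 0 && $i < strlen($this->data); }
    public function offsetGet($i)     { return ord($this->data[$i]); }
    public function offsetSet($i, $v) { $this->data[$i] = chr($v & 0xff); }
    public function offsetUnset($i)   { $this->data[$i] = chr(0); }
}

$a = new PackedByteArray(1024 * 1024);
$a[42] = 200;
echo $a[42]; // 200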
In short, I would use something other than pure PHP to process this data if speed is of any concern.
In addition to the accepted answer and the suggestions in the comments, I'd like to suggest the PHP Judy array implementation.
Quick tests showed interesting results: an array with 1 million entries using the regular PHP array data structure takes ~200 MB, SplFixedArray uses around 90 MB, and Judy uses 8 MB. The tradeoff is performance: Judy takes about double the time of the regular PHP array implementation.
A little bit late to the party, but if you have a multidimensional array you can save a lot of RAM by storing the complete array as JSON.
$array = [];
$data = [];
$data["a"] = "hello";
$data["b"] = "world";
To store this array just use:
$array[] = json_encode($data);
instead of
$array[] = $data;
If you want to get the array back, just use something like:
$myData = json_decode($array[0], true);
I had a big array with 275,000 sets and saved about 36% of RAM consumption.
EDIT:
I found an even better way: compress the JSON string with gzip:
$array[] = gzencode(json_encode($data));
and unzip it when you need it:
$myData = json_decode(gzdecode($array[0]), true);
This saved me nearly 75% of RAM peak usage.
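If you want to verify the saving on your own data before committing to this, a minimal sketch (numbers will vary; for very small rows the gzip header can even make things larger):
// Measure the memory used by N gzipped-JSON rows.
$rows = array();
$before = memory_get_usage();
for ($i = 0; $i < 10000; $i++) {
    $data = array('a' => str_repeat('hello', 20), 'b' => str_repeat('world', 20));
    $rows[] = gzencode(json_encode($data));
}
echo (memory_get_usage() - $before) . " bytes for the gzipped version\n";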

More than 640 000 elements in the array - memory problem [Dijkstra]

I have a script which builds an 803*803 (644,809-cell) graph with the value 1,000,000 inside each cell. With ~500*500 everything works fine, but now it crashes: it tries to allocate more than 64 MB of memory (which I don't have). What's the solution? Somehow "split" it, or...?
$result = mysql_query("SELECT * FROM some_table", $connection);
confirm($result);
while ($rows = mysql_fetch_array($result)) {
    $result2 = mysql_query("SELECT * FROM some_table", $connection);
    confirm($result2);
    while ($rows2 = mysql_fetch_array($result2)) {
        $first = $rows["something"];
        $second = $rows2["something2"];
        $graph[$first][$second] = 1000000;
    }
}
It's for Dijkstra's algorithm.
P.S. No, I can't allocate more than 64 MB.
Try freeing your inner SQL result at the end of each loop iteration, using mysql_free_result($result2); the PHP runtime may not do it for you, depending on the PHP version (the garbage collector may not be enabled, or may be ineffective in a very old PHP version).
Also, do not instantiate the two temporary variables inside the loop; use the mysql_fetch_array() results directly, as in $graph[$rows["something"]][$rows2["something2"]] = 1000000;, and you will save two memory allocations per iteration.
PS: This is micro-optimization, but it may help you save enough memory to fit into your 64 MB. Don't forget that with 64 * 1024 * 1024 bytes of memory you have, on average, at most about 104 bytes for each of your 644,809 elements, plus the array itself, plus whatever temporary data your algorithm allocates.
If it doesn't fit, consider splitting your matrix and processing it in batches, trading one memory-hungry run for several smaller ones.
If your code sample above actually matches your real code, you're fetching the same result set twice (the second time even inside a loop). If it's the same dataset, fetching it once from the database should be enough, and will reduce database load, execution time and memory footprint all at once.
Perhaps the following approach may work in your memory-restricted environment.
$result = mysql_unbuffered_query("SELECT * FROM some_table", $connection);
confirm($result);
$rawData = array();
while ($rows = mysql_fetch_assoc($result)) {
    $rawData[] = array($rows["something"], $rows["something2"]);
}
mysql_free_result($result);

$graph = array();
foreach ($rawData as $r1) {
    foreach ($rawData as $r2) {
        $graph[$r1[0]][$r2[1]] = 1000000;
    }
}
unset($rawData);
Notes:
I'm using mysql_fetch_assoc() instead of mysql_fetch_array() because the latter returns every column twice (once numerically indexed and once indexed by column name).
Using mysql_unbuffered_query() instead of mysql_query() might also reduce the memory footprint (depending on the actual dataset size).
Try using an adjacency list (http://en.wikipedia.org/wiki/Adjacency_list) to represent the graph instead of an adjacency matrix (I think you are using a matrix because of $graph[$first][$second] = 1000000;).
For a sparse graph, it takes less memory; a sketch follows below.
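A minimal sketch of the idea, with a hypothetical $edges list standing in for the rows from the database:
// Adjacency list: store only the edges that exist, not a full matrix.
$edges = array(array(1, 2), array(1, 3), array(2, 3)); // hypothetical [from, to] rows
$graph = array();
foreach ($edges as $edge) {
    $graph[$edge[0]][] = $edge[1]; // record the neighbour only
}
// Dijkstra's algorithm then walks $graph[$node]; a missing entry simply
// means "no edge", instead of a matrix cell that still costs memory.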
If you insist on using PHP for high-memory operations (which is not really a good idea to begin with), I would partition the graph into quadrants and use GD to combine the quadrants. That way you would only have to build the graph with a quarter of the memory footprint.
Again, this is not ideal, but you're trying to use a nail to drive in a hammer :D

serialize a large array in PHP?

I am curious: is there a size limit on serialize() in PHP? Would it be possible to serialize an array with 5,000 keys and values so it can be stored in a cache?
I am hoping to cache a user's friend list on a social network site. The cache will need to be updated fairly often, but it will need to be read on almost every page load.
On a single-server setup, I am assuming APC would be better than memcache for this.
As quite a few other people have answered already, just for fun, here's a very quick benchmark (dare I call it that?); consider the following code:
$num = 1;
$list = array_fill(0, 5000, str_repeat('1234567890', $num));

$before = microtime(true);
for ($i = 0; $i < 10000; $i++) {
    $str = serialize($list);
}
$after = microtime(true);

var_dump($after - $before);
var_dump(memory_get_peak_usage());
I'm running this on PHP 5.2.6 (the one bundled with Ubuntu jaunty).
And, yes, there are only values, no keys; and the values are quite simple: no objects, no sub-arrays, nothing but strings.
For $num = 1, you get:
float(11.8147978783)
int(1702688)
For $num = 10, you get:
float(13.1230671406)
int(2612104)
And, for $num = 100, you get:
float(63.2925770283)
int(11621760)
So, it seems the bigger each element of the array is, the longer it takes (which seems fair, actually). But for elements 100 times bigger, it doesn't take 100 times longer...
Now, with an array of 50000 elements, instead of 5000, which means this part of the code is changed :
$list = array_fill(0, 50000, str_repeat('1234567890', $num));
With $num = 1, you get:
float(158.236332178)
int(15750752)
Considering the time it took for 1, I won't be running this for either $num = 10 or $num = 100...
Yes, of course, in a real situation you wouldn't be doing this 10000 times; so let's try with only 10 iterations of the for loop.
For $num = 1:
float(0.206310987473)
int(15750752)
For $num = 10:
float(0.272629022598)
int(24849832)
And for $num = 100:
float(0.895547151566)
int(114949792)
Yeah, that's almost 1 second -- and quite a bit of memory used ^^
(No, this is not a production server : I have a pretty high memory_limit on this development machine ^^ )
So, in the end, to be a bit shorter than those numbers (and, yes, you can have numbers say whatever you want them to), I wouldn't say there is a "limit" as in "hardcoded" in PHP, but you'll end up facing one of these:
max_execution_time (generally, on a webserver, it's never more than 30 seconds)
memory_limit (on a webserver, it's generally not much more than 32MB)
the load your webserver will have: while one of those big serialize loops was running, it occupied one of my CPUs; if you have quite a few users on the same page at the same time, I let you imagine what that will give ;-)
the patience of your users ^^
But, unless you are really serializing long arrays of big data, I am not sure it will matter that much...
And you must take into consideration the amount of time/CPU load that using that cache might help you gain ;-)
Still, the best way to know would be to test by yourself, with real data ;-)
And you might also want to take a look at what Xdebug can do when it comes to profiling : this kind of situation is one of those it is useful for!
The serialize() function is only limited by available memory.
There's no limit enforced by PHP. serialize() returns a byte-stream representation (a string) of the serialized structure, so you would just get a large string.
The only practical limit is your available memory, since serialization involves creating a string in memory.
There is no limit, but remember that serialization and unserialization have a cost.
Unserialization is extremely costly.
A less costly way of caching that data would be via var_export(), like so (since PHP 5.1.0, it works on objects):
$largeArray = array(1,2,3,'hello'=>'world',4);
file_put_contents('cache.php', "<?php\nreturn " .
    var_export($largeArray, true) .
    ';');
You can then simply retrieve the array by doing the following:
$largeArray = include('cache.php');
Resources are usually not cache-able.
Unfortunately, if you have circular references in your array, you'll need to use serialize().
As suggested by Thinker above:
You could use
$string = json_encode($your_array_here);
and to decode it:
$array = json_decode($string, true);
This returns an array. It works well even if the encoded array was multi-level.
OK... more numbers! (PHP 5.3.0 on OS X, no opcode cache)
@Pascal's code on my machine for $num = 1 at 10k iterations produces:
float(18.884856939316)
int(1075900)
I added unserialize() to the above, like so:
$num = 1;
$list = array_fill(0, 5000, str_repeat('1234567890', $num));

$before = microtime(true);
for ($i = 0; $i < 10000; $i++) {
    $str = serialize($list);
    $list = unserialize($str);
}
$after = microtime(true);

var_dump($after - $before);
var_dump(memory_get_peak_usage());
which produces:
float(50.204112052917)
int(1606768)
I assume the extra 600k or so is the serialized string.
I was curious about var_export() and its include/eval partner. Using $str = var_export($list, true); instead of serialize() in the original produces:
float(57.064643859863)
int(1066440)
so just a little less memory (at least for this simple example), but way more time already.
Adding in eval('$list = ' . $str . ';'); instead of unserialize() in the above produces:
float(126.62566018105)
int(2944144)
indicating there's probably a memory leak somewhere when doing eval() :-/
So again, these aren't great benchmarks (I really should isolate the eval/unserialize by putting the string in a local variable or something, but I'm being lazy), but they show the associated trends. var_export() seems slow.
Nope, there is no limit, and this:
set_time_limit(0);
ini_set('memory_limit', -1);
unserialize('s:2000000000:"a";');
is why you should have safe_mode = On or an extension like Suhosin installed; otherwise it will eat up all the memory in your system.
I think json_encode() is better than serialize(). It has a drawback: associative arrays and objects are not distinguished. But the string result is smaller and easier for a human to read, and therefore also to debug and edit.
If you want to cache it (so I assume performance is the issue), use apc_add() instead, to avoid the performance hit of converting it to a string, and to gain an in-memory cache.
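For example (a sketch assuming the APC extension; the key name, TTL and loader function are made up):
// Keep the friend list as a native array in APC's in-memory cache.
$key = 'friends:' . $userId;                  // $userId is hypothetical
if (($friends = apc_fetch($key)) === false) {
    $friends = loadFriendListFromDb($userId); // hypothetical loader
    apc_add($key, $friends, 300);             // cache for 5 minutes
}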
As stated above, the only size limit is available memory.
A few other gotchas:
serialized data is not portable between multi-byte and single-byte character encodings.
PHP5 classes include NUL bytes that can cause havoc in code that doesn't expect them.
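The NUL-byte gotcha is easy to see with a private property (the \0 markers are escaped here for display):
// Private properties are serialized with "\0ClassName\0" name prefixes.
class Foo { private $bar = 1; }
echo str_replace("\0", '\0', serialize(new Foo));
// O:3:"Foo":1:{s:8:"\0Foo\0bar";i:1;}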
Your use case sounds like you're better off using a database for this rather than relying solely on PHP's available resources. The advantage of using something like MySQL instead is that it's specifically engineered with memory management in mind for things like storage and lookup.
It's really no fun constantly serializing and unserializing data just to update or change a few pieces of information.
I've just come across an instance where I thought I was hitting an upper limit of serialization.
I'm persisting serialized objects to a database using a MySQL TEXT field.
The limit for single-byte characters is 65,535, so while I can serialize much larger objects than that with PHP, it's impossible to unserialize them, as they are truncated by the limit of the TEXT field.
I have a case in which unserialize() throws an exception on a large serialized object of size 65,535 bytes (the magic number: a full 16 bits = 65,536).
