I am curious: is there a size limit on serialize() in PHP? Would it be possible to serialize an array with 5,000 keys and values so it can be stored in a cache?
I am hoping to cache a user's friend list on a social network site; the cache will need to be updated fairly often, but it will need to be read on almost every page load.
On a single server setup I am assuming APC would be better than memcache for this.
As quite a few other people have answered already, just for fun, here's a very quick benchmark (dare I call it that?); consider the following code:
$num = 1;
$list = array_fill(0, 5000, str_repeat('1234567890', $num));
$before = microtime(true);
for ($i=0 ; $i<10000 ; $i++) {
$str = serialize($list);
}
$after = microtime(true);
var_dump($after-$before);
var_dump(memory_get_peak_usage());
I'm running this on PHP 5.2.6 (the one bundled with Ubuntu jaunty).
And, yes, there are only values; no keys; and the values are quite simple: no objects, no sub-arrays, nothing but strings.
For $num = 1, you get :
float(11.8147978783)
int(1702688)
For $num = 10, you get :
float(13.1230671406)
int(2612104)
And, for $num = 100, you get :
float(63.2925770283)
int(11621760)
So, it seems the bigger each element of the array is, the longer it takes (which seems fair, actually). But for elements 100 times bigger, it doesn't take 100 times as long...
Now, with an array of 50000 elements, instead of 5000, which means this part of the code is changed :
$list = array_fill(0, 50000, str_repeat('1234567890', $num));
With $num = 1, you get :
float(158.236332178)
int(15750752)
Considering the time it took for $num = 1, I won't be running this for $num = 10 or $num = 100...
Yes, of course, in a real situation, you wouldn't be doing this 10000 times; so let's try with only 10 iterations of the for loop.
For $num = 1 :
float(0.206310987473)
int(15750752)
For $num = 10 :
float(0.272629022598)
int(24849832)
And for $num = 100 :
float(0.895547151566)
int(114949792)
Yeah, that's almost 1 second -- and quite a bit of memory used ^^
(No, this is not a production server : I have a pretty high memory_limit on this development machine ^^ )
So, in the end, to summarize those numbers -- and, yes, you can make numbers say whatever you want them to -- I wouldn't say there is a "limit" hardcoded in PHP, but you'll end up facing one of these:
max_execution_time (generally, on a webserver, it's never more than 30 seconds)
memory_limit (on a webserver, it's generally not much more than 32MB)
the load your webserver will have: while one of those big serialize loops was running, it took up one of my CPUs; if you have quite a few users on the same page at the same time, I'll let you imagine what that will give ;-)
the patience of your users ^^
But, unless you are really serializing long arrays of big data, I am not sure it will matter that much...
And you must take into consideration the amount of time/CPU-load using that cache might help you gain ;-)
Still, the best way to know would be to test by yourself, with real data ;-)
And you might also want to take a look at what Xdebug can do when it comes to profiling : this kind of situation is one of those it is useful for!
The serialize() function is only limited by available memory.
There's no limit enforced by PHP. Serialize returns a bytestream representation (string) of the serialized structure, so you would just get a large string.
The only practical limit is your available memory, since serialization involves creating a string in memory.
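To make that concrete, here's a minimal sketch of the round trip (the array contents are my own example):

```php
<?php
// serialize() just returns a string -- no hardcoded cap, only memory_limit.
$str = serialize(['a' => 1, 'b' => [2, 3]]);
// $str === 'a:2:{s:1:"a";i:1;s:1:"b";a:2:{i:0;i:2;i:1;i:3;}}'
$back = unserialize($str);
```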
There is no limit, but remember that serialization and unserialization has a cost.
Unserialization is extremely costly.
A less costly way of caching that data would be via var_export() as such (since PHP 5.1.0, it works on objects):
$largeArray = array(1,2,3,'hello'=>'world',4);
file_put_contents('cache.php', "<?php\nreturn ".
var_export($largeArray, true).
';');
You can then simply retrieve the array by doing the following:
$largeArray = include('cache.php');
Resources are usually not cache-able.
Unfortunately, if you have circular references in your array, you'll need to use serialize().
As suggested by Thinker above:
You could use
$string = json_encode($your_array_here);
and to decode it (note that json_decode() takes the JSON string from above, not the array):
$array = json_decode($string, true);
This returns an array. It works well even if the encoded array was multilevel.
Ok... more numbers! (PHP 5.3.0 OSX, no opcode cache)
Pascal's code on my machine for $num = 1 at 10k iterations produces:
float(18.884856939316)
int(1075900)
I add unserialize() to the above, like so:
$num = 1;
$list = array_fill(0, 5000, str_repeat('1234567890', $num));
$before = microtime(true);
for ($i=0 ; $i<10000 ; $i++) {
$str = serialize($list);
$list = unserialize($str);
}
$after = microtime(true);
var_dump($after-$before);
var_dump(memory_get_peak_usage());
produces
float(50.204112052917)
int(1606768)
I assume the extra 600k or so is the serialized string.
I was curious about var_export and its include/eval partner. Using $str = var_export($list, true); instead of serialize() in the original produces:
float(57.064643859863)
int(1066440)
so just a little less memory (at least for this simple example) but way more time already.
Adding in eval('$list = '.$str.';'); instead of unserialize() in the above produces:
float(126.62566018105)
int(2944144)
This indicates there's probably a memory leak somewhere when doing eval() :-/
So again, these aren't great benchmarks (I really should isolate the eval/unserialize by putting the string in a local var or something, but I'm being lazy) but they show the associated trends. var_export seems slow.
Nope, there is no limit and this:
set_time_limit(0);
ini_set('memory_limit', -1);
unserialize('s:2000000000:"a";');
is why you should have safe_mode = On or an extension like Suhosin installed; otherwise it will eat up all the memory on your system.
I think json_encode() is better than serialize(). It has a drawback, in that associative arrays and objects are not distinguished, but the string result is smaller and easier for a human to read, and therefore also to debug and edit.
If you want to cache it (so I assume performance is the issue), use apc_add instead, to avoid the performance hit of converting it to a string, and to gain an in-memory cache.
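A hedged sketch of that read-through pattern; the apc_add()/apc_fetch() calls are replaced by a plain array here so the example is self-contained, and the key name and the DB stand-in are my own inventions:

```php
<?php
// Read-through cache sketch. In real code the $cache array would be
// APC (apc_fetch / apc_add) rather than a PHP array.
$cache = [];

function cachedFriendList($userId, array &$cache)
{
    $key = "friends:$userId";
    if (isset($cache[$key])) {
        return $cache[$key];            // hit: no serialization, no DB query
    }
    $friends = range(1, 5000);          // stand-in for the real DB lookup
    $cache[$key] = $friends;            // e.g. apc_add($key, $friends, $ttl);
    return $friends;
}

$miss = cachedFriendList(42, $cache);   // populates the cache
$hit  = cachedFriendList(42, $cache);   // served straight from memory
```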
As stated above the only size limit is available memory.
A few other gotchas:
serialize'd data is not portable between multi-byte and single-byte character encodings.
The serialized form of PHP5 objects includes NUL bytes (around private/protected property names) that can cause havoc with code that doesn't expect them.
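One common workaround for the NUL-byte gotcha (an assumption on my part, not something the answer above prescribes) is to base64-encode the serialized string before handing it to a binary-unsafe storage layer:

```php
<?php
// base64 keeps the payload to [A-Za-z0-9+/=], so NUL bytes inside the
// serialized form can't be mangled by a text-only store.
$data = ['user' => "O'Brien", 'flags' => [1, 0, 1]];
$safe = base64_encode(serialize($data));
$restored = unserialize(base64_decode($safe));
```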
Your use case sounds like you're better off using a database rather than relying solely on PHP's available resources. The advantage of using something like MySQL instead is that it's specifically engineered with memory management in mind for such things as storage and lookup.
It's really no fun constantly serializing and unserializing data just to update or change a few pieces of information.
I've just come across an instance where I thought I was hitting an upper limit of serialisation.
I'm persisting serialised objects to a database using a mysql TEXT field.
The limit for single-byte characters is 65,535, so while I can serialize much larger objects than that with PHP, it's impossible to unserialize them, as they are truncated by the limit of the TEXT field.
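A small sketch of guarding against that silent truncation (the 65,535-byte figure is the single-byte TEXT limit; the length check is my own suggestion, not from the answer above):

```php
<?php
// A serialized payload bigger than a TEXT column silently truncates on
// insert, and the truncated string can no longer be unserialized.
$payload   = serialize(range(1, 20000));
$textLimit = 65535;                        // max bytes in a MySQL TEXT column
$fits      = strlen($payload) <= $textLimit; // false here: use MEDIUMTEXT/LONGTEXT

$truncated = substr($payload, 0, $textLimit);
$recovered = @unserialize($truncated);     // false: the data is unrecoverable
```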
I have a case in which unserialize() throws an exception on a large serialized object, size: 65,535 (the magic number: 16 bits, all set).
Related
In my PHP script I need to create an array of >600k integers. Unfortunately my webserver's memory_limit is set to 32M, so when initializing the array the script aborts with the message:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 71 bytes) in /home/www/myaccount/html/mem_test.php on line 8
I am aware of the fact, that PHP does not store the array values as plain integers, but rather as zvalues which are much bigger than the plain integer value (8 bytes on my 64-bit system). I wrote a small script to estimate how much memory each array entry uses and it turns out, that it's pretty exactly 128 bytes. 128!!! I'd need >73M just to store the array. Unfortunately the webserver is not under my control so I cannot increase the memory_limit.
My question is, is there any possibility in PHP to create an array-like structure that uses less memory. I don't need this structure to be associative (plain index-access is sufficient). It also does not need to have dynamic resizing - I know exactly how big the array will be. Also, all elements would be of the same type. Just like a good old C-array.
Edit:
So deceze's solution works out-of-the-box with 32-bit integers. But even if you're on a 64-bit system, pack() does not seem to support 64-bit integers. In order to use 64-bit integers in my array I applied some bit-manipulation. Perhaps the below snippets will be of help for someone:
function push_back(&$storage, $value)
{
// split the 64-bit value into two 32-bit chunks, then pass these to pack().
$storage .= pack('ll', ($value>>32), $value);
}
function get(&$storage, $idx)
{
// read two 32-bit chunks from $storage and glue them back together.
return (current(unpack('l', substr($storage, $idx * 8, 4)))<<32 |
current(unpack('l', substr($storage, $idx * 8+4, 4))));
}
The most memory efficient you'll get is probably by storing everything in a string, packed in binary, and use manual indexing to it.
$storage = '';
$storage .= pack('l', 42);
// ...
// get 10th entry
$int = current(unpack('l', substr($storage, 9 * 4, 4)));
This can be feasible if the "array" initialisation can be done in one fell swoop and you're just reading from the structure. If you need a lot of appending to the string, this becomes extremely inefficient. Even this can be done using a resource handle though:
$storage = fopen('php://memory', 'r+');
fwrite($storage, pack('l', 42));
...
This is very efficient. You can then read this buffer back into a variable and use it as string, or you can continue to work with the resource and fseek.
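A short sketch of that round trip through php://memory (the values are my own examples):

```php
<?php
// Write two packed 32-bit ints, read one back by seeking, then dump the
// whole buffer into a string.
$storage = fopen('php://memory', 'r+');
fwrite($storage, pack('l', 42));
fwrite($storage, pack('l', -7));

fseek($storage, 1 * 4);                          // jump to entry #1
$second = current(unpack('l', fread($storage, 4)));

rewind($storage);
$asString = stream_get_contents($storage);       // the raw 8-byte buffer
fclose($storage);
```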
A PHP Judy Array will use significantly less memory than a standard PHP array, and an SplFixedArray.
I quote "An array with 1 million entries using regular PHP array data structure takes 200MB. SplFixedArray uses around 90 megabytes. Judy uses 8 megs. Tradeoff is in performance, Judy takes about double the time of regular php array implementation."
You could use an object if possible; these often use less memory than arrays.
Also, SplFixedArray is a good option.
But it really depends on the implementation you need. If you need a function to return an array and you are using PHP 5.5, you could use a generator (yield) to stream the array back.
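A minimal sketch of that generator idea (PHP 5.5+; the doubling is just a stand-in for however the values are actually produced):

```php
<?php
// Yields values one at a time instead of materialising a 600k-element array;
// only one value lives in memory at any moment.
function intStream($count)
{
    for ($i = 0; $i < $count; $i++) {
        yield $i * 2;   // stand-in for the real value computation
    }
}

$sum = 0;
foreach (intStream(600000) as $n) {
    $sum += $n;
}
```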
You can try using an SplFixedArray; it's faster and takes less memory (the doc comments say ~30% less). Test here and here.
Use a string - that's what I'd do. Store the values in a string at fixed offsets (16 or 20 digits should do it, I guess?) and use substr() to get the one needed. Blazingly fast writes/reads, super easy, and 600,000 integers will only take ~12MB to store.
base_convert() - if you need something more compact with minimum effort, convert your integers to base 36 instead of base 10; in this case, a 14-digit number would be stored in 9 alphanumeric characters. You'll need to make two pieces out of each 64-bit int, but I'm sure that's not a problem. (I'd split them into 9-digit chunks, where the conversion gives you a 6-char version.)
pack()/unpack() - binary packing is the same thing with a bit more efficiency. Use it if nothing else works; split your numbers to make them fit into two 32-bit pieces.
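A tiny sketch of the base_convert() option above (the example number is my own; note that base_convert() works on the decimal string representation):

```php
<?php
// A 9-digit decimal chunk fits in at most 6 base-36 characters.
$chunk   = 987654321;
$encoded = base_convert((string) $chunk, 10, 36);   // "gc0uy9"
$decoded = (int) base_convert($encoded, 36, 10);
```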
600K is a lot of elements. If you are open to alternative methods, I personally would use a database for that. Then use standard sql/nosql select syntax to pull things out. Perhaps memcache or redis if you have an easy host for that, such as garantiadata.com. Maybe APC.
Depending on how you are generating the integers, you could potentially use PHP's generators, assuming you are traversing the array and doing something with the individual values.
I took the answer by deceze and wrapped it in a class that can handle 32-bit integers. It is append-only, but you can still use it as a simple, memory-optimized PHP array, queue, or heap. AppendItem and ItemAt are both O(1), and it has no memory overhead. I added currentPosition/currentSize to avoid unnecessary fseek calls. If you need to cap memory usage and switch to a temporary file automatically, use php://temp instead.
class MemoryOptimizedArray
{
private $_storage;
private $_currentPosition;
private $_currentSize;
const BYTES_PER_ENTRY = 4;
function __construct()
{
$this->_storage = fopen('php://memory', 'r+');
$this->_currentPosition = 0;
$this->_currentSize = 0;
}
function __destruct()
{
fclose($this->_storage);
}
function AppendItem($value)
{
if($this->_currentPosition != $this->_currentSize)
{
fseek($this->_storage, 0, SEEK_END);
}
fwrite($this->_storage, pack('l', $value));
$this->_currentSize += self::BYTES_PER_ENTRY;
$this->_currentPosition = $this->_currentSize;
}
function ItemAt($index)
{
$itemPosition = $index * self::BYTES_PER_ENTRY;
if($this->_currentPosition != $itemPosition)
{
fseek($this->_storage, $itemPosition);
}
$binaryData = fread($this->_storage, self::BYTES_PER_ENTRY);
$this->_currentPosition = $itemPosition + self::BYTES_PER_ENTRY;
$unpackedElements = unpack('l', $binaryData);
return $unpackedElements[1];
}
}
$arr = new MemoryOptimizedArray();
for($i = 0; $i < 3; $i++)
{
$v = rand(-2000000000,2000000000);
$arr->AppendItem($v);
print("added $v\n");
}
for($i = 0; $i < 3; $i++)
{
print($arr->ItemAt($i)."\n");
}
for($i = 2; $i >=0; $i--)
{
print($arr->ItemAt($i)."\n");
}
Looking at the file_get_contents documentation, I see it states that the file will be loaded into memory as a string, but will file() do the same for the array it creates?
Does file() store the data in RAM?
Yes, PHP stores strings, objects, arrays, etc. in memory unless explicitly stored to disk.
Here is a great test case about how much memory arrays really use: http://nikic.github.io/2011/12/12/How-big-are-PHP-arrays-really-Hint-BIG.html
From the article:
<?php
$startMemory = memory_get_usage();
$array = range(1, 100000);
echo memory_get_usage() - $startMemory, ' bytes';
How much would you expect it to be? Simple, one integer is 8 bytes (on
a 64 bit unix machine and using the long type) and you got 100000
integers, so you obviously will need 800000 bytes. That’s something
like 0.76 MBs.
Now try and run the above code. You can do it online if you want. This
gives me 14649024 bytes. Yes, you heard right, that’s 13.97 MB -
eighteen times more than we estimated.
I am using the following code in an application based on ZF1:
$select = $db->select()->from('table', array('id', 'int', 'float'))->limit(10000, (($i - 1) * 10000));
$data = $select->query();
while ($row = $data->fetch()) {
# ...
}
This operation is happening in a foreach loop for some 800 times. I output the memory usage for each pass and can see it increasing by about 5MB per pass. I suppose that is because Zend apparently does not free the result from the query once the pass is complete. A simple unset didn't solve the issue. Using fetchAll also did not improve (or change) the situation.
Is there any way to free the result from a Zend_Db_Statement_PDO thus freeing the memory used by it? Or do you suspect another reason?
I believe you want to do this:
$sql = "SELECT something FROM random-table-with-an-obscene-large-amount-of-entries";
$res = $db->query($sql);
while ($row = $res->fetch(Zend_Db::FETCH_NUM)) {
// do some with the data returned in $row
}
Zend_Db::FETCH_NUM - return data in an array of arrays. The arrays are indexed by integers, corresponding to the position of the respective field in the select-list of the query.
Since you overwrite $row on each loop, the memory should be reclaimed. If you are paranoid you can unset($row) at the bottom of the loop I believe. I've not tested this myself recently, but I ran into a batch problem about a year ago that was similar, and I seem to recall using this solution.
Actually the problem was hidden somewhere else:
Inside the loop some integer results were stored in an array for modification at a later planned stage in the workflow.
While one might expect PHP arrays to be small, that is not the case: arrays grow big really fast, and a PHP array is on average 18 times larger than you would expect. Watch out while working with arrays, even if you only store integers in them!
In case the linked article disappears sometime:
In this post I want to investigate the memory usage of PHP arrays (and values in general) using the following script as an example, which creates 100000 unique integer array elements and measures the resulting memory usage:
$startMemory = memory_get_usage();
$array = range(1, 100000);
echo memory_get_usage() - $startMemory, ' bytes';
How much would you expect it to be? Simple, one integer is 8 bytes (on a 64 bit unix machine and using the long type) and you got 100000 integers, so you obviously will need 800000 bytes. That’s something like 0.76 MBs.
Now try and run the above code. This gives me 14649024 bytes. Yes, you heard right, that's 13.97 MB - eighteen times more than we estimated.
I'm looking for a way to measure the amount of data stored in a PHP array. I'm not talking about the number of elements in the array (which you can figure out with count($array, COUNT_RECURSIVE)), but the cumulative amount of data from all the keys and their corresponding values. For instance:
array('abc'=>123); // size = 6
array('a'=>1,'b'=>2); // size = 4
As what I'm interested in is order of magnitude rather than the exact amount (I want to compare the processing memory and time usage versus the size of the arrays) I thought about using the following trick:
strlen(print_r($array,true));
However the amount of overhead coming from print_r varies depending on the structure of the array which doesn't give me consistent results:
echo strlen(print_r(array('abc'=>123),true)); // 27
echo strlen(print_r(array('a'=>1,'b'=>2),true)); // 35
Is there a way (ideally in a one-liner and without impacting too much performance as I need to execute this at run-time on production) to measure the amount of data stored in an array in PHP?
Does this do the trick:
<?php
$arr = array('abc'=>123);
echo strlen(implode('',array_keys($arr)).implode('',$arr));
?>
Short answer: mission impossible
You could try something like:
strlen(serialize($myArray)) // either this
strlen(json_encode($myArray)) // or this
But to approximate the true memory footprint of an array, you will have to do a lot more than that. If you're looking for a ballpark estimate, arrays take 3-8x more than their serialized version, depending on what you store in them and how many elements you have. It increases gradually, in bigger and bigger chunks as your array grows. To give you an idea of what's happening, here's an array estimation function I came up with, after many hours of trying, for one-level arrays only:
function estimateArrayFootprint($a) { // copied from one of my failed quests :(
$size = 0;
foreach($a as $k=>$v) {
foreach([$k,$v] as $x) {
$n = strlen($x);
do{
if($n>8192 ) {$n = (1+($n>>12)<<12);break;}
if($n>1024 ) {$n = (1+($n>> 9)<< 9);break;}
if($n>512 ) {$n = (1+($n>> 8)<< 8);break;}
if($n>0 ) {$n = (1+($n>> 5)<< 5);break;}
}while(0);
$size += $n + 96;
}
}
return $size;
}
So that's how easy it is... not. And again, it's not a reliable estimation; it probably depends on the PHP memory limit, the architecture, the PHP version, and a lot more. The question is how accurately you need this value.
Also let's not forget that these values came from a memory_get_usage(1) which is not very exact either. PHP allocates memory in big blocks in order to avoid a noticeable overhead as your string/array/whatever else grows, like in a for(...) $x.="yada" situation.
I wish I could say anything more useful.
I'm working with a large array which is a height map, 1024x1024, and of course I'm stuck at the memory limit. On my test machine I can increase the memory limit to 1GB if I want, but on my tiny VPS with only 256MB of RAM, it's not an option.
I've been searching Stack Overflow and Google and found several answers along the lines of "well, you are not using PHP for memory efficiency; ditch it and rewrite it in C++", and honestly, that's ok; I recognize that PHP loves memory.
But when digging deeper into PHP memory management, I did not find how much memory each data type consumes, or whether casting to another data type reduces memory consumption.
The only "optimization" technique I found was to unset variables and arrays; that's it.
Would converting the code to C++ using one of the PHP-to-C++ converters solve the problem?
Thanks!
If you want a real indexed array, use SplFixedArray. It uses less memory. Also, PHP 5.3 has a much better garbage collector.
Other than that, well, PHP will use more memory than a more carefully written C/C++ equivalent.
Memory Usage for 1024x1024 integer array:
Standard array: 218,756,848
SplFixedArray: 92,914,208
as measured by memory_get_peak_usage()
$array = new SplFixedArray(1024 * 1024); // array();
for ($i = 0; $i < 1024 * 1024; ++$i)
$array[$i] = 0;
echo memory_get_peak_usage();
Note that the same array in C using 64-bit integers would be 8M.
As others have suggested, you could pack the data into a string. This is slower but much more memory efficient. If using 8 bit values it's super easy:
$x = str_repeat(chr(0), 1024*1024);
$x[$i] = chr($v & 0xff); // store value $v into $x[$i]
$v = ord($x[$i]); // get value $v from $x[$i]
Here the memory will only be about 1.5MB (that is, when considering the entire overhead of PHP with just this integer string array).
For the fun of it, I created a simple benchmark of creating 1024x1024 8-bit integers and then looping through them once. The packed versions all used ArrayAccess so that the user code looked the same.
                 mem     write   read
array            218M    0.589s  0.176s
packed array     32.7M   1.85s   1.13s
packed spl array 13.8M   1.91s   1.18s
packed string    1.72M   1.11s   1.08s
The packed arrays used native 64-bit integers (only packing 7 bytes to avoid dealing with signed data) and the packed string used ord and chr. Obviously implementation details and computer specs will affect things a bit, but I would expect you to get similar results.
So while the array was 6x faster, it also used 125x the memory of the next best alternative: packed strings. Obviously the speed is irrelevant if you are running out of memory. (When I used packed strings directly, without an ArrayAccess class, they were only 3x slower than native arrays.)
In short, to summarize, I would use something other than pure PHP to process this data if speed is of any concern.
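For reference, here is a hedged sketch of the kind of ArrayAccess wrapper those "packed" rows describe, for the 8-bit string case (the class and method names are mine, not the benchmark author's):

```php
<?php
// Wraps an 8-bit packed string so it can be indexed like an array:
// ~1 byte per entry instead of a full zval per element.
class PackedByteArray implements ArrayAccess
{
    private $data;

    public function __construct($size)
    {
        $this->data = str_repeat(chr(0), $size);
    }

    #[\ReturnTypeWillChange]
    public function offsetGet($i) { return ord($this->data[$i]); }

    #[\ReturnTypeWillChange]
    public function offsetSet($i, $v) { $this->data[$i] = chr($v & 0xff); }

    #[\ReturnTypeWillChange]
    public function offsetExists($i) { return $i >= 0 && $i < strlen($this->data); }

    #[\ReturnTypeWillChange]
    public function offsetUnset($i) { $this->data[$i] = chr(0); }
}

$a = new PackedByteArray(1024 * 1024);  // ~1MB instead of ~200MB
$a[12345] = 200;
```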
In addition to the accepted answer and suggestions in the comments, I'd like to suggest PHP Judy array implementation.
Quick tests showed interesting results. An array with 1 million entries using regular PHP array data structure takes ~200 MB. SplFixedArray uses around 90 megabytes. Judy uses 8 megs. Tradeoff is in performance, Judy takes about double the time of regular php array implementation.
A little bit late to the party, but if you have a multidimensional array, you can save a lot of RAM by storing the complete array as JSON.
$array = [];
$data = [];
$data["a"] = "hello";
$data["b"] = "world";
To store this array just use:
$array[] = json_encode($data);
instead of
$array[] = $data;
If you want to get the array back, just use something like:
$myData = json_decode($array[0], true);
I had a big array with 275,000 sets and saved about 36% of RAM consumption.
EDIT:
I found an even better way: zip the JSON string:
$array[] = gzencode(json_encode($data));
and unzip it when you need it:
$myData = json_decode(gzdecode($array[0]), true);
This saved me nearly 75% of RAM peak usage.