When decompressing with gzinflate, I found that, under certain circumstances, the following code results in out-of-memory errors. Tested with PHP 5.3.20 on a 32-bit Linux (Amazon Linux AMI on EC2).
$memoryLimit = Misc::bytesFromShorthand(ini_get('memory_limit')); // 256MB
$memoryUsage = memory_get_usage(); // 2MB in actual test case
$remaining = $memoryLimit - $memoryUsage;
$factor = 0.9;
$maxUncompressedSize = max(1, floor($factor * $remaining) - 1000);
$uncompressedData = gzinflate($compressedData, $maxUncompressedSize);
Although I calculated $maxUncompressedSize conservatively, hoping to give gzinflate sufficient memory, I still get:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 266143484 bytes) in foo.php on line 123
When I change the value of $factor from 0.9 to 0.4, the error goes away in this case. In other cases 0.9 is fine.
I wonder:
- Is the reason for the error really that gzinflate needs more than double the space of the uncompressed data?
- Is there possibly some other reason?
- Is $remaining really the memory still at the application's disposal?
It is indeed possible. IMHO, the issue lies with memory_get_usage(true).
Using true should give a higher memory usage value, because it takes everything PHP has allocated from the system into account, not just what the script is actively using.
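To see the difference between the two modes, and to watch what gzinflate() actually allocates at its peak, a quick harness like this can help (my own sketch, not from the original post; the exact numbers vary by PHP build and platform):

<?php
// Compare the two memory_get_usage() modes and watch gzinflate()'s
// peak allocation on a known quantity of compressible data.
$original   = str_repeat('A', 10 * 1024 * 1024); // 10 MB
$compressed = gzdeflate($original, 9);
unset($original);

printf("usage:        %d bytes\n", memory_get_usage());     // what the script uses
printf("usage (true): %d bytes\n", memory_get_usage(true)); // what PHP took from the OS

$before = memory_get_peak_usage(true);
$out    = gzinflate($compressed, 12 * 1024 * 1024);         // cap the output at 12 MB
printf("inflated %d bytes, peak grew by %d bytes\n",
    strlen($out), memory_get_peak_usage(true) - $before);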
Related
I'm creating a script that needs to exit when it hits a certain amount of memory usage. Here's my test script:
<?php
$memory_sapper = array();
$i = 0;
while (true) {
    array_push($memory_sapper, $i);
    print($i);
    print(PHP_EOL);
    print((memory_get_usage() / 1024) / 1024 . " Mb");
    print(PHP_EOL);
    print(memory_get_usage() . " bytes");
    print(PHP_EOL);
    if ($i == 10000) {
        print("Memory Limit: " . ini_get('memory_limit'));
        print(PHP_EOL);
        exit;
    }
    $i++;
}
When I run php -d memory_limit=0.5M test.php, memory usage starts at 0.3M and grows to 0.8M. I expected the script to error out around $i == 5000, but it doesn't; it runs straight past the memory limit of 0.5M.
print("Memory Limit: " . ini_get('memory_limit')); in the script shows that the memory limit is set to 0.5M.
Even php -d memory_limit=0 test.php doesn't keep the script from running.
If I increase $i to 100k, the script finally errors at 1.38 MB, or 1447328 bytes according to memory_get_usage. I'm unsure why it doesn't error sooner. Also, the number of bytes according to memory_get_usage isn't very close to the number of bytes recorded in the error. Here's the error message:
PHP Fatal error: Allowed memory size of 2097152 bytes exhausted (tried to allocate 2097160 bytes) in ~/test.php on line 7
Any insights would be valuable. Thank you!
"0.5M" is not a valid value for memory_limit. If you look up memory_limit in the PHP manual, it links to this FAQ about the shorthand notation it accepts, which gives exactly that example:
Note that the numeric value is cast to int; for instance, 0.5M is interpreted as 0.
So to set it to half a megabyte, you need to specify a whole number of kilobytes or bytes: memory_limit=500K or memory_limit=512000
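For reference, here is a minimal sketch of a shorthand-to-bytes converter in the spirit of the Misc::bytesFromShorthand() helper used in the first question (whose implementation isn't shown there), mirroring PHP's own parsing rules:

<?php
// Convert php.ini shorthand ("500K", "256M", "1G") to bytes:
// the numeric prefix is cast to int, then multiplied by the
// K/M/G suffix (powers of 1024), as the PHP FAQ describes.
function bytesFromShorthand($value)
{
    $value  = trim($value);
    $suffix = strtoupper(substr($value, -1));
    $number = (int)$value; // "0.5M" casts to 0, exactly as PHP does it
    switch ($suffix) {
        case 'G': return $number * 1024 * 1024 * 1024;
        case 'M': return $number * 1024 * 1024;
        case 'K': return $number * 1024;
        default:  return $number; // plain bytes; -1 means "no limit"
    }
}

var_dump(bytesFromShorthand('0.5M')); // int(0)
var_dump(bytesFromShorthand('500K')); // int(512000)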
Opening a JPEG image using imagecreatefromjpeg can easily lead to fatal errors, because the memory needed exceeds the memory_limit.
A .jpg file that is less than 100 KB in size can easily exceed 2000x2000 pixels, which will take about 20-25 MB of memory when opened. "The same" 2000x2000 px image may take up 5 MB on disk at a different compression level.
So I obviously cannot use the filesize to determine if it can be opened safely.
How can I determine if a file will fit in memory before opening it, so I can avoid fatal errors?
According to several sources the memory needed is up to 5 bytes per pixel depending on a few different factors such as bit-depth. My own tests confirm this to be roughly true.
On top of that there is some overhead that needs to be accounted for.
But by examining the image dimensions - which can easily be done without loading the image - we can roughly estimate the memory needed and compare it with (an estimate of) the memory available like this:
$filename = 'black.jpg';
//Get image dimensions
$info = getimagesize($filename);
//Each pixel needs 5 bytes, and there will obviously be some overhead - In a
//real implementation I'd probably reserve at least 10B/px just in case.
$mem_needed = $info[0] * $info[1] * 6;
//Find out (roughly!) how much is available
// - this can easily be refined, but that's not really the point here
$mem_total = intval(str_replace(array('G', 'M', 'K'), array('000000000', '000000', '000'), ini_get('memory_limit')));
//Find current usage - AFAIK this is _not_ directly related to
//the memory_limit... but it's the best we have!
$mem_available = $mem_total - memory_get_usage();
if ($mem_needed > $mem_available) {
die('That image is too large!');
}
//Do your thing
$img = imagecreatefromjpeg($filename);
This is only tested superficially, so I'd suggest further testing with a lot of different images and using these functions to check that the calculations are fairly correct in your specific environment:
//Set some low limit to make sure you will run out
ini_set('memory_limit', '10M');
//Use this to check the peak memory at different points during execution
$mem_1 = memory_get_peak_usage(true);
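Putting those together, a rough verification harness might look like this (my own sketch; the 5 bytes per pixel figure is the estimate from above, not an exact constant):

<?php
ini_set('memory_limit', '64M'); // low enough that estimation errors become visible
$filename = 'black.jpg';        // any test image

$info   = getimagesize($filename);
$before = memory_get_peak_usage(true);
$img    = imagecreatefromjpeg($filename);
$after  = memory_get_peak_usage(true);

printf("%dx%d px => estimated %d bytes, actual growth %d bytes\n",
    $info[0], $info[1], $info[0] * $info[1] * 5, $after - $before);
imagedestroy($img);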
Using php.exe 5.2.17.17 on Windows 7, this:
include_once('simple_html_dom.php');

function fix($setlink)
{
    $setaddr  = $setlink->href;
    $filename = "..\\" . urldecode($setaddr);
    $set      = file_get_contents($filename);
    $setstr   = str_get_html($set);
    // Do stuff requiring whole file
    unset($set);
    unset($setstr);
}

$setindexpath = "..\\index.htm";
foreach (file_get_html($setindexpath)->find('a.setlink') as $setlink)
{
    fix($setlink);
}
(relying on external data files) fails thus:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) in [snip]\simple_html_dom.php on line 620
"function fix" is a suggestion from the answer to a similar question here. unset() is wishful thinking :-)
How may I avoid the continuing consumption of memory by the strings unused on the next loop iteration? Without defacing the code too much. And while providing the whole file as string.
Try $setstr->clear(); before unset($setstr);
See http://simplehtmldom.sourceforge.net/manual_faq.htm#memory_leak
Side note: $setstr seems to be a misnomer; it's not a string but the DOM representation of the HTML doc.
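Applied to the code above, fix() would become something like this (a sketch of the original function with the clear() call added):

function fix($setlink)
{
    $setaddr  = $setlink->href;
    $filename = "..\\" . urldecode($setaddr);
    $set      = file_get_contents($filename);
    $setstr   = str_get_html($set);
    // Do stuff requiring whole file
    $setstr->clear(); // breaks simple_html_dom's internal circular references
    unset($setstr);
    unset($set);
}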
I am receiving the following error on a script when it reaches the bind section of this code.
Error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 4294967296 bytes)
Code:
$result = $dbc->prepare('SELECT make, line, model, name, year, image, processor_family, processor_speeds, shipped_ram, max_ram, ram_type, video_card, video_memory, screen_type, screen_size, hard_drives, internal_drives, external_drives, audio_card, speakers, microphone, networking, external_ports, internal_ports, description, driver_link, manufacturer_link FROM laptop_archive where name=?');
$result->bind_param('s', $name);
$result->execute();
$result->bind_result($make, $line, $model, $name, $year, $image, $processor_family, $processor_speeds, $shipped_ram, $max_ram, $ram_type, $video_card, $video_memory, $screen_type, $screen_size, $hard_drive, $internal_drives, $external_drives, $audio_card, $speakers, $microphone, $networking, $external_ports, $internal_ports, $description, $driver_link, $manufacturer_link);
The portion of the database it is attempting to access has a number of fields to retrieve, although none of them contain a large amount of data; the majority are around 20 characters, and the description is around 300 characters.
I have searched and found a few answers, although none of them have worked for me. The code is running on a VPS with 512 MB RAM and has a memory limit of 256 MB as set in php.ini.
The amount of memory to be allocated is 4 GB, which seems extremely excessive for what is in the database. Am I missing something stupid here?
Normally it shouldn't take any huge amount of memory, because you didn't fetch the data yet.
$res = $mysqli->query("SELECT * FROM laptop_archive WHERE name=[REPLACE THIS] LIMIT 1");
$row = $res->fetch_assoc();
Please try this one :)
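If the mysqlnd driver is available, a prepared statement with get_result() also sidesteps bind_result()'s pre-allocated buffers; a sketch (get_result() requires mysqlnd):

$stmt = $dbc->prepare('SELECT * FROM laptop_archive WHERE name = ? LIMIT 1');
$stmt->bind_param('s', $name);
$stmt->execute();
// Result buffers are sized to the actual row data rather than the
// column type's theoretical maximum (4 GB for LONGBLOB/LONGTEXT).
$row = $stmt->get_result()->fetch_assoc();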
The issue appears to be caused by the use of LONGTEXT in the database; after changing the type to VARCHAR, the issue went away.
I had this error two days ago. From what I have learned about it, the problem lies in the maximum size of the column you are selecting from. In my case I was selecting a PDF, and I had the column as a LONGBLOB, which is 4294967296 bytes. The server was allocating that much to grab the file regardless of the actual size of the file, so I had to change the column to a MEDIUMBLOB and it works fine. In your case it looks like it would be the image column; I would change that to a MEDIUMBLOB and it should work fine. Otherwise you would have to configure your server to allow bigger allocations.
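For reference, the column change described above is a single statement; a sketch using the table and column names from the question (assuming image is the offending LONGBLOB column; MEDIUMBLOB caps each value at 16 MB):

// Shrink the column so the client no longer reserves ~4 GB per row for it.
$dbc->query('ALTER TABLE laptop_archive MODIFY image MEDIUMBLOB');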
Can JavaScript string store 100K characters?
I've written a script where a string from PHP is passed to a variable in JavaScript. It works fine when the string is cut down to roughly ten thousand characters, but it breaks when attempting to pass the entire string, whose length is a bit greater than 100K. No errors are reported, though. Is there any solution for this, such as some way of increasing the character limit of a JavaScript variable? I'm just a beginner and would appreciate it if someone could find a solution.
The ECMAScript Standard ECMA-262 (6th Edition, June 2015) says
6.1.4 The String Type
The String type is the set of all ordered sequences of zero or more 16-bit unsigned integer values ("elements") up to a maximum length of 2^53 - 1 elements.
So don't plan on using more than 9,007,199,254,740,991 or about 9 quadrillion characters. Of course, you should be prepared for systems which cannot allocate 18 PB chunks of memory, as this is not required for conforming ECMAScript implementations.
I think the question is asking about the practical limit, not the spec limit. And no, it is not always the amount of RAM you have. I have an x86_64 24 GB PC running Linux Mint with x86_64 Firefox and x86_64 Chrome, and the limits I ran into were:
1,073,741,822 limit in Firefox 84
536,870,888 limit in Chrome 87
Any higher and Firefox throws an Uncaught RangeError: repeat count must be less than infinity and not overflow maximum string size, whereas Chrome throws an Uncaught RangeError: Invalid string length. Use the following snippet to run a binary search for the max string length in your browser:
for (var startPow2 = 1; startPow2 < 9007199254740992; startPow2 *= 2) {
    try {
        " ".repeat(startPow2);
    } catch (e) {
        break;
    }
}
var floor = Math.floor, mask = floor(startPow2 / 2);
while (startPow2 = floor(startPow2 / 2)) {
    try {
        " ".repeat(mask + startPow2);
        mask += startPow2; // the previous statement succeeded
    } catch (e) {}
}
console.log("The max string length for this browser is " + mask);
There is no theoretical limit in JS or PHP on the size of their strings.
I think there are a few possible situations.
Firstly, check that you are not sending your string via HTTP GET. There is a maximum size for a GET request, and I think it is dependent on your web server.
Secondly, if you do use POST, check post_max_size in php.ini and see if it is smaller than the string you are sending, and also check your .htaccess file to make sure php_value post_max_size is not set too small.
Thirdly, check in php.ini that your memory_limit does not restrict the amount of memory your script can use.
Hope this helps.
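To quickly inspect those settings from the receiving script, a minimal sketch (assuming the string arrives as a raw POST body):

printf("post_max_size: %s\n", ini_get('post_max_size'));
printf("memory_limit:  %s\n", ini_get('memory_limit'));
printf("raw POST body: %d bytes\n", strlen(file_get_contents('php://input')));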