PHP memory_limit set to -1 but still out of memory

I have a PHP CLI application that needs a lot of memory. I set memory_limit to -1 in php.ini. Unfortunately, I still get this error:
Fatal error: Out of memory (allocated 870318080) (tried to allocate 138412032 bytes)
At the time, my system had more than 4 GB of free memory (12 GB installed), so this is not a hardware limit.
I also tried to set it in the code:
<?php
ini_set('memory_limit','-1');
?>
I also tried setting it to 2048M, but the error was still there. It always occurs at around 1 GB, so is there some hard-coded 1 GB limit?
My system:
Windows 7
PHP 5.6.20 (Thread safe 32bit without Suhosin and without safe_mode)
UPDATE:
Here is a script to reproduce it:
<?php
ini_set("memory_limit", -1);

echo formatSizeUnits(memory_get_usage()).PHP_EOL;
for ($i = 0; $i <= 10; $i++) {
    $content[$i] = file_get_contents("http://mirror.softaculous.com/apache/lucene/solr/6.0.0/solr-6.0.0.zip");
    echo formatSizeUnits(memory_get_usage()).PHP_EOL;
}

function formatSizeUnits($bytes)
{
    if ($bytes >= 1073741824) {
        $bytes = number_format($bytes / 1073741824, 2) . ' GB';
    } elseif ($bytes >= 1048576) {
        $bytes = number_format($bytes / 1048576, 2) . ' MB';
    } elseif ($bytes >= 1024) {
        $bytes = number_format($bytes / 1024, 2) . ' KB';
    } elseif ($bytes > 1) {
        $bytes = $bytes . ' bytes';
    } elseif ($bytes == 1) {
        $bytes = $bytes . ' byte';
    } else {
        $bytes = '0 bytes';
    }
    return $bytes;
}
?>
My Output:
php test.php
123.37 KB
164.25 MB
328.37 MB
492.49 MB
PHP Fatal error: Out of memory (allocated 688914432) (tried to allocate 171966464 bytes) in test.php on line 12
UPDATE 2:
I installed an x64 PHP 5.6.20 and can now get past this limit. Is this a documented limit of x86 PHP?

I believe the theoretical limit a 32-bit process can address is about 3 GB (there are, of course, tricks), but PHP in particular can hardly reach 1 GB before crashing, at least in my experience.
You should at least attempt to optimise your code, because to begin with you're loading a collection of ZIP archives into memory, and there's probably no good reason to.
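If the goal is just to save each archive to disk, a streamed copy keeps memory flat regardless of file size. A minimal sketch under that assumption (`downloadToFile` is a hypothetical helper, not part of the question's code):

```php
<?php
// Stream a URL (or any readable source) to a local file chunk by chunk,
// so peak memory stays roughly constant instead of growing with the file.
function downloadToFile(string $url, string $dest): int
{
    $in  = fopen($url, 'rb');
    $out = fopen($dest, 'wb');
    if ($in === false || $out === false) {
        throw new RuntimeException("Cannot open $url or $dest");
    }
    // stream_copy_to_stream() copies in internal chunks, never the whole file.
    $bytes = stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
    return $bytes; // number of bytes written
}
```

With this, each loop iteration writes to disk, so memory_get_usage() should stay near the baseline even for multi-hundred-megabyte archives.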

Related

PHP unpack overflows variable memory limit

I'm currently trying to echo the raw bytes of a DLL file in hex format. The unpack() call overflows the memory limit, which is not settable through my hosting service at the moment. Is there any way to read the file in parts, into three or more variables, or some other method to output and echo the bytes?
The file size is around 1.98 MB (1,990,656 bytes); yes, I know the buffer PHP needs for this is much bigger.
The following error occurred:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 67108872 bytes)
Thanks for any help.
ini_set('memory_limit', '-1');

$fileName = "driver.dll";
$fsize = filesize($fileName);
$hex = unpack("C*", file_get_contents($fileName));
$hexa_string = "";
foreach ($hex as $v) {
    $hexa_string .= sprintf("%02X", $v);
}
echo $hexa_string;
You'd have to use the C-style file functions: fopen, fseek and fread. For example, to read only the last $size bytes:
$size = 1024 * 1000;
$handle = fopen($file, 'r');
fseek($handle, -$size, SEEK_END); // seek relative to the end of the file
$limitedContent = fread($handle, $size);
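To avoid unpack()'s per-byte array overhead entirely, you can read the file in fixed-size chunks and convert each chunk with bin2hex(), so only one chunk is ever in memory. A sketch (the function name and chunk size are arbitrary choices):

```php
<?php
// Print a file as uppercase hex without loading it whole:
// read a chunk at a time and convert it with bin2hex().
function echoFileAsHex(string $fileName, int $chunkSize = 8192): void
{
    $handle = fopen($fileName, 'rb');
    if ($handle === false) {
        throw new RuntimeException("Cannot open $fileName");
    }
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        // Same output as the sprintf("%02X", ...) loop, one chunk at a time.
        echo strtoupper(bin2hex($chunk));
    }
    fclose($handle);
}
```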

Allowed memory size of 134217728 bytes exhausted (tried to allocate 62926848 bytes)

I am facing a "memory size exhausted" problem in Laravel 5.5.
The same code worked in version 5.4, but not anymore.
I increased the memory limit in php.ini to memory_limit = 1024M, but it didn't help.
Basically, I am converting a Base64-encoded file and then storing it in local storage on my PC or server.
Controller Code
public static function convertBase64ToFile($file, $dir)
{
    $pos = strpos($file, ';');
    $type = explode(':', substr($file, 0, $pos))[1];
    $format = explode('/', $type);
    $exploded = explode(',', $file);
    $decoded = base64_decode($exploded[1]);
    $extension = ''; // avoid an undefined variable if the check below fails
    if (str_contains($exploded[0], $format[1])) {
        $extension = $format[1];
    }
    $filename = str_random() . '.' . $extension;
    $path = public_path() . $dir . $filename;
    file_put_contents($path, $decoded);
    return $filename;
}
message:
"Allowed memory size of 134217728 bytes exhausted (tried to allocate 65015808 bytes)", "exception": "Symfony\Component\Debug\Exception\FatalErrorException",
In WAMP you have two php.ini files. One is in \wamp\bin\php\php.x.y.z, but that one is only used by the CLI; the second is in \wamp\bin\apache\apache2.x.y\bin\. You should check the second one.
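Besides pointing at the right php.ini, the spike itself can often be avoided: base64_decode() holds both the encoded and the decoded string in memory at once. PHP's convert.base64-decode stream filter decodes straight into the output file instead. A sketch, assuming the data-URI prefix has already been split off (as the question's explode(',', $file) does); `base64ToFile` is a hypothetical helper:

```php
<?php
// Decode base64 data into a file via a stream filter, so the decoded
// bytes are written out incrementally instead of built up in memory.
function base64ToFile(string $base64, string $path): void
{
    $out = fopen($path, 'wb');
    if ($out === false) {
        throw new RuntimeException("Cannot open $path");
    }
    // The filter decodes everything written to $out before it hits disk.
    stream_filter_append($out, 'convert.base64-decode', STREAM_FILTER_WRITE);
    fwrite($out, $base64);
    fclose($out);
}
```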

Allowed memory size Laravel (unzipping file with Laravel)

I'm trying to unzip a big dump file and I run into this common problem:
local.ERROR: Symfony\Component\Debug\Exception\FatalErrorException:
Allowed memory size of 134217728 bytes exhausted (tried to allocate
123732000 bytes) in /Users/ ...
I know that I can increase memory limit and should work, but I think the problem is in my code and I'm doing something wrong:
public function unzip() {
    // set input and output files
    $out = 'storage/app/dump/auct_lots_full.sql';
    $in  = 'storage/app/dump/auct_lots_full.sql.bz2';
    // decompress file using BZIP2
    if (file_exists($in)) {
        $data = '';
        $bz = bzopen($in, 'r') or die('ERROR: Cannot open input file!');
        while (!feof($bz)) {
            $data .= bzread($bz, 4096) or die('ERROR: Cannot read from input file');
        }
        bzclose($bz);
        file_put_contents($out, $data) or die('ERROR: Cannot write to output file!');
        echo 'Decompression complete.';
    }
}
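The memory blow-up comes from appending every 4 KB chunk to $data before writing anything out. Writing each chunk as soon as it is read keeps usage at roughly one chunk. A sketch of the loop restructured that way, using plain stream handles; for the bzip2 case you would open the input with bzopen() and swap fread() for bzread():

```php
<?php
// Copy from one handle to another in fixed-size chunks so only one
// chunk is ever held in memory, regardless of the total file size.
function copyInChunks($inHandle, $outHandle, int $chunkSize = 4096): int
{
    $total = 0;
    while (!feof($inHandle)) {
        $chunk = fread($inHandle, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        // Write immediately instead of accumulating into a string.
        $total += fwrite($outHandle, $chunk);
    }
    return $total; // bytes written
}
```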

php.ini upload_max_filesize falling back to default 2M

I have to upload a 5.7 MB database in Drupal 7 using the "Backup and Migrate" module. But when I upload the file, it throws the following error:
The file email839805758.zip could not be saved, because it exceeds 2 MB, the maximum allowed size for uploads.
I have changed post_max_size = 20M and upload_max_filesize = 40M in php.ini, and created /etc/php/5.6/apache2/conf.d/user.ini with post_max_size and upload_max_filesize set greater than 2M. I have checked phpinfo(); it still shows the default value of 2M. Does anybody have a solution for this kind of scenario in Drupal 7?
I have also found some code in Drupal's backup_migrate.module file which might be the barrier. Help me understand this function.
/**
 * A custom version of format_size() which treats 1 GB as 1000 MB rather than
 * 1024 MB. This is the more standard and expected convention for storage
 * (as opposed to memory).
 */
function backup_migrate_format_size($size, $langcode = LANGUAGE_NONE) {
  $precision = 2;
  $multiply = pow(10, $precision);
  if ($size == 0) {
    return t('0 bytes', array(), array('langcode' => $langcode));
  }
  if ($size < 1024) {
    return format_plural($size, '1 byte', '#count bytes', array(), array('langcode' => $langcode));
  }
  else {
    $size = ceil($size * $multiply / 1024);
    $string = '#size KB';
    if ($size >= (1024 * $multiply)) {
      $size = ceil($size / 1024);
      $string = '#size MB';
    }
    if ($size >= 1000 * $multiply) {
      $size = ceil($size / 1000);
      $string = '#size GB';
    }
    if ($size >= 1000 * $multiply) {
      $size = ceil($size / 1000);
      $string = '#size TB';
    }
    return t($string, array('#size' => round($size / $multiply, $precision)), array('langcode' => $langcode));
  }
}
Put phpinfo() in your script and see which php.ini file you are using, then edit the values in that file. Keep in mind that the daemon (php-fpm or Apache) has to be restarted for the changes to take effect.
Note that ini_set() will not help here: upload_max_filesize and post_max_size are PHP_INI_PERDIR directives and cannot be changed at runtime. Set them in php.ini, a .user.ini file, or the virtual host configuration instead.
Check with phpinfo() which config file was actually loaded. It looks to me like you edited the wrong php.ini. Also, don't forget to restart your web server (Apache?).
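For reference, the two directives have to be raised together, because post_max_size caps the entire request body, including the uploaded file. A typical fragment for the Apache-side php.ini (40M is just the value from the question, not a recommendation):

```ini
; post_max_size applies to the whole POST body, so keep it at least
; as large as upload_max_filesize or uploads will still be rejected.
upload_max_filesize = 40M
post_max_size = 40M
```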

How to occasionally shorten files?

This is the code I use:
if (mt_rand(0, 20000) == 0) {
    $lines = file($fileName);      // note: each element keeps its trailing "\n"
    if (count($lines) > 50000) {
        $lines = array_slice($lines, -50000, 50000, true);
    }
    $result = implode('', $lines); // was implode("\n", lines): missing $ and doubled newlines
    file_put_contents($fileName, $result); // FILE_APPEND would re-append rather than replace
}
I often got this error:
[25-Nov-2013 23:20:40 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 33 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 02:41:54 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 27 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 09:56:49 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 72 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 12:44:32 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 2097152 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 13:53:31 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 2097152 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
I guess reading the whole file may not be a good idea if we just want to shorten it by erasing its beginning.
Does anyone know an alternative?
fopen, fwrite and fseek will probably come in handy for you.
I think you only want the last 50000 lines in your file:
if (mt_rand(0, 20000) == 0) {
    $tmp_file = $fileName . '.tmp';
    // escapeshellarg() guards against spaces or metacharacters in the name
    $cmd = 'tail -n 50000 ' . escapeshellarg($fileName) . ' > ' . escapeshellarg($tmp_file);
    exec($cmd);
    rename($tmp_file, $fileName);
}
Update, for pure PHP:
I created a file with about 1,000,000 lines:
<?php
$file_name = 'tmp.dat';
$f = fopen($file_name, 'w');
for ($i = 0; $i < 1000000; $i++) {
    fwrite($f, str_pad($i, 100, 'x') . "\n");
}
fclose($f);
This file is about 97M.
[huqiu#localhost home]$ ll -h tmp.dat
-rw-rw-r-- 1 huqiu huqiu 97M Nov 27 06:08 tmp.dat
Read the last 50000 lines:
<?php
$file_name = 'tmp.dat';
$remain_count = 50000;
$begin_time = microtime(true);
$temp_file_name = $file_name . '.tmp';

// first pass: count the lines
$fp = fopen($file_name, 'r');
$total_count = 0;
while (fgets($fp)) {
    $total_count++;
}
echo 'total count: ' . $total_count . "\n";

// second pass: copy everything after $start to the temp file
$trimmed = false;
if ($total_count > $remain_count) {
    $start = $total_count - $remain_count;
    echo 'start: ' . $start . "\n";
    $temp_fp = fopen($temp_file_name, 'w');
    $index = 0;
    rewind($fp);
    while ($line = fgets($fp)) {
        $index++;
        if ($index > $start) {
            fwrite($temp_fp, $line);
        }
    }
    fclose($temp_fp);
    $trimmed = true;
}
fclose($fp);
echo 'time: ' . (microtime(true) - $begin_time) . "\n";
if ($trimmed) {
    // only swap files when a trimmed copy was actually written
    rename($temp_file_name, $file_name);
}
total count: 1000000
start: 950000
time: 0.63908791542053
the result:
[huqiu#localhost home]$ ll -h tmp.dat
-rw-rw-r-- 1 huqiu huqiu 4.9M Nov 27 06:23 tmp.dat
Why not fseek() the pointer to a position past the part you want to eliminate? You might also have better luck using fpassthru() to save some memory.
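That last comment's idea can be sketched as follows: seek past the bytes to drop, stream the remainder into a temp file, then swap the files. It trims by byte offset rather than line count, which is often good enough for log rotation (the function name is illustrative):

```php
<?php
// Drop the first $bytesToDrop bytes of a file without loading it all:
// seek past them, stream the rest to a temp file, then rename.
function trimFileStart(string $fileName, int $bytesToDrop): void
{
    $in = fopen($fileName, 'rb');
    if ($in === false) {
        throw new RuntimeException("Cannot open $fileName");
    }
    fseek($in, $bytesToDrop);             // skip the unwanted prefix
    $tmpName = $fileName . '.tmp';
    $out = fopen($tmpName, 'wb');
    stream_copy_to_stream($in, $out);     // chunked copy, flat memory use
    fclose($in);
    fclose($out);
    rename($tmpName, $fileName);          // replace the original in one step
}
```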
