I'm trying to unzip a big dump file and I run into this common problem:
local.ERROR: Symfony\Component\Debug\Exception\FatalErrorException:
Allowed memory size of 134217728 bytes exhausted (tried to allocate
123732000 bytes) in /Users/ ...
I know that I can increase the memory limit and it should work, but I think the problem is in my code and I'm doing something wrong:
public function unzip() {
    // unzip file
    // set input and output files
    $out = 'storage/app/dump/auct_lots_full.sql';
    $in = 'storage/app/dump/auct_lots_full.sql.bz2';
    // decompress file using BZIP2
    if (file_exists($in)) {
        $data = '';
        $bz = bzopen($in, 'r') or die('ERROR: Cannot open input file!');
        while (!feof($bz)) {
            $data .= bzread($bz, 4096) or die('ERROR: Cannot read from input file');
        }
        bzclose($bz);
        file_put_contents($out, $data) or die('ERROR: Cannot write to output file!');
        echo 'Decompression complete.';
    }
}
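For comparison, here is a minimal sketch of the same loop that writes each decompressed chunk straight to the output file instead of accumulating everything in $data, so peak memory stays roughly at the chunk size (same paths as above, chunk size arbitrary):

$in  = 'storage/app/dump/auct_lots_full.sql.bz2';
$out = 'storage/app/dump/auct_lots_full.sql';

$bz = bzopen($in, 'r') or die('ERROR: Cannot open input file!');
$fp = fopen($out, 'w') or die('ERROR: Cannot open output file!');

// stream chunk by chunk so the whole dump never sits in memory at once
while (!feof($bz)) {
    $chunk = bzread($bz, 4096);
    if ($chunk === false) {
        die('ERROR: Cannot read from input file');
    }
    fwrite($fp, $chunk);
}

fclose($fp);
bzclose($bz);
echo 'Decompression complete.';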
Related
Can we access some of the system's heap memory with PHP, like in C/C++? I keep getting Fatal error: Allowed memory size of 134217728 bytes exhausted while trying to do some big file operations in PHP.
I know we can tweak this limit in the Apache2 config. But for processing large files of unknown size, can we have some kind of access to the heap to process and save the file? And if so, is there a mechanism to clear the memory after use?
Sample code
<?php
$filename = "a.csv";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
echo $contents;
fclose($handle);
?>
Here, a.csv is an 80 MB file. Can there be a heap operation using some sort of pointer?
Have you tried reading the file in chunks, e.g.:
<?php
$chunk = 256 * 256; // set chunk size to your liking
$filename = "a.csv";
$handle = fopen($filename, 'rb');
while (!feof($handle)) {
    $data = fread($handle, $chunk);
    echo $data;
    ob_flush();
    flush();
}
fclose($handle);
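Since a.csv is a CSV file, another option (a sketch, assuming you want to process rows rather than just echo raw bytes) is fgetcsv, which keeps only one parsed line in memory at a time; memory_get_peak_usage() lets you verify the effect:

$handle = fopen("a.csv", "r");
while (($row = fgetcsv($handle)) !== false) {
    // $row is one parsed line; handle it and let it go out of scope
    echo implode(',', $row), "\n";
}
fclose($handle);
echo "peak memory: " . memory_get_peak_usage(true) . " bytes\n";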
I'm currently trying to echo the raw bytes of a DLL file in hex format. The unpack function overflows the memory limit (which isn't settable on my hosting service at the moment). Is there any method to read it in parts into 3 or more variables, or some other way to output the bytes and echo them?
The file size is around 1.98 MB (1,990,656 bytes) (yes, I know the buffer is much bigger in PHP).
The following error occurred:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 67108872 bytes)
Thanks for any help.
ini_set('memory_limit', '-1');
$fileName = "driver.dll";
$fsize = filesize($fileName);
$hex = unpack("C*", file_get_contents($fileName));
$hexa_string = "";
foreach ($hex as $v) {
    $hexa_string .= sprintf("%02X", $v);
}
echo $hexa_string;
You'd have to use the C wrappers for file manipulation, e.g. fseek:
$size = 1024 * 1000;
$handle = fopen($file, 'r');
fseek($handle, -$size, SEEK_END); // position $size bytes before the end of the file
$limitedContent = fread($handle, $size);
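For dumping the whole file as hex without unpack(), a chunked loop is enough (just a sketch). unpack("C*", ...) on a 2 MB file builds an array with roughly two million entries, and the per-element overhead of that array is what exhausts the 64 MB limit, not the 2 MB of raw data. Reading fixed-size chunks and echoing them immediately avoids both the array and the large $hexa_string; strtoupper(bin2hex(...)) matches the %02X formatting above:

$fileName = "driver.dll";
$handle = fopen($fileName, 'rb');
while (!feof($handle)) {
    $chunk = fread($handle, 8192); // 8 KB in, 16 KB of hex characters out
    if ($chunk === false || $chunk === '') {
        break;
    }
    echo strtoupper(bin2hex($chunk));
}
fclose($handle);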
I am facing a "memory size exhausted" problem in Laravel 5.5.
In the previous version, 5.4, my code was working, but it isn't now.
For this I increased the memory limit in php.ini to memory_limit = 1024M, but it's not working.
Basically, I am converting a Base64-encoded file and then storing it into the local storage of my PC or server.
Controller Code
public static function convertBase64ToFile($file, $dir)
{
    // $file looks like "data:<mime-type>;base64,<payload>"
    $pos = strpos($file, ';');
    $type = explode(':', substr($file, 0, $pos))[1];
    $format = explode('/', $type);
    $exploded = explode(',', $file);
    $decoded = base64_decode($exploded[1]);
    if (str_contains($exploded[0], $format[1])) {
        $extension = $format[1];
    }
    $filename = str_random() . '.' . $extension;
    $path = public_path() . $dir . $filename;
    file_put_contents($path, $decoded);
    return $filename;
}
message:
"Allowed memory size of 134217728 bytes exhausted (tried to allocate 65015808 bytes)", "exception": "Symfony\Component\Debug\Exception\FatalErrorException",
In WAMP you have 2 php.ini files. One is in \wamp\bin\php\php.x.y.z\, but that one is only for the CLI; the second is in \wamp\bin\apache\apache2.x.y\bin\. You should check the second one.
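If editing the right php.ini still isn't enough, one way to lower the peak (a sketch, not the original controller; $file and $path are meant to be the same values as above) is to decode through a stream filter, so the decoded bytes go straight to disk instead of sitting in a second large string next to the base64 payload:

// $file is the "data:<mime>;base64,<payload>" string from the request
$payload = substr($file, strpos($file, ',') + 1);

$out = fopen($path, 'wb');
// decode on the fly while writing; chunk size kept a multiple of 4 for base64
stream_filter_append($out, 'convert.base64-decode', STREAM_FILTER_WRITE);

for ($offset = 0, $len = strlen($payload); $offset < $len; $offset += 8192) {
    fwrite($out, substr($payload, $offset, 8192));
}
fclose($out);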
I have to zip search results containing up to 10000 files, with a total size of well over 1 GB.
I create a zip archive and, in a for loop, read every file with fread and add the resulting file to the archive.
I never finish adding files because of this error:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1257723 bytes)
but I don't think adding 1 GB or more to the memory_limit value in php.ini is a solution, because memory resources are limited.
Because the zip file stays in memory until it is closed (or so I read in another question), I changed the code to create a series of 50 MB zip files to avoid exhausting memory. But even though the PHP script creates another zip file, it still stops with the same PHP fatal error on the same file (the 174th).
Why?
Am I doing something wrong?
Any help will be appreciated.
Here is a code snippet of the file creation
$zip = new ZipArchive();
$nomeZipFile = "../tmp/" . $title . ".zip";
for ($x = 0; $x < count($risultato); $x++) {
    $numFiles = 0;
    $dir = '../tmp';
    if (file_exists($nomeZipFile)) {
        if (filesize($nomeZipFile) > 52428800) { // 50 MB: roll over to a new archive
            // count the files already in the directory to build the next name
            if ($handle = opendir($dir)) {
                while (($file = readdir($handle)) !== false) {
                    if (!in_array($file, array('.', '..')) && !is_dir($dir . $file))
                        $numFiles++;
                }
            }
            $nomeZipFile = '../tmp/' . $title . $numFiles . '.zip';
            $res = $zip->open($nomeZipFile, ZipArchive::CREATE);
        } else {
            $res = $zip->open($nomeZipFile, ZipArchive::CREATE);
        }
    } else {
        $res = $zip->open($nomeZipFile, ZipArchive::CREATE);
    }
    ...
    // adding file
    $fileDownload = "";
    $fDownload = fopen($kt_response->message, "r"); // the file is downloaded through a webservice
    while (!feof($fDownload)) {
        $fileDownload .= fread($fDownload, 1024);
        flush();
    }
    $zip->addFromString($filename, $fileDownload); // entry name, presumably set in the elided code
    ....
    $zip->close();
}
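Part of the problem is that addFromString() keeps each file's contents in memory until close() runs, so everything added to the current archive stays resident. A sketch of an alternative, assuming the webservice response can be streamed ($filename here is the same entry name used with addFromString above): copy each download to a temporary file and use addFile(), which only reads from disk at close() time:

// copy the download to a temp file instead of into a string
$tmp = tempnam(sys_get_temp_dir(), 'dl');
$src = fopen($kt_response->message, 'r');
$dst = fopen($tmp, 'wb');
stream_copy_to_stream($src, $dst);
fclose($src);
fclose($dst);

// the archive reads this file from disk when close() is called, not now
$zip->addFile($tmp, $filename);

The temp files have to survive until close() has run for the archive they belong to; after that they can be unlinked. Combined with the existing 50 MB rollover, this keeps the peak bounded by one archive's file list rather than its contents.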
I'm trying to unzip a 14MB archive with PHP with code like this:
$zip = zip_open("c:\kosmas.zip");
while ($zip_entry = zip_read($zip)) {
    $fp = fopen("c:/unzip/import.xml", "w");
    if (zip_entry_open($zip, $zip_entry, "r")) {
        $buf = zip_entry_read($zip_entry, zip_entry_filesize($zip_entry));
        fwrite($fp, "$buf");
        zip_entry_close($zip_entry);
        fclose($fp);
        break;
    }
    zip_close($zip);
}
It fails on my localhost with a 128MB memory limit with the classic "Allowed memory size of blablabla bytes exhausted". On the server I've got a 16MB limit; is there a better way to do this so that I can fit into that limit? I don't see why this has to allocate more than 128MB of memory. Thanks in advance.
Solution:
I started reading the files in 10 KB chunks; problem solved, with peak memory usage around 1.5 MB.
$filename = 'c:\kosmas.zip';
$archive = zip_open($filename);
while ($entry = zip_read($archive)) {
    $size = zip_entry_filesize($entry);
    $name = zip_entry_name($entry);
    $unzipped = fopen('c:/unzip/' . $name, 'wb');
    while ($size > 0) {
        $chunkSize = ($size > 10240) ? 10240 : $size;
        $size -= $chunkSize;
        $chunk = zip_entry_read($entry, $chunkSize);
        if ($chunk !== false) fwrite($unzipped, $chunk);
    }
    fclose($unzipped);
}
Why do you read the whole file at once?
$buf = zip_entry_read($zip_entry, zip_entry_filesize($zip_entry));
fwrite($fp,"$buf");
Try reading small chunks of it and writing them to a file.
Just because a zip is smaller than PHP's memory limit, and perhaps the unzipped data is as well, doesn't account for PHP's general overhead and, more importantly, the memory needed to actually unzip the file, which (while I'm no expert in compression) I'd expect may well be a lot more than the final unzipped size.
For a file of that size, perhaps it is better if you use shell_exec() instead:
shell_exec('unzip archive.zip -d /destination_path');
PHP must not be running in safe mode and you must have access to both shell_exec and unzip for this method to work.
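If the shell route is available, quoting the paths is worth the extra line (a sketch with placeholder paths):

$archive = '/path/to/archive.zip';      // placeholder
$dest    = '/path/to/destination_path'; // placeholder
shell_exec('unzip ' . escapeshellarg($archive) . ' -d ' . escapeshellarg($dest));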
Update:
Given that command-line tools are not available, all I can think of is to create a script and send the file to a remote server where command-line tools are available, extract the file there, and download the contents.
function my_unzip($full_pathname){
    $unzipped_content = '';
    $zd = gzopen($full_pathname, "r");
    while ($zip_file = gzread($zd, 10000000)) {
        $unzipped_content .= $zip_file;
    }
    gzclose($zd);
    return $unzipped_content;
}
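Note that this still accumulates the whole decompressed content in $unzipped_content, so a sufficiently large file hits the same limit. If the goal is simply to get the data onto disk, a streaming variant of the same gzopen approach (a sketch; like gzopen itself, it applies to gzip-style streams) could look like:

function my_gunzip_to_file($full_pathname, $destination) {
    $zd  = gzopen($full_pathname, 'rb');
    $out = fopen($destination, 'wb');
    while (!gzeof($zd)) {
        // 8 KB per iteration instead of one 10 MB string per read
        fwrite($out, gzread($zd, 8192));
    }
    fclose($out);
    gzclose($zd);
}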