Can we access some of the system's heap memory with PHP, as in C/C++? I keep getting Fatal error: Allowed memory size of 134217728 bytes exhausted while trying to do a big file operation in PHP.
I know we can tweak this limit in the Apache2 config. But for processing large files and data of unknown size, can we have some kind of access to the heap to process and save the file? And if so, is there a mechanism to clear the memory after use?
Sample code
<?php
$filename = "a.csv";
$handle = fopen($filename, "r");
// fread() of filesize() bytes pulls the entire file into memory at once
$contents = fread($handle, filesize($filename));
echo $contents;
fclose($handle);
?>
Here, a.csv is an 80 MB file. Can there be a heap operation using some sort of pointer?
Have you tried reading the file in chunks, e.g.:
<?php
$chunk = 256 * 256; // chunk size in bytes; set to your liking
$filename = "a.csv";
$handle = fopen($filename, 'rb');
while (!feof($handle)) {
    $data = fread($handle, $chunk);
    echo $data;
    ob_flush();
    flush();
}
fclose($handle);
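If all you need is to send the bytes to the client, fpassthru() performs the same chunked copy internally without ever holding the whole file in memory; a minimal sketch:
<?php
$handle = fopen("a.csv", 'rb');
fpassthru($handle); // streams the rest of the file straight to output
fclose($handle);
?>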
I'm currently trying to echo the raw bytes of a DLL file in hex format. The unpack function overflows the memory limit (not settable with my hosting service at the moment). Is there any method to read it in parts into three or more variables, or some other way to read the bytes and echo them?
The file size is around 1.98 MB (1,990,656 bytes) (yes, I know PHP's buffer is much bigger).
The following error occurred:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 67108872 bytes)
Thanks for any help.
ini_set('memory_limit', '-1'); // attempt to lift the limit (the host ignores it)
$fileName = "driver.dll";
$fsize = filesize($fileName);
// unpack() builds one array element per byte, which is what exhausts memory
$hex = unpack("C*", file_get_contents($fileName));
$hexa_string = "";
foreach ($hex as $v) {
    $hexa_string .= sprintf("%02X", $v);
}
echo $hexa_string;
You'd have to use the C-style wrappers for file manipulation, e.g. fseek():
$size = 1024 * 1000; // read roughly the last 1 MB
$handle = fopen($file, 'rb');
fseek($handle, -$size, SEEK_END); // seek relative to the end of the file
$limitedContent = fread($handle, $size);
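For the hex output itself you don't need unpack() at all: reading fixed-size chunks and converting each one with bin2hex() keeps memory bounded regardless of file size. A minimal sketch (the 8 KB chunk size is an arbitrary choice):
<?php
$handle = fopen("driver.dll", 'rb');
while (!feof($handle)) {
    $chunk = fread($handle, 8192);    // read 8 KB at a time
    echo strtoupper(bin2hex($chunk)); // hex-encode just this chunk
}
fclose($handle);
?>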
If I have three GET parameters:
$filename = $_GET['filename'];
$start = $_GET['start'];
$size = $_GET['size'];
And I am reading a chunk of the file like so:
$handle = fopen($basepath . $filename, "rb");
fseek($handle, $start);
$contents = fread($handle, $size);
echo md5($contents);
How may I read large portions of a file (anywhere from 1 MB to 1 GB) and create a hash or checksum of its contents without needing to allocate enough memory for the entire read?
At the moment, if I try to hash too large a part of the file, I get a memory error, since PHP cannot allocate enough memory (roughly 400 MB).
Is there a hashing function to which I can feed parts of the file at a time rather than the entire contents at once (for example, starting at $start, reading 100 KB blocks and feeding them to the function until $size is met)? And how would I read the file in chunks so that I start at $start and read $size bytes?
If there is no such hashing or checksum function that supports being fed chunks of the data at a time, would file_get_contents() fix the issue of allocating memory for large reads? I am not entirely sure how that function works.
Thanks.
You can use PHP's incremental hashing API, hash_init() / hash_update() / hash_final(): http://php.net/manual/en/function.hash-update.php
<?php
define('CHUNK', 65536);

//$file = 'alargefile.img';
//$start = 256 * 1024 * 1024;
//$size = 512 * 1024 * 1024;

$fp = fopen($file, "rb");
fseek($fp, $start);
$ctx = hash_init('md5');
while ($size > 0 && !feof($fp)) {
    $buffer = fread($fp, min($size, CHUNK));
    hash_update($ctx, $buffer);
    $size -= strlen($buffer); // decrement by the bytes actually read
}
$hash = hash_final($ctx);
fclose($fp);
print $hash;
?>
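As a follow-up, hash_update_stream() can pull the bytes straight from the file handle, which removes the manual fread() loop; a sketch of the same ranged hash:
<?php
$fp = fopen($file, 'rb');
fseek($fp, $start);
$ctx = hash_init('md5');
hash_update_stream($ctx, $fp, $size); // internally reads up to $size bytes from the stream
echo hash_final($ctx);
fclose($fp);
?>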
I have a PHP loop whereby I read and process the contents of several files.
<?php
foreach ($files as $file) {
    $f = fopen($file, 'r');
    $content = fread($f, filesize($file));
    import($content);
    fclose($f);
}
?>
However, after several iterations of the loop, I am still getting a memory exhausted error on the fread() line. My understanding was that fclose() would free up the memory of the resource, but is this not the case?
I have read a number of other questions on the matter, and they all referenced using fclose(), which I was already doing. I'm also dumping memory_get_usage()/memory_get_peak_usage(), and the figures just keep going up until the script fails.
Am I misunderstanding how fclose() handles freeing up memory?
<?php
foreach ($files as $file) {
    $f = fopen($file, 'r');
    $content = fread($f, filesize($file));
    import($content);
    fclose($f);      // closes the file handle, but not the string in $content
    unset($content); // frees the memory holding the file contents
}
?>
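To verify the fix, you can log memory usage per iteration, as the question already does with memory_get_usage(); with the unset() in place, the figures should plateau around the size of the largest single file instead of climbing:
<?php
foreach ($files as $file) {
    $f = fopen($file, 'r');
    $content = fread($f, filesize($file));
    import($content);
    fclose($f);
    unset($content);
    printf("%s: %d bytes in use, %d peak\n", $file, memory_get_usage(), memory_get_peak_usage());
}
?>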
I have a server that grabs mp3 audio buffers from a source and writes them to a file via PHP. It truncates the beginning of the file so the file size never exceeds 2 MB. At the same time, a client is streaming the mp3 by seeking to the end and reading if there is any new data. The problem is that when the file gets truncated, the position the client was reading at changes.
This is the client side that streams the audio:
$handle = fopen('cool.mp3', "r");
$err = fseek($handle, 0, SEEK_END);
while (file_exists($file_lock)) { // cool.mp3.lock means the stream is still going
    $data = fread($handle, 1024);
    echo $data;
    ob_flush();
    flush();
}
I use this on the server to write data as I get it:
$data = "audio frames....";
clearstatcache();
$file = 'cool.mp3';
if(filesize($file) > 1024*200){ //2 MB
ftruncatestart($file, 1024*25); //Trim Down By Deleting Front
}
file_put_contents($file, $data, FILE_APPEND);
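On the client side, one way to cope is to track the absolute read offset and, whenever filesize() drops below it, assume the server trimmed the front and shift the offset back by the trim amount. This is only a sketch of the idea; the 25 KB figure is taken from the server code above and would have to be kept in sync with it:
<?php
$handle = fopen('cool.mp3', 'rb');
fseek($handle, 0, SEEK_END);
$offset = ftell($handle);
while (file_exists($file_lock)) {
    clearstatcache(); // filesize() results are cached otherwise
    if (filesize('cool.mp3') < $offset) {
        // the front was trimmed, so the same audio now sits 25 KB earlier
        $offset = max(0, $offset - 1024 * 25);
        fseek($handle, $offset);
    }
    $data = fread($handle, 1024);
    $offset += strlen($data);
    echo $data;
    ob_flush();
    flush();
}
fclose($handle);
?>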
I'm trying to unzip a 14 MB archive with PHP, with code like this:
$zip = zip_open("c:\kosmas.zip");
while ($zip_entry = zip_read($zip)) {
    $fp = fopen("c:/unzip/import.xml", "w");
    if (zip_entry_open($zip, $zip_entry, "r")) {
        // reads the whole entry into memory at once
        $buf = zip_entry_read($zip_entry, zip_entry_filesize($zip_entry));
        fwrite($fp, $buf);
        zip_entry_close($zip_entry);
        fclose($fp);
        break;
    }
}
zip_close($zip);
It fails on my localhost with a 128 MB memory limit with the classic "Allowed memory size of blablabla bytes exhausted". On the server I've got a 16 MB limit; is there a better way to do this so that I can fit into that limit? I don't see why this has to allocate more than 128 MB of memory. Thanks in advance.
Solution:
I started reading the files in 10 KB chunks; problem solved, with peak memory usage around 1.5 MB.
$filename = 'c:\kosmas.zip';
$archive = zip_open($filename);
while ($entry = zip_read($archive)) {
    $size = zip_entry_filesize($entry);
    $name = zip_entry_name($entry);
    $unzipped = fopen('c:/unzip/' . $name, 'wb');
    while ($size > 0) {
        $chunkSize = ($size > 10240) ? 10240 : $size; // at most 10 KB per read
        $size -= $chunkSize;
        $chunk = zip_entry_read($entry, $chunkSize);
        if ($chunk !== false) fwrite($unzipped, $chunk);
    }
    fclose($unzipped);
}
zip_close($archive);
Why do you read the whole file at once?
$buf = zip_entry_read($zip_entry, zip_entry_filesize($zip_entry));
fwrite($fp,"$buf");
Try reading it in small chunks and writing them to the file as you go.
Just because a zip file is smaller than PHP's memory limit, and perhaps the unzipped contents are as well, doesn't account for PHP's general overhead and, more importantly, the memory needed to actually unzip the file. While I'm no expert in compression, I'd expect that may well be a lot more than the final unzipped size.
For a file of that size, it is perhaps better to use shell_exec() instead:
shell_exec('unzip archive.zip -d /destination_path');
PHP must not be running in safe mode, and you must have access to both shell_exec() and unzip for this method to work.
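If the archive path comes from user input, escape it before it reaches the shell; a variant of the same call with placeholder paths:
$archive = '/path/to/archive.zip';
$dest = '/destination_path';
// escapeshellarg() quotes each argument so shell metacharacters are neutralized
shell_exec('unzip ' . escapeshellarg($archive) . ' -d ' . escapeshellarg($dest));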
Update:
Given that command-line tools are not available, all I can think of is to create a script that sends the file to a remote server where command-line tools are available, extracts it there, and downloads the contents.
function my_unzip($full_pathname) {
    $unzipped_content = '';
    // gzopen() opens gzip (.gz) streams; gzread() then returns decompressed bytes
    $zd = gzopen($full_pathname, "r");
    while ($zip_file = gzread($zd, 10000000)) { // 10 MB per read
        $unzipped_content .= $zip_file;
    }
    gzclose($zd);
    return $unzipped_content;
}
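Note that gzopen() targets gzip streams rather than zip archives, and the function above still accumulates everything into a single string. For a real .zip without shell access, the ZipArchive extension can stream each entry out in chunks; a minimal sketch, assuming the extension is enabled:
<?php
$zip = new ZipArchive();
if ($zip->open('c:\kosmas.zip') === true) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $name = $zip->getNameIndex($i);
        $stream = $zip->getStream($name); // read-only stream over the entry's data
        $out = fopen('c:/unzip/' . $name, 'wb');
        while (!feof($stream)) {
            fwrite($out, fread($stream, 10240)); // copy 10 KB at a time
        }
        fclose($out);
        fclose($stream);
    }
    $zip->close();
}
?>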