PHP unpack overflows variable memory limit - php

I'm currently trying to echo the raw bytes of a DLL file in hex format, but the unpack function exceeds the variable memory limit (not settable through my hosting service at the moment). Is there any method to read it in parts into 3 or more variables, or some other way to output the bytes and echo them?
The file size is around 1.98 MB (1,990,656 bytes) (yes, I know the buffer is much bigger in PHP).
The following error occurred:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 67108872 bytes)
Thanks for any help.
ini_set('memory_limit', '-1');
$fileName = "driver.dll";
$fsize = filesize($fileName);
$hex = unpack("C*", file_get_contents($fileName)); // this allocation exhausts the memory limit
$hexa_string = "";
foreach ($hex as $v)
{
    $hexa_string .= sprintf("%02X", $v);
}
echo $hexa_string;

You'd have to use the C-style file functions for this (fseek and friends), so you can read the file piece by piece instead of all at once:
$size = 1024 * 1000;
$handle = fopen($file, 'rb');
fseek($handle, -$size, SEEK_END); // position $size bytes before the end of the file
$limitedContent = fread($handle, $size);
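Building on that, the whole file can be hex-dumped without ever holding it (or the unpacked byte array) in memory by reading fixed-size chunks and converting each one with bin2hex(). A minimal sketch, assuming the driver.dll from the question:
<?php
// Sketch: stream the DLL and echo it as uppercase hex, one chunk at a time,
// so memory use stays around a single chunk no matter how large the file is.
$fileName = "driver.dll";
$handle = fopen($fileName, 'rb');
while (!feof($handle)) {
    $chunk = fread($handle, 8192);
    echo strtoupper(bin2hex($chunk)); // same output as the sprintf("%02X", ...) loop
}
fclose($handle);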

Related

PHP Memory access

Can we access some of the system's heap memory with PHP, like in C/C++? Actually, I keep getting Fatal error: Allowed memory size of 134217728 bytes exhausted while trying to do some big file operations in PHP.
I know we can tweak this limit in the Apache2/PHP config. But for processing large files and data of unknown size, can we have some kind of access to the heap to process and save the file? And if yes, is there a mechanism to clear the memory after use?
Sample code
<?php
$filename = "a.csv";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
echo $contents;
fclose($handle);
?>
Here, a.csv is an 80 MB file. Can there be a heap operation using some sort of pointer?
Have you tried reading the file in chunks, e.g.:
<?php
$chunk = 256 * 256; // set chunk size to your liking
$filename = "a.csv";
$handle = fopen($filename, 'rb');
while (!feof($handle)) {
    $data = fread($handle, $chunk);
    echo $data;
    ob_flush();
    flush();
}
fclose($handle);
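The memory footprint of this loop stays around $chunk (64 KB here), and the ob_flush()/flush() pair pushes each chunk to the client before the next read, so nothing accumulates on the PHP side regardless of the file size.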

Allowed memory size of 134217728 bytes exhausted (tried to allocate 62926848 bytes)

I am facing a memory size exhausted problem in Laravel 5.5.
In the previous version (5.4) my code was working, but it isn't now.
I increased the memory limit in php.ini to memory_limit = 1024M, but that did not help.
Basically, I am converting a Base64-encoded file and then storing it in the local storage of my PC or server.
Controller Code
public static function convertBase64ToFile($file, $dir)
{
    $pos = strpos($file, ';');
    $type = explode(':', substr($file, 0, $pos))[1];
    $format = explode('/', $type);
    $exploded = explode(',', $file);
    $decoded = base64_decode($exploded[1]);
    if (str_contains($exploded[0], $format[1])) {
        $extension = $format[1];
    }
    $filename = str_random() . '.' . $extension;
    $path = public_path() . $dir . $filename;
    file_put_contents($path, $decoded);
    return $filename;
}
message:
"Allowed memory size of 134217728 bytes exhausted (tried to allocate 65015808 bytes)", "exception": "Symfony\Component\Debug\Exception\FatalErrorException",
In WAMP you have two php.ini files. One is in \wamp\bin\php\php.x.y.z, but that one is only used by the CLI; the second is in \wamp\bin\apache\apache2.x.y\bin\. You should edit the second one.
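Independently of the ini setting, the decoded copy of the data never has to exist as a PHP string: the base64 payload can be pushed through a stream filter while it is written to disk. A rough sketch of that variant, reusing $exploded and $path from the controller above:
// Sketch: decode while writing, so only the base64 string is held in memory,
// not a second, decoded copy of it.
$out = fopen($path, 'wb');
stream_filter_append($out, 'convert.base64-decode', STREAM_FILTER_WRITE);
fwrite($out, $exploded[1]); // the raw base64 payload
fclose($out);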

Allowed memory size Laravel (unzipping file with Laravel)

I'm trying to unzip a big dump file and I run into this common problem:
local.ERROR: Symfony\Component\Debug\Exception\FatalErrorException:
Allowed memory size of 134217728 bytes exhausted (tried to allocate
123732000 bytes) in /Users/ ...
I know that I can increase the memory limit and it should work, but I think the problem is in my code and I'm doing something wrong:
public function unzip() {
    // set input and output files
    $out = 'storage/app/dump/auct_lots_full.sql';
    $in  = 'storage/app/dump/auct_lots_full.sql.bz2';
    // decompress file using BZIP2
    if (file_exists($in)) {
        $data = '';
        $bz = bzopen($in, 'r') or die('ERROR: Cannot open input file!');
        while (!feof($bz)) {
            $data .= bzread($bz, 4096) or die('ERROR: Cannot read from input file');
        }
        bzclose($bz);
        file_put_contents($out, $data) or die('ERROR: Cannot write to output file!');
        echo 'Decompression complete.';
    }
}
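One way to keep memory use flat is to write each decompressed block straight to the output file instead of concatenating everything into $data first. A sketch, using the same paths as above:
// Sketch: stream the bz2 decompression to disk, 4 KB at a time.
$in  = 'storage/app/dump/auct_lots_full.sql.bz2';
$out = 'storage/app/dump/auct_lots_full.sql';
$bz = bzopen($in, 'r') or die('ERROR: Cannot open input file!');
$fp = fopen($out, 'w') or die('ERROR: Cannot open output file!');
while (!feof($bz)) {
    fwrite($fp, bzread($bz, 4096));
}
fclose($fp);
bzclose($bz);
echo 'Decompression complete.';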

Reading large portions of files for hashing/checksums

If I have three GET parameters:
$filename = $_GET['filename'];
$start = $_GET['start'];
$size = $_GET['size'];
And I am reading a chunk of the file like so:
$handle = fopen($basepath . $filename, "rb");
fseek($handle, $start);
$contents = fread($handle, $size);
echo md5($contents);
How can I read large portions of a file (anywhere from 1 MB to 1 GB) and create a hash or checksum of its contents without needing to allocate enough memory for the entire read?
At the moment, if I try to hash too large a part of the file, I get a memory error since PHP cannot allocate enough memory (roughly 400 MB).
Is there a hashing function to which I can feed parts of the file at a time rather than the entire contents at once (for example, starting at $start, read 100 KB blocks and feed them to the function until $size is met)? And how would I read the file in chunks so that I start at $start and read $size bytes?
If there is no such hashing or checksum function that supports being fed chunks of the data at a time, would file_get_contents() fix the issue of allocating memory for large reads? I am not entirely sure how that function works.
Thanks.
Use PHP's incremental hashing API, hash_init() / hash_update() / hash_final(): http://php.net/manual/en/function.hash-update.php
<?php
define('CHUNK', 65536);
//$file = 'alargefile.img';
//$start = 256 * 1024 * 1024;
//$size = 512 * 1024 * 1024;
$fp = fopen($file, "r");
fseek($fp, $start);
$ctx = hash_init('md5');
while ($size > 0) {
    $buffer = fread($fp, min($size, CHUNK));
    hash_update($ctx, $buffer);
    $size -= CHUNK;
}
$hash = hash_final($ctx);
fclose($fp);
print $hash;
?>
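A shorter variant is hash_update_stream(), which pulls the bytes from the handle itself, so the fread loop disappears. Same assumptions about $file, $start and $size as above:
<?php
$fp = fopen($file, 'rb');
fseek($fp, $start);
$ctx = hash_init('md5');
hash_update_stream($ctx, $fp, $size); // reads and hashes up to $size bytes internally
echo hash_final($ctx);
fclose($fp);
?>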

How to occasionally shorten files?

This is the code I use:
if (mt_rand(0, 20000) == 0)
{
    $lines = file($fileName);
    if (count($lines) > 50000)
    {
        $lines = array_slice($lines, -50000, 50000, true);
    }
    $result = implode("\n", $lines);
    file_put_contents($fileName, $result . "\n", FILE_APPEND);
}
I often got this error:
[25-Nov-2013 23:20:40 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 33 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 02:41:54 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 27 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 09:56:49 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 72 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 12:44:32 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 2097152 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
[26-Nov-2013 13:53:31 UTC] PHP Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 2097152 bytes) in /home/datetrng/public_html/checkblocked.php on line 40
I guess reading the whole file may not be a good idea if we just want to shorten it by erasing the beginning.
Does anyone know an alternative?
fopen, fwrite and fseek will probably come in handy for you.
I think you only want the last 50000 lines in your file.
if (mt_rand(0, 20000) == 0)
{
    $tmp_file = $fileName . '.tmp';
    $cmd = "tail -n 50000 $fileName > $tmp_file";
    exec($cmd);
    rename($tmp_file, $fileName);
}
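One caveat with the shell approach: if $fileName can contain spaces or other shell metacharacters, wrap it (and the temp file name) in escapeshellarg() before building the command.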
Update for pure PHP:
I made a test file of about 1,000,000 lines:
<?php
$file_name = 'tmp.dat';
$f = fopen($file_name, 'w');
for ($i = 0; $i < 1000000; $i++)
{
    fwrite($f, str_pad($i, 100, 'x') . "\n");
}
fclose($f);
This file is about 97M.
[huqiu#localhost home]$ ll -h tmp.dat
-rw-rw-r-- 1 huqiu huqiu 97M Nov 27 06:08 tmp.dat
read the last 50000 lines
<?php
$file_name = 'tmp.dat';
$remain_count = 50000;
$begin_time = microtime(true);
$temp_file_name = $file_name . '.tmp';
$fp = fopen($file_name, 'r');

// first pass: count the lines
$total_count = 0;
while (fgets($fp))
{
    $total_count++;
}
echo 'total count: ' . $total_count . "\n";

// second pass: copy only the last $remain_count lines into a temp file
if ($total_count > $remain_count)
{
    $start = $total_count - $remain_count;
    echo 'start: ' . $start . "\n";
    $temp_fp = fopen($temp_file_name, 'w');
    $index = 0;
    rewind($fp);
    while ($line = fgets($fp))
    {
        $index++;
        if ($index > $start)
        {
            fwrite($temp_fp, $line);
        }
    }
    fclose($temp_fp);
}
fclose($fp);
echo 'time: ' . (microtime(true) - $begin_time), "\n";
if ($total_count > $remain_count)
{
    rename($temp_file_name, $file_name); // only swap when the temp file was written
}
The output:
total count: 1000000
start: 950000
time: 0.63908791542053
the result:
[huqiu#localhost home]$ ll -h tmp.dat
-rw-rw-r-- 1 huqiu huqiu 4.9M Nov 27 06:23 tmp.dat
Why not fseek the pointer to a position past the point you want to eliminate? You might also have better luck using fpassthru to save some memory.
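A rough sketch of that fseek idea, using stream_copy_to_stream() rather than fpassthru() since the remainder goes into a file instead of the output (the truncateHead() helper and the byte offset $keepFrom are hypothetical, not from the question):
// Sketch: drop everything before byte offset $keepFrom without loading the file.
function truncateHead($fileName, $keepFrom)
{
    $src = fopen($fileName, 'rb');
    fseek($src, $keepFrom);                // skip the prefix to be erased
    $tmp = fopen($fileName . '.tmp', 'wb');
    stream_copy_to_stream($src, $tmp);     // copies the rest in internal chunks
    fclose($tmp);
    fclose($src);
    rename($fileName . '.tmp', $fileName);
}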
