PHP - Memory Limit: Trying to read a large file (72 MB)

I have a script that processes large text files.
I am limited, however, by the size of the files.
I've done some searching on this forum, and I've come to the conclusion that I must process the file line by line. However, this brings up quite a few issues for me, as I need some detailed info from the file before I can start processing it.
I've tried adding each line to a variable as below:
$content = "";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
while (!feof($handle)) {
$buffer = fgets($handle, 256);
// Process buffer here..
$content .= $buffer;
}
fclose($handle);
}
And just as expected, it did not work.
Could anyone help me out?
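One way around needing the details up front is a two-pass approach: a first line-by-line pass that only collects the info, then a second pass that does the real processing. A minimal sketch, where $lineCount is a hypothetical stand-in for whatever detail your script actually needs:
$lineCount = 0;
$handle = fopen($target_File, "r") or die("Couldn't get handle");
// First pass: collect metadata only; nothing accumulates in memory.
while (($buffer = fgets($handle, 4096)) !== false) {
    $lineCount++;
}
// Second pass: rewind and process line by line, with the metadata known.
rewind($handle);
while (($buffer = fgets($handle, 4096)) !== false) {
    // Process $buffer here, using $lineCount as needed...
}
fclose($handle);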

Add this at the top of your file:
ini_set("memory_limit", -1);
Be careful: the page can then use all the RAM on the server.
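If you'd rather not lift the cap entirely, a bounded raise plus a peak-usage check is safer. A sketch; the 256M figure is an arbitrary assumption, not a recommendation:
ini_set("memory_limit", "256M"); // bounded raise instead of -1
// ... read and process the file here ...
echo memory_get_peak_usage(true) . " bytes used at peak";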

$content = "";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
while (($buffer = fgets($handle, 4096))!== false) {
// Process buffer here..
$content .= $buffer;
}
fclose($handle);
}
OR
$content = "";
$target_File ="/tmp/uploadfile.txt";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
while (!feof($handle)) {
$buffer = fgets($handle, 4096);
// Process buffer here..
//$content .= $buffer;
}
fclose($handle);
}
NOTE:
If the problem is caused by hitting the memory limit, you can try setting it to a higher value (this may or may not work, depending on PHP's configuration).
This sets the memory limit to 32 MB:
ini_set("memory_limit", "32M");


Can't fgets() a 1 gig file?

Code:
$filename = 'Master_List_DeDuped.csv';
$fp = fopen($filename, "r");
while (false !== ($line = fgets($fp))) {
    echo $line;
    die(); // For debugging only
}
fclose($fp);
The resulting error:
Warning: fgets(): 3 is not a valid stream resource in /home3/public_html/index.php on line 288
Line 288 is the while statement. The same commands work fine with a smaller file. My file is about 1.1 gigs. Is it just a file size limitation?
Edit: I've tried adding the length parameter to fgets, but the same error shows. http://us2.php.net/fgets
Changed the code a bit, based on the example at php.net (http://us2.php.net/fgets). The code that works is:
$filename = 'Master_List_DeDuped.csv';
$fp = @fopen($filename, "r");
if ($fp) {
    while (($line = fgets($fp, 4096)) !== false) {
        echo $line;
        die(); // For debugging only
    }
    fclose($fp);
}

How to read a large file line by line?

I want to read a file line by line, but without completely loading it in memory.
My file is too large to open in memory, and if I try to do so I always get out-of-memory errors.
The file size is 1 GB.
You can use the fgets() function to read the file line by line:
$handle = fopen("inputfile.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process the line read.
    }
    fclose($handle);
}
if ($file = fopen("file.txt", "r")) {
    while (!feof($file)) {
        $line = fgets($file);
        # do some stuff with the $line
    }
    fclose($file);
}
You can use an object-oriented interface for the file - SplFileObject (http://php.net/manual/en/splfileobject.fgets.php, PHP 5 >= 5.1.0):
<?php
$file = new SplFileObject("file.txt");
// Loop until we reach the end of the file.
while (!$file->eof()) {
    // Echo one line from the file.
    echo $file->fgets();
}
// Unset the file to call __destruct(), closing the file handle.
$file = null;
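SplFileObject can also be iterated directly with foreach, and its flags can strip trailing newlines and skip empty lines, for example:
$file = new SplFileObject("file.txt");
$file->setFlags(SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY | SplFileObject::DROP_NEW_LINE);
foreach ($file as $line) {
    // $line arrives without its trailing newline
}
$file = null; // close the handle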
If you want to use foreach instead of while when opening a big file, you probably want to encapsulate the while loop inside a Generator to avoid loading the whole file into memory:
/**
 * @return Generator
 */
$fileData = function () {
    $file = fopen(__DIR__ . '/file.txt', 'r');
    if (!$file) {
        return; // die() is a bad practice, better to use return
    }
    while (($line = fgets($file)) !== false) {
        yield $line;
    }
    fclose($file);
};
Use it like this:
foreach ($fileData() as $line) {
    // $line contains the current line
}
This way you can process individual file lines inside the foreach().
Note: Generators require PHP >= 5.5.
There is a file() function that returns an array of the lines contained in the file.
foreach (file('myfile.txt') as $line) {
    echo $line . "\n";
}
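Keep in mind that file() loads the whole file into memory at once, so it won't help with the 1 GB case; for files that do fit, its optional flags are handy:
foreach (file('myfile.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    echo $line . "\n"; // lines arrive without trailing newlines
}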
The obvious answer wasn't in all the responses: PHP has a neat streaming delimiter parser made for exactly this purpose.
$fp = fopen("/path/to/the/file", "r");
while (($line = stream_get_line($fp, 1024 * 1024, "\n")) !== false) {
    echo $line;
}
fclose($fp);
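Because the delimiter is an explicit argument, stream_get_line() also handles records separated by something other than \n. A small sketch, with the NUL-delimited file being a hypothetical example:
$fp = fopen("/path/to/records.dat", "r"); // hypothetical file with NUL-separated records
while (($record = stream_get_line($fp, 1024 * 1024, "\0")) !== false) {
    // process one $record at a time
}
fclose($fp);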
Use buffering techniques to read the file.
$filename = "test.txt";
$source_file = fopen($filename, "r") or die("Couldn't open $filename");
while (!feof($source_file)) {
    $buffer = fread($source_file, 4096); // use a buffer of 4 KB
    $buffer = str_replace($old, $new, $buffer); // $old and $new are assumed defined elsewhere
    // note: a match that straddles two 4 KB chunks will be missed
}
fclose($source_file);
foreach (new SplFileObject(__FILE__) as $line) {
    echo $line;
}
One of the popular solutions to this question will have issues with the newline character. It can be fixed pretty easily with a simple str_replace:
$handle = fopen("some_file.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        $line = str_replace("\n", "", $line);
    }
    fclose($handle);
}
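If the file may also contain Windows line endings, rtrim() is a more robust variation on the loop above:
while (($line = fgets($handle)) !== false) {
    $line = rtrim($line, "\r\n"); // handles \n, \r\n, and \r
}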
This is how I manage with very big files (tested with up to 100 GB), and it's faster than fgets():
$block = 1024 * 1024; // 1 MB, or could be anything higher than HDD block_size * 2
if ($fh = fopen("file.txt", "r")) {
    $left = '';
    while (!feof($fh)) { // read the file chunk by chunk
        $temp = fread($fh, $block);
        $lines = explode("\n", $temp);
        $lines[0] = $left . $lines[0]; // glue the leftover partial line from the previous chunk
        $left = '';
        if (!feof($fh)) {
            $left = array_pop($lines); // the last element may be an incomplete line; carry it over
        }
        foreach ($lines as $k => $line) {
            // do smth with $line
        }
    }
    fclose($fh);
}
Be careful with the 'while (!feof(...)) { fgets() }' pattern: fgets() can hit an error (returning false) and loop forever without reaching the end of the file. codaddict's answer was closest to being correct, but when your 'while fgets' loop ends, check feof(); if it is not true, you had an error.
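A minimal sketch of that post-loop check:
$handle = fopen("some_file.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process $line
    }
    if (!feof($handle)) {
        // fgets() returned false before the end of the file: a read error occurred
        trigger_error("Unexpected fgets() failure", E_USER_WARNING);
    }
    fclose($handle);
}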
SplFileObject is useful when it comes to dealing with large files.
function parse_file($filename)
{
    try {
        $file = new SplFileObject($filename);
    } catch (RuntimeException $exception) {
        // thrown when the file cannot be opened
        die('SplFileObject: ' . $exception->getMessage());
    }
    while ($file->valid()) {
        $line = $file->fgets();
        // do something with $line
    }
    // don't forget to free the file handle.
    $file = null;
}
<?php
echo '<meta charset="utf-8">';
$k = 1;
$f = 1;
$fp = fopen("texttranslate.txt", "r");
while (!feof($fp)) {
    $contents = '';
    for ($i = 1; $i <= 1500 && !feof($fp); $i++) {
        $line = fgets($fp); // read each line only once
        echo $k . ' -- ' . $line . '<br>';
        $k++;
        $contents .= $line;
    }
    echo '<hr>';
    file_put_contents('Split/new_file_' . $f . '.txt', $contents);
    $f++;
}
fclose($fp);
?>
A function that reads the file and returns its contents as an array of chunks:
function read_file($filename = '')
{
    $buffer = array();
    $source_file = fopen($filename, "r") or die("Couldn't open $filename");
    while (!feof($source_file)) {
        $buffer[] = fread($source_file, 4096); // read in 4 KB chunks
    }
    fclose($source_file);
    return $buffer;
}
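Note that the returned array still holds the whole file in memory, and each element is a 4 KB chunk rather than a line. Usage:
$chunks = read_file("test.txt");
foreach ($chunks as $chunk) {
    // each $chunk is up to 4 KB of raw data, not a single line
}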

How do I prepend file to beginning?

In PHP, if you write to a file, it will write at the end of that existing file.
How do we prepend a file - write at the beginning of that file?
I have tried the rewind($handle) function, but it seems to overwrite if the new content is larger than the existing content.
Any ideas?
$prepend = 'prepend me please';
$file = '/path/to/file';
$fileContents = file_get_contents($file);
file_put_contents($file, $prepend . $fileContents);
The file_get_contents solution is inefficient for large files. This solution may take longer, depending on the amount of data that needs to be prepended (more is actually better), but it won't eat up memory.
<?php
$cache_new = "Prepend this"; // this gets prepended
$file = "file.dat"; // the file to which $cache_new gets prepended
$handle = fopen($file, "r+");
$len = strlen($cache_new);
$final_len = filesize($file) + $len;
$cache_old = fread($handle, $len); // save the bytes about to be overwritten
rewind($handle);
$i = 1;
while (ftell($handle) < $final_len) {
    fwrite($handle, $cache_new); // overwrite the current block...
    $cache_new = $cache_old;     // ...keeping its old contents for the next write
    $cache_old = fread($handle, $len);
    fseek($handle, $i * $len);
    $i++;
}
fclose($handle);
?>
$filename = "log.txt";
$file_to_read = #fopen($filename, "r");
$old_text = #fread($file_to_read, 1024); // max 1024
#fclose(file_to_read);
$file_to_write = fopen($filename, "w");
fwrite($file_to_write, "new text".$old_text);
Another (rough) suggestion:
$tempFile = tempnam('/tmp/dir', 'prepend'); // tempnam() requires a directory and a prefix
$fhandle = fopen($tempFile, 'w');
fwrite($fhandle, 'string to prepend');
$oldFhandle = fopen('/path/to/file', 'r');
while (!feof($oldFhandle)) { // fread() returns '' (not false) at EOF, so test feof() instead
    $buffer = fread($oldFhandle, 10000);
    fwrite($fhandle, $buffer);
}
fclose($fhandle);
fclose($oldFhandle);
rename($tempFile, '/path/to/file');
This has the drawback of using a temporary file, but is otherwise pretty efficient.
When using fopen(), you can set the mode to place the file pointer at the beginning or the end of the file:
$afile = fopen("file.txt", "r+");
'r' - Open for reading only; place the file pointer at the beginning of the file.
'r+' - Open for reading and writing; place the file pointer at the beginning of the file.
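Note that writing at the start of a file opened with 'r+' overwrites bytes in place rather than inserting them, which is why the rewind() attempt in the question clobbered existing content. A quick demonstration with a hypothetical file:
file_put_contents("demo.txt", "ABCDEF");
$fh = fopen("demo.txt", "r+");
fwrite($fh, "xy"); // overwrites in place; nothing is inserted
fclose($fh);
echo file_get_contents("demo.txt"); // prints "xyCDEF", not "xyABCDEF"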
$file = fopen('filepath.txt', 'r+') or die('Error');
$txt = "\n" . $string;
fwrite($file, $txt);
fclose($file);
This will add a blank line at the start of the text file, so the next time you write to it you replace that blank line with a blank line and your string.
This is the only and best trick.

php: readfile_chunked not flushing last 2830 bytes of 29353KB file

I am using a variation of the familiar readfile_chunked in an attempt to download larger files:
function readfile_chunked($filename)
{
    $chunk_size = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunk_size);
        print $buffer;
        ob_flush();
        flush();
        sleep(1);
    }
    $status = fclose($handle);
    return $status;
}
Smaller files work fine, but this larger file is missing the last 2830 bytes.
I found out the issue: in php.ini, make sure implicit_flush is set to On. I still have the explicit flush code after each chunk output, however.
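If you can't edit php.ini, an equivalent runtime setup (a sketch, to be run before streaming starts) is:
ob_implicit_flush(true); // flush automatically after every output call
while (ob_get_level() > 0) {
    ob_end_flush(); // unwind output-buffering layers that would hold bytes back
}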
