Can't fgets() a 1 gig file? - php

Code:

$filename = 'Master_List_DeDuped.csv';
$fp = fopen($filename, "r");
while (false !== ($line = fgets($fp))) {
    echo $line;
    die(); // For debugging only
}
fclose($fp);
The resulting error:
Warning: fgets(): 3 is not a valid stream resource in /home3/public_html/index.php on line 288
Line 288 is the while statement. The same commands work fine with a smaller file. My file is about 1.1 gigs. Is it just a file size limitation?
Edit: I've tried adding the length parameter to fgets, but the same error shows. http://us2.php.net/fgets

Changed the code a bit based on the example at php.net (http://us2.php.net/fgets). The code that works is:

$filename = 'Master_List_DeDuped.csv';
$fp = @fopen($filename, "r"); // '@' (not '#') is PHP's error-suppression operator
if ($fp) {
    while (($line = fgets($fp, 4096)) !== false) {
        echo $line;
        die(); // For debugging only
    }
    fclose($fp); // close inside the if, so we never fclose() a failed handle
}
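Note that @fopen() hides the warning that would explain a failed open. A minimal debugging sketch (assuming PHP 5.2+ for error_get_last()) drops the @ and reports the last error instead:

$fp = fopen($filename, "r"); // no @, so the warning is visible while debugging
if ($fp === false) {
    $err = error_get_last();
    die('fopen failed: ' . ($err ? $err['message'] : 'unknown error'));
}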

Related

How can I create an array by parsing a large file? [duplicate]


PHP - Memory Limit. Trying to read large file (72mb)

I have a script processing large text files, but I'm limited by the size of the files. I've done some searching on this forum, and I've come to the conclusion that I must process the file line by line; however, this brings up quite a few issues for me, as I need some detailed info from the file before I can start processing it.
I've tried adding each line to a variable as below:
$content = "";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
while (!feof($handle)) {
$buffer = fgets($handle, 256);
// Process buffer here..
$content .= $buffer;
}
fclose($handle);
}
And just as expected, it did not work.
Could anyone help me out?
Add this at the top of your file:
ini_set("memory_limit", -1);
Be careful: this lets the script use all the RAM on the server.
$content = "";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
while (($buffer = fgets($handle, 4096))!== false) {
// Process buffer here..
$content .= $buffer;
}
fclose($handle);
}
OR
$content = "";
$target_File ="/tmp/uploadfile.txt";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
while (!feof($handle)) {
$buffer = fgets($handle, 4096);
// Process buffer here..
//$content .= $buffer;
}
fclose($handle);
}
NOTE:
If the problem is caused by hitting the memory limit, you can try setting it to a higher value (this may or may not work depending on PHP's configuration). This sets the memory limit to 32 MB:
ini_set("memory_limit", "32M");

PHP : failed to open stream error when reading file name from file

I have a file input.txt that contains file names that I need to open and read data from. I have written the following PHP code, and I get failed to open stream: No such file or directory when it tries to open with the variable $files, i.e., the second fopen is failing.
$handle = fopen("/home/user/input.txt", "r");
if($handle) {
while(($files = fgets($handle)) !== false) {
print $files;
$filename = fopen($files,"r");
print $filename;
}
}
input.txt content:
/home/user/file_1
/home/user/file_2
/home/user/file_3
/home/user/file_4
file_1, file_2, file_3, and file_4 are in /home/user/.
I am not sure what I am doing wrong.
My guess is that the lines contain trailing whitespace (fgets() keeps the newline, and there may be a \r as well), so "/home/user/file_1\n" is not a valid path; to remove it we'll use trim():
function open_files_from_file_list()
{
    $handle = fopen("/home/user/input.txt", "r");
    if (!$handle)
        return;
    while (($line = fgets($handle)) !== false)
    {
        $line = trim($line);
        print $line;
        if (!file_exists($line))
        {
            print ' does not exist';
            continue;
        }
        $filename = fopen($line, "r");
        print $filename;
    }
    fclose($handle);
}
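To actually read each listed file, a minimal follow-up sketch (assuming the same input.txt layout, and that each listed file fits in memory):

$handle = fopen("/home/user/input.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        $path = trim($line); // strip the newline that fgets() keeps
        if ($path === '' || !file_exists($path)) continue;
        $data = file_get_contents($path); // read the listed file
        // Process $data here.
    }
    fclose($handle);
}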

Improve performance for reading CSV file from ZIP?

Do you have any idea how to improve the performance of reading a CSV file from inside a ZIP file?
The code below first opens the ZIP, then copies the data into memory, and then reads it with fgetcsv:
$zip = new ZipArchive();
$contents = '';
if ($zip->open($fileName)) {
    $info = $zip->statIndex(0);
    $fp = $zip->getStream($info['name']);
    if (!$fp) exit("failed\n");
    while (!feof($fp)) {
        $contents .= fread($fp, 2); // only 2 bytes per read
    }
    fclose($fp);
    $zip->close();
}
$temp = fopen("php://memory", "rw");
fwrite($temp, $contents);
fseek($temp, 0);
while (($data = fgetcsv($temp, 0)) !== false) {
    // ....
}
A quick check of the PHP manual shows that the zip:// stream wrapper should work, letting fgetcsv() read straight from the archive:
<?php
$fp = fopen('zip://test.zip#test', 'r'); // "test" is the name of the file inside the archive
if (!$fp) {
    exit("cannot open\n");
}
while (($data = fgetcsv($fp, 0)) !== false) {
    // ...
}
fclose($fp);
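Alternatively, since fgetcsv() accepts any readable stream, the handle returned by ZipArchive::getStream() can be passed to it directly, skipping the php://memory copy entirely. A minimal sketch under that assumption:

$zip = new ZipArchive();
if ($zip->open($fileName) === true) { // open() returns true on success, an error code otherwise
    $info = $zip->statIndex(0);
    $fp = $zip->getStream($info['name']); // readable stream into the archive entry
    if (!$fp) exit("failed\n");
    while (($data = fgetcsv($fp, 0)) !== false) {
        // Process $data row by row, with no intermediate buffer.
    }
    fclose($fp);
    $zip->close();
}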

How to read a large file line by line?

I want to read a file line by line, but without completely loading it in memory.
My file is too large to open in memory, and if try to do so I always get out of memory errors.
The file size is 1 GB.
You can use the fgets() function to read the file line by line:
$handle = fopen("inputfile.txt", "r");
if ($handle) {
while (($line = fgets($handle)) !== false) {
// process the line read.
}
fclose($handle);
}
if ($file = fopen("file.txt", "r")) {
    while (!feof($file)) {
        $line = fgets($file);
        // do some stuff with $line
    }
    fclose($file);
}
You can use the object-oriented interface for a file, SplFileObject (http://php.net/manual/en/splfileobject.fgets.php, PHP 5 >= 5.1.0):
<?php
$file = new SplFileObject("file.txt");
// Loop until we reach the end of the file.
while (!$file->eof()) {
    // Echo one line from the file.
    echo $file->fgets();
}
// Unset the file to call __destruct(), closing the file handle.
$file = null;
If you want to use foreach instead of while when opening a big file, you probably want to encapsulate the while loop inside a Generator to avoid loading the whole file into memory:
/**
 * @return Generator
 */
$fileData = function() {
    $file = fopen(__DIR__ . '/file.txt', 'r');
    if (!$file) {
        return; // die() is a bad practice; better to use return
    }
    while (($line = fgets($file)) !== false) {
        yield $line;
    }
    fclose($file);
};
Use it like this:
foreach ($fileData() as $line) {
    // $line contains the current line
}
This way you can process individual file lines inside the foreach().
Note: Generators require PHP >= 5.5.
There is a file() function that returns an array of the lines contained in the file. Note that it reads the entire file into memory at once, so it is unsuitable for files as large as the one in question:
foreach (file('myfile.txt') as $line) {
    echo $line . "\n";
}
The obvious answer wasn't among the responses: PHP has a neat streaming delimiter parser made for exactly this purpose, stream_get_line(). Unlike fgets(), it does not include the trailing delimiter in the returned string:
$fp = fopen("/path/to/the/file", "r");
while (($line = stream_get_line($fp, 1024 * 1024, "\n")) !== false) {
    echo $line;
}
fclose($fp);
Use buffering techniques to read the file in fixed-size chunks:
$filename = "test.txt";
$old = "foo"; // example search value
$new = "bar"; // example replacement value
$source_file = fopen($filename, "r") or die("Couldn't open $filename");
while (!feof($source_file)) {
    $buffer = fread($source_file, 4096); // use a buffer of 4KB
    $buffer = str_replace($old, $new, $buffer); // note: matches spanning chunk boundaries are missed
    // ...
}
fclose($source_file);
SplFileObject is also iterable, so you can loop over a file directly with foreach:
foreach (new SplFileObject(__FILE__) as $line) {
    echo $line;
}
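If the trailing newline on each line is unwanted, SplFileObject's iteration flags can drop it (a small sketch; per the manual, SKIP_EMPTY needs READ_AHEAD to work as expected):

$file = new SplFileObject("file.txt");
$file->setFlags(
    SplFileObject::READ_AHEAD |    // read on rewind/next
    SplFileObject::DROP_NEW_LINE | // strip the trailing newline from each line
    SplFileObject::SKIP_EMPTY      // skip lines that are empty after stripping
);
foreach ($file as $line) {
    echo $line, PHP_EOL;
}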
One of the popular solutions to this question leaves the trailing newline character on each line. It can be fixed easily with a simple str_replace (or with rtrim($line, "\r\n"), which also handles CRLF endings):
$handle = fopen("some_file.txt", "r");
if ($handle) {
while (($line = fgets($handle)) !== false) {
$line = str_replace("\n", "", $line);
}
fclose($handle);
}
This is how I handle very big files (tested with up to 100 GB), and it's faster than fgets():
$block = 1024 * 1024; // 1 MB, or anything larger than twice the HDD block size
if ($fh = fopen("file.txt", "r")) {
    $left = '';
    while (!feof($fh)) { // read the file block by block
        $temp = fread($fh, $block);
        $lines = explode("\n", $temp);
        $lines[0] = $left . $lines[0]; // stitch the leftover partial line onto the first piece
        if (!feof($fh)) $left = array_pop($lines); // keep the trailing partial line for the next block
        foreach ($lines as $k => $line) {
            // do something with $line
        }
    }
    fclose($fh);
}
Be careful with the while (!feof(...)) ... fgets() pattern: fgets() can hit an error (returning false) and loop forever without reaching the end of the file. codaddict's answer was closest to correct, but when your while-fgets loop ends, check feof(); if it is not true, then you had an error.
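A minimal sketch of that check (the file name is a placeholder):

$handle = fopen("some_file.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // Process $line here.
    }
    if (!feof($handle)) {
        // fgets() returned false before the end of the file: a read error occurred.
        die("unexpected fgets() failure");
    }
    fclose($handle);
}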
SplFileObject is useful when it comes to dealing with large files.
function parse_file($filename)
{
    try {
        $file = new SplFileObject($filename);
    } catch (LogicException $exception) {
        die('SplFileObject : ' . $exception->getMessage());
    } catch (RuntimeException $exception) {
        // Thrown when the file cannot be opened.
        die('SplFileObject : ' . $exception->getMessage());
    }
    while ($file->valid()) {
        $line = $file->fgets();
        // do something with $line
    }
    // Don't forget to free the file handle.
    $file = null;
}
This reads the input and splits it into numbered files of 1500 lines each:
<?php
echo '<meta charset="utf-8">';
$k = 1;
$f = 1;
$fp = fopen("texttranslate.txt", "r");
while (!feof($fp)) {
    $contents = '';
    for ($i = 1; $i <= 1500; $i++) {
        $line = fgets($fp); // read each line once; calling fgets() twice per pass would skip lines
        if ($line === false) break;
        echo $k . ' -- ' . $line . '<br>';
        $k++;
        $contents .= $line;
    }
    echo '<hr>';
    file_put_contents('Split/new_file_' . $f . '.txt', $contents);
    $f++;
}
fclose($fp);
?>
Function to read a file and return its contents as an array of 4 KB chunks (note: this still holds the entire file in memory, and chunk boundaries can split lines):
function read_file($filename = '') {
    $buffer = array();
    $source_file = fopen($filename, "r") or die("Couldn't open $filename");
    while (!feof($source_file)) {
        $buffer[] = fread($source_file, 4096); // use a buffer of 4KB
    }
    fclose($source_file);
    return $buffer;
}
