How can I read a 30MB text file using a PHP script?

Hi, I have a text file up to 30MB in size, and I would like to read it with a PHP loop script:
$lines = file('data.txt');
// loop through each line
foreach ($lines as $line) { /* some function */ }
Is there any way to do this? I want to open it for reading, but PHP doesn't let me open a 30MB file.

You could read it line by line like this:
$file = fopen("data.txt", "r") or exit("Unable to open file!");
while(!feof($file)) {
// do what you need to do with it - just echoing it out for this example
echo fgets($file). "<br />";
}
fclose($file);

Or read it line by line using fgets() with a maximum line length:
$handle = fopen("data.txt", "r");
while (($buffer = fgets($handle, 4096)) !== false) {
    // run your function on the line in $buffer
}
fclose($handle);

If it is suitable for you to read the file piece by piece, you can try something like this:
$fd = fopen("fileName", "r");
while (!feof($fd)) {
    $buffer = fread($fd, 16384); // you can change the buffer size according to the memory you can use
    // process the buffer here, piece by piece
}
fclose($fd);

Related

How can I create an array by parsing a large file? [duplicate]

php://temp copy it to a file?

I have some code that creates a tmp file using php://temp ...
For debugging purposes, and to learn, I would like to save/copy its contents into a regular named file.
Is there an easy way to do that?
You can use stream_get_contents() or fwrite() when working with large files.
// Some code writing to the temp stream
$tmp = fopen('php://temp', 'r+');
fwrite($tmp, 'test');
rewind($tmp);
// Read and save to log.txt
file_put_contents("log.txt",stream_get_contents($tmp));
Large file implementation - stream the temp handle into the target file instead of building one big string:
set_time_limit(0);
$file = "log.txt";
$final = fopen($file, "w+");
rewind($tmp); // make sure we copy from the start of the temp stream
while (!feof($tmp)) {
    fwrite($final, fgets($tmp));
}
fclose($tmp);
fclose($final);
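If you only need a byte-for-byte copy, stream_copy_to_stream() can also do this without building the whole string in memory; a minimal sketch (the debug.log filename is just an example):
$tmp = fopen('php://temp', 'r+');
fwrite($tmp, 'test');
rewind($tmp);
// Copy the temp stream straight into a real file, chunk by chunk.
$out = fopen('debug.log', 'w');
stream_copy_to_stream($tmp, $out);
fclose($out);
fclose($tmp);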

How to read a large file line by line?

I want to read a file line by line, but without completely loading it in memory.
My file is too large to open in memory, and if I try to do so I always get out-of-memory errors.
The file size is 1 GB.
You can use the fgets() function to read the file line by line:
$handle = fopen("inputfile.txt", "r");
if ($handle) {
while (($line = fgets($handle)) !== false) {
// process the line read.
}
fclose($handle);
}
if ($file = fopen("file.txt", "r")) {
while(!feof($file)) {
$line = fgets($file);
# do some stuff with $line
}
fclose($file);
}
You can use the object-oriented interface for a file - SplFileObject: http://php.net/manual/en/splfileobject.fgets.php (PHP 5 >= 5.1.0)
<?php
$file = new SplFileObject("file.txt");
// Loop until we reach the end of the file.
while (!$file->eof()) {
// Echo one line from the file.
echo $file->fgets();
}
// Unset the file to call __destruct(), closing the file handle.
$file = null;
If you want to use foreach instead of while when opening a big file, you probably want to encapsulate the while loop inside a Generator to avoid loading the whole file into memory:
/**
 * @return Generator
*/
$fileData = function() {
$file = fopen(__DIR__ . '/file.txt', 'r');
if (!$file) {
return; // die() is a bad practice, better to use return
}
while (($line = fgets($file)) !== false) {
yield $line;
}
fclose($file);
};
Use it like this:
foreach ($fileData() as $line) {
// $line contains current line
}
This way you can process individual file lines inside the foreach().
Note: Generators require PHP >= 5.5.
There is a file() function that returns an array of the lines contained in the file (note that it loads the whole file into memory, so it won't help if the file is larger than your memory limit).
foreach(file('myfile.txt') as $line) {
echo $line. "\n";
}
The obvious answer isn't in any of the other responses.
PHP has a neat streaming delimiter parser available made for exactly that purpose.
$fp = fopen("/path/to/the/file", "r");
while (($line = stream_get_line($fp, 1024 * 1024, "\n")) !== false) {
echo $line;
}
fclose($fp);
Use buffering techniques to read the file.
$filename = "test.txt";
$source_file = fopen( $filename, "r" ) or die("Couldn't open $filename");
while (!feof($source_file)) {
$buffer = fread($source_file, 4096); // use a buffer of 4KB
$buffer = str_replace($old, $new, $buffer); // $old and $new are your search and replacement strings
// process $buffer here
}
foreach (new SplFileObject(__FILE__) as $line) {
echo $line;
}
One of the popular solutions to this question has issues with the newline character. It can be fixed pretty easily with a simple str_replace():
$handle = fopen("some_file.txt", "r");
if ($handle) {
while (($line = fgets($handle)) !== false) {
$line = str_replace("\n", "", $line);
}
fclose($handle);
}
This is how I handle very big files (tested up to 100GB), and it's faster than fgets():
$block = 1024 * 1024; // 1MB, or anything larger than HDD block_size * 2
if ($fh = fopen("file.txt", "r")) {
    $left = '';
    while (!feof($fh)) { // read the file block by block
        $temp = fread($fh, $block);
        $lines = explode("\n", $temp);
        $lines[0] = $left . $lines[0]; // glue the leftover partial line onto the first piece
        if (!feof($fh)) $left = array_pop($lines); // keep the trailing partial line for the next block
        foreach ($lines as $k => $line) {
            // do something with $line
        }
    }
    fclose($fh);
}
Be careful with the 'while(!feof ... fgets()' construct: fgets() can hit an error (returning false) and loop forever without reaching the end of file. codaddict was closest to being correct, but when your 'while fgets' loop ends, check feof(); if it isn't true, then you had an error.
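A sketch of the pattern that warning describes:
$handle = fopen("data.txt", "r");
while (($line = fgets($handle)) !== false) {
    // process $line
}
// fgets() returned false: either we reached end of file (fine) or a read error occurred.
if (!feof($handle)) {
    echo "Error: fgets() failed before reaching the end of the file\n";
}
fclose($handle);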
SplFileObject is useful when it comes to dealing with large files.
function parse_file($filename)
{
try {
$file = new SplFileObject($filename);
} catch (LogicException $exception) {
die('SplFileObject : '.$exception->getMessage());
}
while ($file->valid()) {
$line = $file->fgets();
//do something with $line
}
//don't forget to free the file handle.
$file = null;
}
<?php
// Echo the file line by line and also split it into chunks of 1500 lines.
echo '<meta charset="utf-8">';
$k = 1;
$f = 1;
$fp = fopen("texttranslate.txt", "r");
while (!feof($fp)) {
    $contents = '';
    for ($i = 1; $i <= 1500; $i++) {
        $line = fgets($fp); // read each line only once
        echo $k . ' -- ' . $line . '<br>';
        $k++;
        $contents .= $line;
    }
    echo '<hr>';
    file_put_contents('Split/new_file_' . $f . '.txt', $contents);
    $f++;
}
?>
Function to read the file and return it as an array of 4KB chunks:
function read_file($filename = ''){
$buffer = array();
$source_file = fopen( $filename, "r" ) or die("Couldn't open $filename");
while (!feof($source_file)) {
$buffer[] = fread($source_file, 4096); // use a buffer of 4KB
}
return $buffer;
}
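Usage might look like this (a sketch; note that the returned array still holds the entire file as 4KB chunks, so it does not actually save memory for very large files):
foreach (read_file("data.txt") as $chunk) {
    echo $chunk; // each $chunk is a raw 4KB block, not a line
}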

PHP: readfile() has been disabled for security reasons

I wrote a PHP script which outputs HTML files to the screen using readfile($htmlFile);
however, on the web hosting I purchased, readfile() has been disabled for security reasons.
Is there any substitute (other PHP functions) for readfile(), or do I have no choice but to ask the admin to enable it for me?
Thanks
You can check which functions are disabled by using:
var_dump(ini_get('disable_functions'));
You can try to use fopen() and fread() instead:
http://nl2.php.net/manual/en/function.fopen.php
http://nl2.php.net/manual/en/function.fread.php
$file = fopen($filename, 'rb');
if ( $file !== false ) {
while ( !feof($file) ) {
echo fread($file, 4096);
}
fclose($file);
}
Or fopen() with fpassthru()
$file = fopen($filename, 'rb');
if ( $file !== false ) {
fpassthru($file);
fclose($file);
}
Alternatively, you can read the file in chunks and write them to an output stream yourself with fwrite().
You can also try to use file_get_contents()
http://nl2.php.net/file_get_contents
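For example, assuming $htmlFile holds the path your script was passing to readfile():
// Reads the whole file into one string, so it is only suitable for reasonably small files.
echo file_get_contents($htmlFile);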
Or you can use file()
http://nl2.php.net/manual/en/function.file.php
I wouldn't recommend this method, but if nothing else works...
$data = file($filename);
if ( $data !== false ) {
echo implode('', $data);
}
If it's disabled, then you could do something like the following as an alternative:
$file = fopen($yourFileNameHere, 'rb');
if ( $file !== false ) {
while ( !feof($file) ) {
echo fread($file, 4096);
}
fclose($file);
}
//OR
$contents = file_get_contents($yourFileNameHere); // only suitable for smaller files
Hope it helps
You can try:
$path = '/some/path/to/file.html';
$file_string = '';
$file_content = file($path);
// here is the loop
foreach ($file_content as $row) {
$file_string .= $row;
}
// finally print it
echo $file_string;

How do I prepend to the beginning of a file?

In PHP, if you write to a file it writes at the end of that existing file.
How do we prepend to a file, i.e. write at the beginning of it?
I have tried the rewind($handle) function, but it seems to overwrite the existing content rather than insert before it.
Any ideas?
$prepend = 'prepend me please';
$file = '/path/to/file';
$fileContents = file_get_contents($file);
file_put_contents($file, $prepend . $fileContents);
The file_get_contents solution is inefficient for large files. This solution may take longer, depending on the amount of data that needs to be prepended (more is actually better), but it won't eat up memory.
<?php
$cache_new = "Prepend this"; // this gets prepended
$file = "file.dat";          // the file to which $cache_new gets prepended
$handle = fopen($file, "r+");
$len = strlen($cache_new);
$final_len = filesize($file) + $len;
// Save the first $len bytes before they get overwritten, then rewind.
$cache_old = fread($handle, $len);
rewind($handle);
// Rolling buffer: each pass writes the previously displaced chunk and
// reads ahead the chunk that is about to be overwritten.
$i = 1;
while (ftell($handle) < $final_len) {
    fwrite($handle, $cache_new);
    $cache_new = $cache_old;
    $cache_old = fread($handle, $len);
    fseek($handle, $i * $len);
    $i++;
}
?>
$filename = "log.txt";
$file_to_read = @fopen($filename, "r");
$old_text = @fread($file_to_read, 1024); // max 1024 bytes
@fclose($file_to_read);
$file_to_write = fopen($filename, "w");
fwrite($file_to_write, "new text" . $old_text);
Another (rough) suggestion:
$tempFile = tempnam('/tmp', 'prepend'); // tempnam() needs both a directory and a prefix
$fhandle = fopen($tempFile, 'w');
fwrite($fhandle, 'string to prepend');
$oldFhandle = fopen('/path/to/file', 'r');
while (!feof($oldFhandle)) {
    $buffer = fread($oldFhandle, 10000);
    fwrite($fhandle, $buffer);
}
fclose($fhandle);
fclose($oldFhandle);
rename($tempFile, '/path/to/file');
This has the drawback of using a temporary file, but is otherwise pretty efficient.
When using fopen() you can set the mode to control where the file pointer starts (i.e. at the beginning or the end):
$afile = fopen("file.txt", "r+");
'r'  - Open for reading only; place the file pointer at the beginning of the file.
'r+' - Open for reading and writing; place the file pointer at the beginning of the file.
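Note, though, that writing at the beginning of a file opened with 'r+' overwrites the existing bytes rather than inserting before them; a quick sketch (assuming file.txt already contains "ABCDEF"):
$afile = fopen("file.txt", "r+"); // pointer starts at the beginning
fwrite($afile, "123");            // file.txt now contains "123DEF", not "123ABCDEF"
fclose($afile);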
$file = fopen('filepath.txt', 'r+') or die('Error');
$txt = "\n" . $string; // $string is the text you want to prepend
fwrite($file, $txt);
fclose($file);
This will add a blank line at the start of the text file, so the next time you write to it you replace that blank line with a blank line and your string.
This is the only and best trick I know of.
