I found inspiration here for reading a specific line from a file.
But when I tested it to get a range of lines from a big file, I got two very different results.
Here are the benchmark results for reading 100 lines from a 10 MB file:
Function v1 via file(): 35 ms, with memory usage of 12.00 MB
Function v2 via SplFileObject: 956 ms, with memory usage of 2.00 MB
My question: is there another way to do this that is as fast as file() but with the low memory usage of SplFileObject?
My current functions:
function get_line_content_range_v1($line_number_start, $line_number_end) {
    $content = array();
    $data = file('10mb.txt');
    for ($i = $line_number_start; $i <= $line_number_end; $i++) {
        $content[] = $data[$i];
    }
    return $content;
}
function get_line_content_range_v2($line_number_start, $line_number_end) {
    $content = array();
    $file = new SplFileObject("10mb.txt", "r");
    for ($i = $line_number_start; $i <= $line_number_end; $i++) {
        $file->seek($i);
        $content[] = $file->current();
    }
    return $content;
}
Use a generator to save memory. There is no need to have all contents in RAM.
function get_line_content_range_v3($line_number_start, $line_number_end)
{
    $filehandle = fopen('10mb.txt', 'r');
    $line_number = 0;
    while (++$line_number <= $line_number_end) {
        $line = fgets($filehandle);
        if ($line_number < $line_number_start) {
            continue;
        }
        yield $line;
    }
    fclose($filehandle);
}
foreach (get_line_content_range_v3(12, 15) as $line) {
    echo $line;
}
I couldn't find a solution to this; I'm sorry if it's a silly question.
I have 4 log files and I need to remove everything except the last 10 lines of each.
I can do it for 1 file, but how do I apply it to all 4 files with one simple piece of PHP code?
My current code:
<?php
$lines_array = file("log.txt");
$lines = count($lines_array);
$new_output = "";
for ($i = $lines - 10; $i < $lines; $i++) {
    $new_output .= $lines_array[$i];
}
$filename = "log.txt";
file_put_contents($filename, $new_output);
What is the best way to achieve this?
Functional programming to the rescue:
function rotate(string $filename)
{
    $lines_array = file($filename);
    $lines = count($lines_array);
    $new_output = "";
    for ($i = $lines - 10; $i < $lines; $i++) {
        $new_output .= $lines_array[$i];
    }
    file_put_contents($filename, $new_output);
}

rotate('log1.txt');
rotate('someOtherLog.txt');
rotate('third/log/file.txt');
// etc.
// or,
$logs = [
    'log1.txt',
    'someOtherLog.txt',
    'third/log/file.txt'
];
foreach ($logs as $file) {
    rotate($file);
}
This lets you write the log-rotation code once, which keeps your code DRY (Don't Repeat Yourself).
List your logfiles in an array, and loop over it, rewriting the log files as you go:
$logs = [
    'log1.txt',
    'log2.txt',
    'log3.log'
];
foreach ($logs as $log) {
    // Only do this if we could read the file.
    if ($logData = file($log)) {
        // array_slice takes the last 10 elements of the array.
        file_put_contents($log, array_slice($logData, -10));
    }
}
Please try this code:
<?php
// Open for reading; "a+" would place the file pointer at the end, so fread would return nothing.
$fp = fopen("log.txt", "rb");
$data = fread($fp, filesize("log.txt"));
fclose($fp);
$data_array = explode("\n", $data);
$new_data = array();
// max() guards against files with fewer than 10 lines.
for ($i = count($data_array) - 1; $i >= max(0, count($data_array) - 10); $i--) {
    array_push($new_data, $data_array[$i]);
}
$new_data_array = array_reverse($new_data);
$data = implode("\n", $new_data_array);
$fp = fopen("log.txt", "w");
fwrite($fp, $data);
fclose($fp);
?>
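If the log may be too large to hold in memory at once, a lower-memory alternative makes two passes with fgets(): one to count the lines, one to collect only the tail. A sketch (tail_rotate is a hypothetical name, not a built-in):

```php
// Keep only the last $keep lines of $filename without loading the whole file.
function tail_rotate($filename, $keep = 10) {
    $in = fopen($filename, 'r');
    $total = 0;
    while (fgets($in) !== false) {
        $total++;                       // first pass: count lines
    }
    rewind($in);
    $skip = max(0, $total - $keep);
    $out = '';
    $line_no = 0;
    while (($line = fgets($in)) !== false) {
        if (++$line_no > $skip) {
            $out .= $line;              // second pass: collect only the tail
        }
    }
    fclose($in);
    file_put_contents($filename, $out);
}
```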
I would like to read from lines 51 to 100 in a text file using PHP.
The text file contains less than 500 lines at any given time.
I want to read from lines 51 to 100. No other lines.
I hope you understand the question. I've used SO many times and love it. Just cannot find an answer to this one anywhere.
Any help will be appreciated. Thanks in advance.
Just posting another example here, based on Guilherme's example:
After testing Guilherme's code on my box, I found it was producing errors related to the use of feof(). While his code should work (the logic is sound), it does break.
The following function, used the same way as his, should be a working drop-in replacement.
function retrieveText($file, $min, $max)
{
    $output = array();
    $line = -1;
    $handle = fopen($file, 'r');
    while (!feof($handle)) {
        $line++;
        if ($line >= $min && $line <= $max) {
            $output[] = fgets($handle);
        } elseif ($line > $max) {
            break;
        } else {
            fgets($handle);
        }
    }
    fclose($handle);
    return implode("\n", $output);
}
Please note, this joins the lines with a newline ("\n"); you can change it to "<br />" or whatever you like depending on where you will display the result.
Use while, fopen, and feof (good for reading big files), like this:
<?php
function retrieveText($file, $init, $end, $suffix = '')
{
    $i = 1;
    $output = '';
    $handle = fopen($file, 'r');
    while (false === feof($handle) && $i <= $end) {
        $data = fgets($handle);
        if ($i >= $init) {
            $output .= $data . $suffix;
        }
        $i++;
    }
    fclose($handle);
    return $output;
}
Example of how to use this function to get lines 51 to 100:
echo retrieveText('myfile.txt', 51, 100);
Add a line break:
echo retrieveText('myfile.txt', 51, 100, PHP_EOL);
Add a line break in HTML:
echo retrieveText('myfile.txt', 51, 100, '<br>');
So, after hitting the memory limits, I copied the following code. It worked great.
My only issue now, in the context of processing a CSV file, is that the chunking is based on chunk size, which means it could cut a row midway through.
What could I do to ensure that when a chunk is made, it ends at a "\n"?
I'm interested to know what others do.
I changed the limit to be based on a number of lines to read instead of a size limit: instead of fread, I used fgets to get whole lines.
Once again, this is all derived from the code linked in the question.
<?php
function file_get_contents_chunked($file, $chunk_size, $callback)
{
    try {
        $handle = fopen($file, "r");
        $i = 0;
        $x = 0;
        $chunk = array();
        while ($row = fgets($handle)) {
            // can parse $row further using str_getcsv
            $x++;
            $chunk[] = $row;
            if ($x == $chunk_size) {
                call_user_func_array($callback, array($chunk, &$handle, $i));
                $chunk = array();
                $x = 0;
                $i++;
            }
        }
        // flush any remaining lines that did not fill a whole chunk
        if ($chunk) {
            call_user_func_array($callback, array($chunk, &$handle, $i));
        }
        fclose($handle);
    } catch (Exception $e) {
        trigger_error("file_get_contents_chunked::" . $e->getMessage(), E_USER_NOTICE);
        return false;
    }
    return true;
}
?>
// Fixed for what I actually intended: limit by x amount of lines
You can first try raising the memory limit with:
ini_set('memory_limit', '32M');
Then, to read one line at a time:
$handle = fopen("inputfile.csv", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process the line read.
    }
    fclose($handle);
} else {
    // error opening the file.
}
Based on your code example, this would give :
function file_get_contents_chunked($file, $chunk_size, $callback)
{
    $handle = fopen($file, "r");
    $i = 0;
    if ($handle) {
        while (($line = fgets($handle)) !== false) {
            call_user_func_array($callback, array($line, &$handle, $i));
            $i++;
        }
        fclose($handle);
    } else {
        return false;
    }
    return true;
}
Note that the $chunk_size parameter is no longer needed, so you can remove it if you want.
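If you do want to keep fixed-size fread() chunks while never splitting a row, one hedged approach is to buffer everything after the last newline and prepend it to the next read. A sketch under that assumption (file_get_contents_chunked_safe is a hypothetical name):

```php
// Deliver the file in chunks of roughly $chunk_size bytes,
// always cutting at a "\n" so no CSV row is split mid-way.
function file_get_contents_chunked_safe($file, $chunk_size, $callback) {
    $handle = fopen($file, 'r');
    $carry = '';                                     // partial row held over from the previous read
    $i = 0;
    while (!feof($handle)) {
        $chunk = $carry . fread($handle, $chunk_size);
        $carry = '';
        if (!feof($handle)) {
            $cut = strrpos($chunk, "\n");            // last complete row boundary
            if ($cut === false) {
                $carry = $chunk;                     // no complete row yet: keep accumulating
                continue;
            }
            $carry = substr($chunk, $cut + 1);       // keep the partial row for the next round
            $chunk = substr($chunk, 0, $cut + 1);
        }
        if ($chunk !== '') {
            call_user_func($callback, $chunk, $i++);
        }
    }
    fclose($handle);
}
```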
You can also use this function to read a CSV file line by line:
fgetcsv(file, length, separator, enclosure);
Example 1
<?php
$file = fopen("contacts.csv","r");
print_r(fgetcsv($file));
fclose($file);
?>
The CSV file:
Kai Jim, Refsnes, Stavanger, Norway
Hege, Refsnes, Stavanger, Norway
The output of the code above will be:
Array
(
[0] => Kai Jim
[1] => Refsnes
[2] => Stavanger
[3] => Norway
)
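Since fgetcsv() advances the file pointer one row per call, looping until it returns false parses the whole file, not just the first line. A small sketch (read_csv_rows is an illustrative helper, not a built-in):

```php
// Parse an entire CSV file into an array of rows with fgetcsv.
function read_csv_rows($file) {
    $rows = array();
    if (($fp = fopen($file, "r")) !== false) {
        while (($row = fgetcsv($fp)) !== false) {  // one parsed row per call
            $rows[] = $row;
        }
        fclose($fp);
    }
    return $rows;
}
```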
How do I remove every line except the first 20 from a text file using PHP?
If loading the entire file in memory is feasible you can do:
// read the file into an array.
$file = file($filename);
// slice the first 20 elements.
$file = array_slice($file, 0, 20);
// join and write back to the file.
file_put_contents($filename, implode("", $file));
A better solution would be to use the function ftruncate which takes the file handle and the new size of the file in bytes as follows:
// open the file in read-write mode.
$handle = fopen($filename, 'r+');
if (!$handle) {
    // die here.
}
// new length of the file.
$length = 0;
// line count.
$count = 0;
// read line by line.
while (($buffer = fgets($handle)) !== false) {
    // increment line count.
    ++$count;
    // if count exceeds limit, break.
    if ($count > 20) {
        break;
    }
    // add the current line length to the final length.
    $length += strlen($buffer);
}
// truncate the file to the new length.
ftruncate($handle, $length);
// close the file.
fclose($handle);
For a memory-efficient solution you can use:
$file = new SplFileObject('/path/to/file.txt', 'a+');
$file->seek(19); // zero-based, hence 19 is line 20
$file->ftruncate($file->ftell());
Apologies, mis-read the question...
$filename = "blah.txt";
$lines = file($filename);
$data = "";
for ($i = 0; $i < 20; $i++) {
    $data .= $lines[$i]; // file() keeps each line's newline, so no need to append PHP_EOL
}
file_put_contents($filename, $data);
Something like:
$lines_array = file("yourFile.txt");
$new_output = "";
for ($i = 0; $i < 20; $i++) {
    $new_output .= $lines_array[$i];
}
file_put_contents("yourFile.txt", $new_output);
This should work as well, without huge memory usage:
$result = '';
$file = fopen('/path/to/file.txt', 'r');
for ($i = 0; $i < 20; $i++) {
    $result .= fgets($file);
}
fclose($file);
file_put_contents('/path/to/file.txt', $result);
I'm not sure if this is possible; I've been googling for a solution. Essentially, I have a very large file whose lines I want to store in an array. Thus I'm using file(), but is there a way to do that in batches, so that after every, say, 100 lines it creates, it "pauses"?
I think there's likely something I can do with a foreach loop, but I'm not sure I'm thinking about it the right way...
Like
$i = 0;
$j = 0;
$throttle = 100;
foreach ($files as $k => $v) {
    if ($i < $j + $throttle && $i > $j) {
        $lines[] = file($v);
        // Do some other stuff, like importing into a db
    }
    $i++;
    $j++;
}
But I think that won't really work, because $i and $j will always be equal... Anyway, I'm feeling muddled. Can someone help me think a little clearer?
Read the file line by line for however many lines you need, appending each line to an array. When the array reaches the desired length, process it and empty it. E.g.:
$handle = @fopen("/tmp/inputfile.txt", "r");
$throttle = 100;
$data = array();
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        if ($buffer === false) {
            break;
        }
        $data[] = $buffer;
        if (count($data) == $throttle) {
            doSomething($data);
            $data = array();
        }
    }
    // process any remaining lines in the final partial batch
    if ($data) {
        doSomething($data);
    }
    fclose($handle);
}
You never incremented $i or $j... What you can do is something like:
$data = array();
$chunk = 100;
$f = fopen($file, 'r');
while (!feof($f)) {
    for ($i = 0; $i < $chunk; $i++) {
        $tmp = fgets($f);
        if ($tmp !== false) {
            $data[] = $tmp;
        } else {
            // No more data, break out of the inner loop
            break;
        }
    }
    // Process your data here, then reset the batch
    $data = array();
}
fclose($f);
If by "pause" you mean you really want to pause execution of your script, use sleep() or one of its variants: http://php.net/manual/en/function.sleep.php
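Putting the batching and the pause together, a sketch along these lines reads $throttle lines at a time and optionally sleeps between batches (process_in_batches is a hypothetical name; the callback stands in for whatever you do per batch, e.g. a db import):

```php
// Process a file in batches of $throttle lines, with an optional pause between batches.
function process_in_batches($file, $throttle, $callback, $pause_seconds = 0) {
    $batch = array();
    $f = fopen($file, 'r');
    while (($line = fgets($f)) !== false) {
        $batch[] = $line;
        if (count($batch) === $throttle) {
            $callback($batch);          // e.g. import this batch into a db
            $batch = array();
            if ($pause_seconds > 0) {
                sleep($pause_seconds);  // the "pause" between batches
            }
        }
    }
    if ($batch) {
        $callback($batch);              // flush the final partial batch
    }
    fclose($f);
}
```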