How to move the file pointer to the previous line in PHP?

The text file in question is named fp.txt and contains the numbers 01 through 10, one per line:
01
02
...
10
Code:
<?php
//test file for testing fseek etc
$file = "fp.txt";
$fp = fopen($file, "r+") or die("Couldn't open ".$file);
$count = 0;
while (!feof($fp)) { // till the end of file
    $text = fgets($fp, 1024);
    $count++;
    $dice = rand(1, 2); // just to make/alter the if condition randomly
    echo "Dice=".$dice." Count=".$count." Text=".$text."<br />";
    if ($dice == 1) {
        fseek($fp, -1024, SEEK_CUR);
    }
}
fclose($fp);
?>
So, the fseek($fp, -1024, SEEK_CUR); call is not working properly. What I want is: if Dice == 1, set the file pointer to the previous line, i.e. one line up from the current one. But I think the negative offset is setting the file pointer to the end of the file, and thus ending the while loop before the actual end of the file.
Desired Output is:
Dice=2 Count=1 Text=01
Dice=2 Count=2 Text=02
Dice=2 Count=3 Text=03
Dice=1 Count=4 Text=03
Dice=2 Count=5 Text=04
Dice=2 Count=6 Text=05
Dice=2 Count=7 Text=06
Dice=1 Count=8 Text=06
Dice=1 Count=9 Text=06
Dice=2 Count=10 Text=07
.... //and so on until Text is 10 (Last Line)
Dice=2 Count=n Text=10
Note that whenever the dice is 1, the text is the same as the previous one. Right now the loop just stops at the first occurrence of Dice=1.
So basically my question is: how do I move/relocate the file pointer to the previous line?
Please note that $dice = rand(1,2) is just an example. In the actual code, $text is a string and the if condition is true when the string does not contain a particular text.
EDIT:
Solved, both samples (hakre's and mine) work as desired.

You read a line from the file, but only move forward to the next line when the dice is not 1.
Consider using SplFileObject for that; it offers an interface that is better suited to your scenario, I'd say:
$file = new SplFileObject("fp.txt");
$count = 0;
$file->rewind();
while ($file->valid()) {
    $count++;
    $text = $file->current();
    $dice = rand(1, 2); // just to alter the if condition randomly
    echo "Dice=".$dice." Count=".$count." Text=".$text."<br />";
    if ($dice != 1) {
        $file->next();
    }
}

<?php
$file = "fp.txt";
$fp = fopen($file, "r+") or die("Couldn't open ".$file);
$count = 0;
while (!feof($fp)) { // till the end of file
    $current = ftell($fp); // remember where this line starts
    $text = fgets($fp, 1024);
    $count++;
    $dice = rand(1, 2); // just to alter the if condition randomly
    if ($dice == 2) {
        fseek($fp, $current, SEEK_SET); // jump back to the start of the line just read
    }
    echo "Dice=".$dice." Count=".$count." Text=".$text."<br />";
}
fclose($fp);
?>
This sample also works as required.
The changes are:
* Addition of $current = ftell($fp); at the top of the while loop.
* Modification of the fseek line in the if condition.
* Checking for $dice == 2 instead of $dice == 1.
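For the actual use case mentioned above (re-reading a line when the string does not contain a particular text, rather than on a dice roll), the same ftell()/fseek() pattern can be applied. The following is only a minimal sketch of that idea; the $needle value is a placeholder, not something from the original code:
<?php
$fp = fopen("fp.txt", "r+") or die("Couldn't open fp.txt");
$needle = "some text"; // hypothetical marker the lines are expected to contain
while (!feof($fp)) {
    $current = ftell($fp);              // remember where this line starts
    $text = fgets($fp, 1024);
    if ($text !== false && strpos($text, $needle) === false) {
        fseek($fp, $current, SEEK_SET); // marker missing: re-read this line next iteration
    }
    // ... process $text here ...
}
fclose($fp);
?>
As with the dice example, this will keep re-reading the same line for as long as the condition stays true.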

Related

How can I read newly appended lines from a LARGE (4GB+) open file?

Using PHP 7.3, I'm trying to achieve "tail -f" functionality: open a file, wait for some other process to write to it, then read those new lines.
Unfortunately, it seems that fgets() caches the EOF condition. Even when there's new data available (filemtime changes), fgets() returns a blank line.
The important part: I cannot simply close, reopen, then seek, because the file size is tens of gigs in size, well above the 32 bit limit. The file must stay open in order to be able to read new data from the correct position.
I've attached some code to demonstrate the problem. If you append data to the input file, filemtime() detects the change, but fgets() reads nothing new.
fread() does seem to work, picking up the new data, but I'd rather not have to come up with a roll-your-own "read a line" solution.
Does anyone know how I might be able to poke fgets() into realising that it's not the EOF?
$fn = $argv[1];
$fp = fopen($fn, "r");
fseek($fp, -1000, SEEK_END);
$filemtime = 0;
while (1) {
    if (feof($fp)) {
        echo "got EOF\n";
        sleep(1);
        clearstatcache();
        $tmp = filemtime($fn);
        if ($tmp != $filemtime) {
            echo "time $filemtime -> $tmp\n";
            $filemtime = $tmp;
        }
    }
    $l = trim(fgets($fp, 8192));
    echo "l=$l\n";
}
Update: I tried excluding the call to feof() (thinking that may be where the state becomes cached), but the behaviour doesn't change; once fgets() reaches the original file pointer position, any further fgets() reads return false, even if more data is subsequently appended.
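One workaround that is often suggested for this situation (untested here, and assuming that fseek() resets the stream's internal EOF flag) is to perform a no-op seek before retrying fgets():
// Assumption: a no-op fseek() clears the sticky EOF state so that a later
// fgets() can see data appended after the previous read.
while (true) {
    $l = fgets($fp, 8192);
    if ($l === false) {
        sleep(1);
        fseek($fp, 0, SEEK_CUR); // no-op seek intended to clear EOF
        continue;
    }
    echo "l=" . trim($l) . "\n";
}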
Update 2: I ended up rolling my own function that will continue returning new data after the first EOF is reached (in fact, it has no concept of EOF, just data available / data not available). Code not heavily tested, so use at your own risk. Hope this helps someone else.
*** NOTE this code was updated 20th June 2021 to fix an off-by-one error. The comment "includes line separator" was incorrect up to this point.
define('FGETS_TAIL_CHUNK_SIZE', 4096);
define('FGETS_TAIL_SANITY', 65536);
define('FGETS_TAIL_LINE_SEPARATOR', 10);
function fgets_tail($fp) {
    // Get complete line from open file which may have additional data written to it.
    // Returns string (including line separator) or FALSE if there is no line available (buffer does not have complete line, or is empty because of EOF)
    global $fgets_tail_buf;
    if (!isset($fgets_tail_buf)) $fgets_tail_buf = "";
    if (strlen($fgets_tail_buf) < FGETS_TAIL_CHUNK_SIZE) { // buffer not full, attempt to append data to it
        $t = fread($fp, FGETS_TAIL_CHUNK_SIZE);
        if ($t != false) $fgets_tail_buf .= $t;
    }
    $ptr = strpos($fgets_tail_buf, chr(FGETS_TAIL_LINE_SEPARATOR));
    if ($ptr !== false) {
        $rv = substr($fgets_tail_buf, 0, $ptr + 1); // includes line separator
        $fgets_tail_buf = substr($fgets_tail_buf, $ptr + 1); // may reduce buffer to empty
        return $rv;
    } else {
        if (strlen($fgets_tail_buf) < FGETS_TAIL_SANITY) { // line separator not found, try to append some more data
            $t = fread($fp, FGETS_TAIL_CHUNK_SIZE);
            if ($t != false) $fgets_tail_buf .= $t;
        }
    }
    return false;
}
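For completeness, a minimal example of how this function might be used to follow a growing log file; the file path and polling interval are placeholders, not from the original post:
$fp = fopen("/var/log/some.log", "r"); // hypothetical log path
fseek($fp, 0, SEEK_END);               // start tailing from the current end of file
while (true) {
    $line = fgets_tail($fp);
    if ($line === false) {
        sleep(1);                      // no complete line available yet, wait for more data
        continue;
    }
    echo $line;
}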
The author found the solution himself: how to create a PHP tail viewer for giant log files (4+ GB in size).
To mark this question as answered, here is a summary of the solution: the fgets_tail() function shown above.

add string to file after a specific line

I would like to know if there is a way to add a string to a file after a specific line in PHP.
I have tried file_put_contents, but it puts the string at the end of the file.
Thanks for the help.
It has been a long time, but this will be useful to anyone who comes across this in future...
$f = fopen("path/to/file", "r+");
$oldstr = file_get_contents("path/to/file");
$str_to_insert = "Write the string to insert here";
$specificLine = "Specify the line here";
// read lines with fgets() until you have reached the right one
//insert the line and than write in the file.
while (($buffer = fgets($f)) !== false) {
if (strpos($buffer, $specificLine) !== false) {
$pos = ftell($f);
$newstr = substr_replace($oldstr, $str_to_insert, $pos, 0);
file_put_contents("path/to/file", $newstr);
break;
}
}
fclose($f);
This is one approach, kinda verbose, but makes all modifications inline:
$f = fopen("test.txt", "tr+");
// read lines with fgets() until you have reached the right one
$pos = ftell($f); // save current position
$trailer = stream_get_contents($f); // read trailing data
fseek($f, $pos); // go back
ftruncate($f, $pos); // truncate the file at current position
fputs($f, "my strings\n"); // add line
fwrite($f, $trailer); // restore trailing data
If the file is particularly big, you would need an intermediate file.
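A minimal sketch of that intermediate-file idea, for the case where the file is too big to hold in memory; the file names and the marker line are placeholders, not from the original answer:
$src    = fopen("test.txt", "r");
$tmp    = fopen("test.txt.tmp", "w");   // hypothetical temporary file
$marker = "the line to insert after";   // hypothetical marker line
while (($line = fgets($src)) !== false) {
    fwrite($tmp, $line);                // copy the original line
    if (strpos($line, $marker) !== false) {
        fwrite($tmp, "my strings\n");   // insert right after the marker
    }
}
fclose($src);
fclose($tmp);
rename("test.txt.tmp", "test.txt");     // replace the original with the new version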
Another way is to use the file() function. It returns the contents of the file as an array, one element per line. From there, you can manipulate the array and insert a value at a specific line. Consider this example:
// Sample file content (original)
// line 1
// line 2
// line 3
// line 4
// line 5
// line 6
$replacement = "Hello World";
$specific_line = 3; // sample value squeeze it on this line
$contents = file('file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($specific_line > sizeof($contents)) {
    $specific_line = sizeof($contents) + 1;
}
array_splice($contents, $specific_line - 1, 0, array($replacement)); // arrays start at zero index
$contents = implode("\n", $contents);
file_put_contents('file.txt', $contents);
// Sample output
// line 1
// line 2
// Hello World
// line 3
// line 4
// line 5
// line 6
The following is my code
function doit($search, $file, $insert)
{
    $array = explode("\n", file_get_contents($file));
    $max = count($array);
    for ($a = 0; $a < $max; $a++) {
        if ($array[$a] == $search) {
            // insert $insert right after the matching line
            $array = array_merge(
                array_slice($array, 0, $a + 1),
                array($insert),
                array_slice($array, $a + 1)
            );
            break;
        }
    }
    $myfile = fopen($file, "w");
    fwrite($myfile, implode("\n", $array)); // write the modified lines back
    fclose($myfile);
}
You have to give the file path ($file), the new line text ($insert), and the text of the line ($search) after which the new line is to be inserted.
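A hypothetical call, assuming a file data.txt that contains a line reading exactly "line 2":
// Insert "Hello World" right after the line that reads "line 2".
doit("line 2", "data.txt", "Hello World");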

Tailing Log File and Write results to new file

I'm not sure how to word this so I'll type it out and then edit and answer any questions that come up..
Currently on my local network device (PHP4 based) I'm using this to tail a live system log file: http://commavee.com/2007/04/13/ajax-logfile-tailer-viewer/
This works well; every 1 second it loads an external page (logfile.php) that does a tail -n 100 logfile.log. The script doesn't do any buffering, so the results it displays on screen are the last 100 lines from the log file.
The logfile.php contains :
<?php
// logtail.php
$cmd = "tail -10 /path/to/your/logs/some.log";
exec("$cmd 2>&1", $output);
foreach ($output as $outputline) {
    echo ("$outputline\n");
}
?>
This part is working well.
I have adapted the logfile.php page to write the $outputline to a new text file, simply using fwrite($fp,$outputline."\n");
Whilst this works I am having issues with duplication in the new file that is created.
Obviously, each time tail -n 100 runs it can produce some of the same lines as the previous run; as this repeats, I can end up with multiple duplicated lines in the new text file.
I can't directly compare the line I'm about to write to previous lines, as there could be identical matches.
Is there any way I can compare the current block of 100 lines with the previous block and then only write the lines that do not match? Again, there is the possible issue that blocks A and B will contain identical lines that are both needed...
Is it possible to update logfile.php to note the position it last looked at in my logfile and then only read the next 100 lines from there and write those to the new file?
The log file could be upto 500MB so I don't want to read it all in each time..
Any advice or suggestions welcome..
Thanks
UPDATE # 16:30
I've sort of got this working using:
$file = "/logs/syst.log";
$handle = fopen($file, "r");
if (isset($_SESSION['ftell'])) {
    clearstatcache();
    fseek($handle, $_SESSION['ftell']);
    while ($buffer = fgets($handle)) {
        echo $buffer."<br/>";
        #ob_flush(); #flush();
    }
    $_SESSION['ftell'] = ftell($handle); // remember where we got to, before closing
    fclose($handle);
} else {
    fseek($handle, -1024, SEEK_END);
    $_SESSION['ftell'] = ftell($handle); // remember the starting position, before closing
    fclose($handle);
}
This seems to work, but it loads the entire file first and then just the updates.
How would I get it to start with the last 50 lines and then just the updates?
Thanks :)
UPDATE 04/06/2013
Whilst this works it's very slow with large files.
I've tried this code and it seems faster, but it doesn't just read from where it left off.
function last_lines($path, $line_count, $block_size = 512){
    $lines = array();
    // we will always have a fragment of a non-complete line
    // keep this in here till we have our next entire line.
    $leftover = "";
    $fh = fopen($path, 'r');
    // go to the end of the file
    fseek($fh, 0, SEEK_END);
    do {
        // need to know whether we can actually go back
        // $block_size bytes
        $can_read = $block_size;
        if (ftell($fh) < $block_size) {
            $can_read = ftell($fh);
        }
        // go back as many bytes as we can
        // read them to $data and then move the file pointer
        // back to where we were.
        fseek($fh, -$can_read, SEEK_CUR);
        $data = fread($fh, $can_read);
        $data .= $leftover;
        fseek($fh, -$can_read, SEEK_CUR);
        // split lines by \n. Then reverse them,
        // now the last line is most likely not a complete
        // line which is why we do not directly add it, but
        // append it to the data read the next time.
        $split_data = array_reverse(explode("\n", $data));
        $new_lines = array_slice($split_data, 0, -1);
        $lines = array_merge($lines, $new_lines);
        $leftover = $split_data[count($split_data) - 1];
    } while (count($lines) < $line_count && ftell($fh) != 0);
    if (ftell($fh) == 0) {
        $lines[] = $leftover;
    }
    fclose($fh);
    // Usually, we will read too many lines, correct that here.
    return array_slice($lines, 0, $line_count);
}
Is there any way this can be amended so it will read from the last known position?
Thanks
Introduction
You can tail a file by tracking the last read position.
Example
$file = __DIR__ . "/a.log";
$tail = new TailLog($file);
$data = $tail->tail(100) ;
// Save $data to new file
TailLog is a simple class I wrote for this task. Here is a simple example to show it's actually tailing the file.
Simple Test
$file = __DIR__ . "/a.log";
$tail = new TailLog($file);
// Some Random Data
$data = array_chunk(range("a", "z"), 3);
// Write Log
file_put_contents($file, implode("\n", array_shift($data)));
// First Tail (2) Run
print_r($tail->tail(2));
// Run Tail (2) Again
print_r($tail->tail(2));
// Write Another data to Log
file_put_contents($file, "\n" . implode("\n", array_shift($data)), FILE_APPEND);
// Call Tail Again after writing Data
print_r($tail->tail(2));
// See the full content
print_r(file_get_contents($file));
Output
// First Tail (2) Run
Array
(
[0] => c
[1] => b
)
// Run Tail (2) Again
Array
(
)
// Call Tail Again after writing Data
Array
(
[0] => f
[1] => e
)
// See the full content
a
b
c
d
e
f
Real Time Tailing
while(true) {
$data = $tail->tail(100);
// write data to another file
sleep(5);
}
Note: tailing 100 lines does not mean it will always return 100 lines. It returns the new lines that were added; 100 is just the maximum number of lines to return. This might not be efficient where you have heavy logging of more than 100 lines per second, if there is any.
Tail Class
class TailLog {
    private $file;
    private $data;
    private $timeout = 5;
    private $lock;

    function __construct($file) {
        $this->file = $file;
        $this->lock = new TailLock($file);
    }

    public function tail($lines) {
        $pos = -2;
        $t = $lines;
        $fp = fopen($this->file, "r");
        $break = false;
        $line = "";
        $text = array();
        while ($t > 0) {
            $c = "";
            // Search for end of line
            while ($c != "\n" && $c != PHP_EOL) {
                if (fseek($fp, $pos, SEEK_END) == -1) {
                    $break = true;
                    break;
                }
                if (ftell($fp) < $this->lock->getPosition()) {
                    break;
                }
                $c = fgetc($fp);
                $pos--;
            }
            if (ftell($fp) < $this->lock->getPosition()) {
                break;
            }
            $t--;
            $break && rewind($fp);
            $text[$lines - $t - 1] = fgets($fp);
            if ($break) {
                break;
            }
        }
        // Move to end
        fseek($fp, 0, SEEK_END);
        // Save Position
        $this->lock->save(ftell($fp));
        // Close File
        fclose($fp);
        return array_map("trim", $text);
    }
}
Tail Lock
class TailLock {
    private $file;
    private $lock;
    private $data;

    function __construct($file) {
        $this->file = $file;
        $this->lock = $file . ".tail";
        touch($this->lock);
        if (!is_file($this->lock))
            throw new Exception("can't create lock file");
        $this->data = json_decode(file_get_contents($this->lock));
        // Check that the lock file contains valid JSON and that data in the
        // original file has not been deleted; you expect it to grow, not shrink.
        if (!$this->data || $this->data->size > filesize($this->file)) {
            $this->reset($file);
        }
    }

    function getPosition() {
        return $this->data->position;
    }

    function reset() {
        $this->data = new stdClass();
        $this->data->size = filesize($this->file);
        $this->data->modification = filemtime($this->file);
        $this->data->position = 0;
        $this->update();
    }

    function save($pos) {
        $this->data = new stdClass();
        $this->data->size = filesize($this->file);
        $this->data->modification = filemtime($this->file);
        $this->data->position = $pos;
        $this->update();
    }

    function update() {
        return file_put_contents($this->lock, json_encode($this->data, 128));
    }
}
I'm not really clear on how you want to use the output, but would something like this work?
$dat = (int) file_get_contents("tracker.dat");
$fp = fopen("/logs/syst.log", "r");
fseek($fp, $dat, SEEK_SET);
ob_start();
// alternatively you can do a while fgets if you want to interpret the file or do something
fpassthru($fp);
$pos = ftell($fp); // remember the new read position before closing
fclose($fp);
echo nl2br(ob_get_clean());
file_put_contents("tracker.dat", $pos);
tracker.dat is just a text file that contains the read position from the previous run. I'm just seeking to that position and piping the rest to the output buffer.
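Note that on the very first run tracker.dat will not exist yet; a small guard, assuming you want to start from the beginning of the log in that case, could look like this:
// Hypothetical first-run guard: seed the tracker with offset 0.
if (!is_file("tracker.dat")) {
    file_put_contents("tracker.dat", "0");
}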
Use tail -c <number of bytes> instead of a number of lines, and then check the file size. The rough idea is:
$old_file_size = 0;
$max_bytes = 512;

function last_lines($path) {
    global $old_file_size, $max_bytes;
    $new_file_size = filesize($path);
    $pending_bytes = $new_file_size - $old_file_size;
    if ($pending_bytes > $max_bytes) $pending_bytes = $max_bytes;
    exec("tail -c " . $pending_bytes . " " . $path, $output);
    $old_file_size = $new_file_size;
    return $output;
}
The advantage is that you can do away with all the special processing stuff and get good performance. The disadvantage is that you have to manually split the output into lines, and you could end up with unfinished lines. But this isn't a big deal; you can easily work around it by omitting the last line from the output (and appropriately subtracting that line's byte count from $old_file_size).

Reading large files from end

Can I read a file in PHP from the end, for example if I only want to read the last 10-20 lines?
Also, when I read it, if the size of the file is more than 10 MB I start getting errors.
How can I prevent this error?
For reading a normal file, we use the code :
if ($handle) {
    while (($buffer = fgets($handle, 4096)) !== false) {
        $i1++;
        $content[$i1] = $buffer;
    }
    if (!feof($handle)) {
        echo "Error: unexpected fgets() fail\n";
    }
    fclose($handle);
}
My file might go over 10 MB, but I just need to read the last few lines. How do I do it?
Thanks
You can use fopen and fseek to navigate a file backwards from the end. For example:
$fp = @fopen($file, "r");
$pos = -2;
while (fgetc($fp) != "\n") {
    fseek($fp, $pos, SEEK_END);
    $pos = $pos - 1;
}
$lastline = fgets($fp);
It's not pure PHP, but the common solution is to use the tac command, which is the reverse of cat and loads the file in reverse. Use exec() or passthru() to run it on the server and then read the results. Example usage:
<?php
$myfile = 'myfile.txt';
$command = "tac $myfile > /tmp/myfilereversed.txt";
exec($command);
$currentRow = 0;
$numRows = 20; // stops after this number of rows
$handle = fopen("/tmp/myfilereversed.txt", "r");
while (!feof($handle) && $currentRow <= $numRows) {
    $currentRow++;
    $buffer = fgets($handle, 4096);
    echo $buffer."<br>";
}
fclose($handle);
?>
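If the reversed copy is not needed afterwards, it can be removed once reading is done (a small addition, not part of the original snippet):
unlink("/tmp/myfilereversed.txt"); // clean up the temporary reversed copy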
It depends how you interpret "can".
If you are wondering whether you can do this directly (with a PHP function) without reading all the preceding lines, then the answer is: no, you cannot.
A line ending is an interpretation of the data, and you can only know where the line endings are if you actually read the data.
If it is a really big file, I would not do that though.
It would be better to scan the file starting from the end, gradually reading blocks from the end towards the beginning of the file.
Update
Here's a PHP-only way to read the last n lines of a file without reading through all of it:
function last_lines($path, $line_count, $block_size = 512){
    $lines = array();
    // we will always have a fragment of a non-complete line
    // keep this in here till we have our next entire line.
    $leftover = "";
    $fh = fopen($path, 'r');
    // go to the end of the file
    fseek($fh, 0, SEEK_END);
    do {
        // need to know whether we can actually go back
        // $block_size bytes
        $can_read = $block_size;
        if (ftell($fh) < $block_size) {
            $can_read = ftell($fh);
        }
        // go back as many bytes as we can
        // read them to $data and then move the file pointer
        // back to where we were.
        fseek($fh, -$can_read, SEEK_CUR);
        $data = fread($fh, $can_read);
        $data .= $leftover;
        fseek($fh, -$can_read, SEEK_CUR);
        // split lines by \n. Then reverse them,
        // now the last line is most likely not a complete
        // line which is why we do not directly add it, but
        // append it to the data read the next time.
        $split_data = array_reverse(explode("\n", $data));
        $new_lines = array_slice($split_data, 0, -1);
        $lines = array_merge($lines, $new_lines);
        $leftover = $split_data[count($split_data) - 1];
    } while (count($lines) < $line_count && ftell($fh) != 0);
    if (ftell($fh) == 0) {
        $lines[] = $leftover;
    }
    fclose($fh);
    // Usually, we will read too many lines, correct that here.
    return array_slice($lines, 0, $line_count);
}
Following snippet worked for me.
$file = popen("tac $filename", 'r');
while ($line = fgets($file)) {
    echo $line;
}
pclose($file);
Reference: http://laughingmeme.org/2008/02/28/reading-a-file-backwards-in-php/
If your code is not working and reporting an error, you should include the error in your posts!
The reason you are getting an error is that you are trying to store the entire contents of the file in PHP's memory space.
The most efficient way to solve the problem is, as Greenisha suggests, to seek to the end of the file and then go back a bit. But Greenisha's mechanism for going back a bit is not very efficient.
Consider instead the method for getting the last few lines from a stream (i.e. where you can't seek):
while (($buffer = fgets($handle, 4096)) !== false) {
    $i1++;
    $content[$i1] = $buffer;
    unset($content[$i1 - $lines_to_keep]);
}
So if you know that your max line length is 4096, then you would:
if (4096 * $lines_to_keep < filesize($input_file)) {
    fseek($fp, -4096 * $lines_to_keep, SEEK_END);
}
Then apply the loop I described previously.
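Putting the two pieces together, a minimal combined sketch (the concrete values and file name are assumed, not from the original answer) might look like this:
$lines_to_keep = 10;                        // assumed number of lines wanted
$input_file = "some.log";                   // hypothetical file name
$fp = fopen($input_file, "r");
if (4096 * $lines_to_keep < filesize($input_file)) {
    fseek($fp, -4096 * $lines_to_keep, SEEK_END);
}
$i1 = 0;
$content = array();
while (($buffer = fgets($fp, 4096)) !== false) {
    $i1++;
    $content[$i1] = $buffer;
    unset($content[$i1 - $lines_to_keep]);  // keep only the last N lines read so far
}
fclose($fp);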
Since C has some more efficient methods for dealing with byte streams, the fastest solution (on a POSIX/Unix/Linux/BSD system) would be simply:
$last_lines = system("tail -" . $lines_to_keep . " filename");
For Linux you can do
$linesToRead = 10;
exec("tail -n{$linesToRead} {$myFileName}" , $content);
You will get an array of lines in the $content variable.
Pure PHP solution
$f = fopen($myFileName, 'r');
$maxLineLength = 1000; // Real maximum length of your records
$linesToRead = 10;
fseek($f, -$maxLineLength*$linesToRead, SEEK_END); // Moves cursor back from the end of file
$res = array();
while (($buffer = fgets($f, $maxLineLength)) !== false) {
    $res[] = $buffer;
}
$content = array_slice($res, -$linesToRead);
If you know about how long the lines are, you can avoid a lot of the black magic and just grab a chunk of the end of the file.
I needed the last 15 lines from a very large log file, and altogether they were about 3000 characters. So I just grab the last 8000 bytes to be safe, then read the file as normal and take what I need from the end.
$fh = fopen($file, "r");
fseek($fh, -8192, SEEK_END);
$lines = array();
while($lines[] = fgets($fh)) {}
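To actually take the 15 lines needed from the end, as described above:
array_pop($lines);                  // drop the trailing false from the final fgets()
$last15 = array_slice($lines, -15); // keep only the last 15 lines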
This is possibly even more efficient than the highest rated answer, which reads the file character by character, compares each character, and splits based on newline characters.
Here is another solution. It doesn't have line length control in fgets(), but you can add it.
/* Read file from end line by line */
$fp = fopen(dirname(__FILE__) . '\\some_file.txt', 'r');
$lines_read = 0;
$lines_to_read = 1000;
fseek($fp, 0, SEEK_END); //goto EOF
$eol_size = 2;           // for windows is 2, rest is 1
$eol_char = "\r\n";      // mac=\r, unix=\n
while ($lines_read < $lines_to_read) {
    if (ftell($fp) == 0) break; //break on BOF (beginning...)
    do {
        fseek($fp, -1, SEEK_CUR);         //seek 1 by 1 char from EOF
        $eol = fgetc($fp) . fgetc($fp);   //search for EOL (remove 1 fgetc if needed)
        fseek($fp, -$eol_size, SEEK_CUR); //go back for EOL
    } while ($eol != $eol_char && ftell($fp) > 0); //check EOL and BOF
    $position = ftell($fp);                              //save current position
    if ($position != 0) fseek($fp, $eol_size, SEEK_CUR); //move for EOL
    echo fgets($fp);                                     //read LINE or do whatever is needed
    fseek($fp, $position, SEEK_SET);                     //set current position
    $lines_read++;
}
fclose($fp);
While searching for the same thing, I came across the following and thought it might be useful to others as well, so I'm sharing it here:
/* Read file from end line by line */
function tail_custom($filepath, $lines = 1, $adaptive = true) {
    // Open file
    $f = @fopen($filepath, "rb");
    if ($f === false) return false;
    // Sets buffer size, according to the number of lines to retrieve.
    // This gives a performance boost when reading a few lines from the file.
    if (!$adaptive) $buffer = 4096;
    else $buffer = ($lines < 2 ? 64 : ($lines < 10 ? 512 : 4096));
    // Jump to last character
    fseek($f, -1, SEEK_END);
    // Read it and adjust line number if necessary
    // (Otherwise the result would be wrong if file doesn't end with a blank line)
    if (fread($f, 1) != "\n") $lines -= 1;
    // Start reading
    $output = '';
    $chunk = '';
    // While we would like more
    while (ftell($f) > 0 && $lines >= 0) {
        // Figure out how far back we should jump
        $seek = min(ftell($f), $buffer);
        // Do the jump (backwards, relative to where we are)
        fseek($f, -$seek, SEEK_CUR);
        // Read a chunk and prepend it to our output
        $output = ($chunk = fread($f, $seek)) . $output;
        // Jump back to where we started reading
        fseek($f, -mb_strlen($chunk, '8bit'), SEEK_CUR);
        // Decrease our line counter
        $lines -= substr_count($chunk, "\n");
    }
    // While we have too many lines
    // (Because of buffer size we might have read too many)
    while ($lines++ < 0) {
        // Find first newline and remove all text before that
        $output = substr($output, strpos($output, "\n") + 1);
    }
    // Close file and return
    fclose($f);
    return trim($output);
}
As Einstein said, everything should be made as simple as possible, but no simpler. At this point you are in need of a data structure: a LIFO data structure, or simply put, a stack.
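The stack suggestion is left abstract; one way to read it (my interpretation, not code from the answer) is to keep only the last N lines in a bounded buffer while reading the file forward:
$n = 10;                          // assumed number of lines wanted
$buffer = array();
$fh = fopen("some.log", "r");     // hypothetical file name
while (($line = fgets($fh)) !== false) {
    $buffer[] = $line;
    if (count($buffer) > $n) {
        array_shift($buffer);     // discard the oldest line once the buffer is full
    }
}
fclose($fh);
// $buffer now holds (up to) the last $n lines of the file.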
A more complete example of the "tail" suggestion above is provided here. This seems to be a simple and efficient method -- thank-you. Very large files should not be an issue and a temporary file is not required.
$out = array();
$ret = null;
// capture the last 30 lines of the log file into a buffer
exec('tail -30 ' . $weatherLog, $buf, $ret);
if ($ret == 0) {
    // process the captured lines one at a time
    foreach ($buf as $line) {
        $n = sscanf($line, "%s temperature %f", $dt, $t);
        if ($n > 0) $temperature = $t;
        $n = sscanf($line, "%s humidity %f", $dt, $h);
        if ($n > 0) $humidity = $h;
    }
    printf("<tr><th>Temperature</th><td>%0.1f</td></tr>\n", $temperature);
    printf("<tr><th>Humidity</th><td>%0.1f</td></tr>\n", $humidity);
} else {
    # something bad happened
}
In the above example, the code reads 30 lines of text output and displays the last temperature and humidity readings in the file (that's why the printf calls are outside of the loop, in case you were wondering). The file is filled by an ESP32 which adds to it every few minutes, even when the sensor reports only nan, so thirty lines gives plenty of readings and it should never fail. Each reading includes the date and time, so in the final version the output will include the time the reading was taken.

How to reset a persistent counter at a particular value?

I asked a question earlier (How to keep this counter from resetting at 100,000?) and now have a follow-up question.
I have another version of the counter in question that can be told to reset at a certain number, and I would like to make sure that this second version does not have the same problem as the first.
What I have coded now is:
$reset = '10';
$filename4 = "$some_variable/$filename3.txt";
// Open our file in append-or-create mode.
$fh = fopen($filename4, "a+");
if (!$fh)
    die("unable to create file");
if ($reset == 'default') {
    // Before doing anything else, get an exclusive lock on the file.
    // This will prevent anybody else from reading or writing to it.
    flock($fh, LOCK_EX);
    // Place the pointer at the start of the file.
    fseek($fh, 0);
    // Read one line from the file, then increment the number.
    // There should only ever be one line.
    $current = 1 + intval(trim(fgets($fh)));
    // Now we can reset the pointer again, and truncate the file to zero length.
    fseek($fh, 0);
    ftruncate($fh, 0);
    // Now we can write out our line.
    fwrite($fh, $current . "\n");
    // And we're done. Closing the file will also release the lock.
    fclose($fh);
} else {
    $current = trim(file_get_contents($filename4)) + 1;
    if ($current >= $reset) {
        $new = '0';
        fwrite(fopen($filename4, 'w'), $new);
    } else {
        fwrite(fopen($filec, 'w'), $current);
    }
}
echo $current;
I did not want to assume I know what changes to make to this code, so I am posting another question. EDIT: What changes should I make here so that I still get an exclusive lock on the file when $reset is not equal to 'default'? What is the correct way to code this? Would this work?:
$filename4 = "$some_variable/$filename3.txt";
// Open our file in append-or-create mode.
$fh = fopen($filename4, "a+");
if (!$fh)
    die("unable to create file");
// Before doing anything else, get an exclusive lock on the file.
// This will prevent anybody else from reading or writing to it.
flock($fh, LOCK_EX);
// Place the pointer at the start of the file.
fseek($fh, 0);
if ($reset == 'default') {
    // Read one line from the file, then increment the number.
    // There should only ever be one line.
    $current = 1 + intval(trim(fgets($fh)));
} else {
    // Read one line from the file, then increment the number.
    // There should only ever be one line.
    $current = 1 + intval(trim(fgets($fh)));
    if ($current >= $reset) {
        $current = '0';
    } else {
        // Read one line from the file, then increment the number.
        // There should only ever be one line.
        $current = 1 + intval(trim(fgets($fh)));
    }
}
// Now we can reset the pointer again, and truncate the file to zero length.
fseek($fh, 0);
ftruncate($fh, 0);
// Now we can write out our line.
fwrite($fh, $current . "\n");
// And we're done. Closing the file will also release the lock.
fclose($fh);
echo $current;
EDIT - This seems to be working for me:
$reset = "default";
$filename4 = "counter.txt";
// Open our file in append-or-create mode.
$fh = fopen($filename4, "a+");
if (!$fh)
die("unable to create file");
// Before doing anything else, get an exclusive lock on the file.
// This will prevent anybody else from reading or writing to it.
flock($fh, LOCK_EX);
// Place the pointer at the start of the file.
fseek($fh, 0);
// Read one line from the file, then increment the number.
// There should only ever be one line.
$current = 1 + intval(trim(fgets($fh)));
if ($reset == 'default'){
$new = $current;
} else {
if($current >= ($reset + '1')) {
$new = '1';
}
else {
$new = $current;
}
}
// Now we can reset the pointer again, and truncate the file to zero length.
fseek($fh, 0);
ftruncate($fh, 0);
// Now we can write out our line.
fwrite($fh, $new . "\n");
// And we're done. Closing the file will also release the lock.
fclose($fh);
echo $new;
Does this look right?
if ($current >= $reset) {
    // here is where you are setting the counter back to zero. comment out
    // these lines.
    //$new = '0';
    //fwrite(fopen($filename4, 'w'), $new);
}
If you simply want a counter that doesn't get reset, try:
$filename4 = "counter.txt";
// Open our file in append-or-create mode.
$fh = fopen($filename4, "a+");
if (!$fh)
die("unable to create file");
// Before doing anything else, get an exclusive lock on the file.
// This will prevent anybody else from reading or writing to it.
flock($fh, LOCK_EX);
// Place the pointer at the start of the file.
fseek($fh, 0);
// Read one line from the file to get current count.
// There should only ever be one line.
$current = intval(trim(fgets($fh)));
// Increment
$new = $current++;
// Now we can reset the pointer again, and truncate the file to zero length.
fseek($fh, 0);
ftruncate($fh, 0);
// Now we can write out our line.
fwrite($fh, $new . "\n");
// And we're done. Closing the file will also release the lock.
fclose($fh);
echo $new;
The best way I can see to do this would be to open the file for reading with a lock other than exclusive. You can then perform your required checks, and if the count exceeds the $reset value, close the file and open it again, this time with an exclusive lock for writing.
Another way would be simply not to use an exclusive lock.
You could also look into the very good flat-file classes out there, which have tested locking mechanisms.
file_put_contents with the LOCK_EX flag already handles the locking for the write. There is no need for ten lines of file locking code.
<?php
$fn = "$filename3.txt";
$reset = 0; // 0 is equivalent to "default"
//$reset = 10000000;
$count = file_get_contents($fn);
$count = ($reset && ($count >= $reset)) ? (0) : ($count + 1);
file_put_contents($fn, $count, LOCK_EX);
echo $count;
No idea if this is any help, since your question is still opaque. I will not answer comments.
