I am developing a log file viewer in PHP that should read 10 lines at a time from a large file (say 2 GB). When the user clicks Next, the next 10 lines should be read,
and when the Back button is pressed, the previous 10 lines should be printed.
So far I have implemented the file read using fgets (due to the size of the file), and I am trying to figure out how to seek to the next 10 and previous 10 lines.
if ($handle) {
    $cnt = 1;
    while (($buffer = fgets($handle)) !== false && $cnt <= 10) {
        echo $buffer;
        $cnt++;
    }
    if ($buffer === false && !feof($handle)) {
        echo "error"; // fgets() failed before reaching end of file
    }
}
The SplFileObject class in PHP does what you want to do. See:
http://php.net/manual/en/splfileobject.seek.php
Example code:
<?php
// Set $lineNumber to the line that you want to start at
// Remember that the first line in the file is line 0
$lineNumber = 43;
// This sets how many lines you want to grab
$lineCount = 10;
// Open the file
$file = new SplFileObject("logfile.log");
// This seeks to the line that you want to start at
$file->seek($lineNumber);
// Stop early at end of file so we don't echo past the last line
for ($currentLine = 0; $currentLine < $lineCount && !$file->eof(); $currentLine++) {
    echo $file->current();
    $file->next();
}
?>
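For Next/Back paging, the only missing piece is turning a page number into the `seek()` offset. A minimal sketch, where the `$page` parameter and the helper name are assumptions (they are not in the original post):

```php
<?php
// Hypothetical helper: map a 1-based page number (e.g. from ?page=N)
// to the 0-based line number that SplFileObject::seek() expects.
function pageToLineNumber(int $page, int $perPage = 10): int {
    $page = max(1, $page);          // clamp so "Back" from page 1 stays put
    return ($page - 1) * $perPage;  // SplFileObject lines are 0-based
}

// page 1 starts at line 0, page 2 at line 10, and so on
$lineNumber = pageToLineNumber(3); // start of page 3
```

Next is then `$page + 1` and Back is `$page - 1`; the clamp keeps Back from going negative.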
The goal is to read a range of rows/lines from a large CSV file into a JSON array, in order to handle large files and read the data in a paginated way: each page fetches a range of lines (e.g. page 1 fetches lines 1 to 10, page 2 fetches lines 11 to 20, and so on).
The PHP script below reads the CSV file from the beginning up to the desired line ($desired_line). My question is how to make it start reading from a specific line ($starting_line).
<?php
// php function to convert csv to json format
function csvToJson($fname, $starting_line, $desired_line) {
    // open csv file
    if (!($fp = fopen($fname, 'r'))) {
        die("Can't open file...");
    }
    // read csv headers (the length argument must be an int, not a string)
    $key = fgetcsv($fp, 1024, "\t");
    $line_counter = 0;
    // parse csv rows into array
    // note: $starting_line is not used yet -- that is the question
    $json = array();
    while (($row = fgetcsv($fp, 1024, "\t")) && ($line_counter < $desired_line)) {
        $json[] = array_combine($key, $row);
        $line_counter++;
    }
    // release file handle
    fclose($fp);
    // encode array to json
    return json_encode($json);
}

// Define the path to CSV file
$csv = 'file.csv';
print_r(csvToJson($csv, 20, 30));
?>
You should use functions like:
fgets() to read the file line by line
fseek() to move to the position of the last fgets() of the chunk
ftell() to read the position for fseek()
Something like this (it's only a schema):
<?php
// ...
$line_counter = 0;
$last_pos = ...; // position saved from the previous reading cycle
fseek($fp, $last_pos);
while ($line = fgets($fp)) { // read a line of the file
    $line_counter++;
    // (...) parse line of csv here
    if ($line_counter == 100) {
        $last_pos = ftell($fp);
        // (...) save $last_pos for the next reading cycle
        break;
    }
}
// ...
?>
You can also skip the fseek() and ftell() part and just count the lines from the beginning every time, but then each request generally has to read through the whole file from the start up to the desired lines.
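The schema above can be turned into a runnable sketch. The function name, file name, and page size here are illustrative assumptions; the idea is exactly the fgets()/ftell()/fseek() cycle described:

```php
<?php
// Read one "page" of lines starting at a saved byte offset, and return
// the offset where the next page begins (to be saved between requests).
function readPage(string $fname, int $offset, int $linesPerPage = 10): array {
    $fp = fopen($fname, 'r');
    fseek($fp, $offset);              // jump to where the last page ended
    $lines = [];
    while (count($lines) < $linesPerPage && ($line = fgets($fp)) !== false) {
        $lines[] = rtrim($line, "\n");
    }
    $nextOffset = ftell($fp);         // save this for the next reading cycle
    fclose($fp);
    return [$lines, $nextOffset];
}

// demo with a small generated file of 25 numbered lines
$tmp = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($tmp, implode("\n", range(1, 25)) . "\n");
[$page1, $off] = readPage($tmp, 0, 10);    // lines 1..10
[$page2, $off] = readPage($tmp, $off, 10); // lines 11..20
```

Because the offset is in bytes, each page starts with a single fseek() instead of re-reading everything before it.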
I have a list.txt file whose contents I read with fgets. I then echo each line of list.txt in a while loop until fgets reaches the end of the file. Now, I want to delete each line from list.txt after it has been echoed.
I've tried putting the lines in another file (list2.txt) and then using file_put_contents, but I've been unsuccessful with that and with a few other things I've thought of to try. I can't help but feel like I'm overthinking it.
$list = fopen("list.txt", "r");
while (!feof($list)) {
    try {
        $lines = fgets($list);
        echo "$lines \n";
        // I don't need to delete the lines here
    } catch (Exception $e) {
        echo "Error \n";
        // I want to delete the lines here
        exit;
    }
}
fclose($list);
// I want to delete the lines here
I was finally able to do this using file() and a for loop, but couldn't with fgets and a while loop. Barmar's comment above pointed me in the right direction, but implode wasn't cutting it for me.
$list = file('list.txt'); // reads list.txt into an array
$listcount = count($list); // counts the number of lines/array elements
for ($x = 0; $x < $listcount; $x++) { // note: < not <=, arrays are 0-indexed
    echo "\n $list[$x]\n";
    unset($list[$x]); // deletes the line that has just been echoed
    file_put_contents("list.txt", $list); // writes the remaining array
                                          // elements back into list.txt
}
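The "second file" idea from the question also works with fgets and a while loop: stream the input, skip writing the echoed lines to a temp copy, then swap the copy in. A sketch, shown as a hypothetical function that removes the first $n lines (all names here are illustrative):

```php
<?php
// Echo the first $n lines of $fname and remove them from the file,
// by streaming the rest into a temp file and renaming it over the original.
function echoAndRemoveLines(string $fname, int $n): array {
    $in  = fopen($fname, 'r');
    $tmp = $fname . '.tmp';
    $out = fopen($tmp, 'w');
    $echoed = [];
    $count = 0;
    while (($line = fgets($in)) !== false) {
        if ($count < $n) {
            $echoed[] = rtrim($line, "\n"); // the "echo" side: collect/print it
            $count++;
        } else {
            fwrite($out, $line);            // keep every remaining line
        }
    }
    fclose($in);
    fclose($out);
    rename($tmp, $fname);                   // swap the trimmed copy in
    return $echoed;
}

// demo on a throwaway file
$f = tempnam(sys_get_temp_dir(), 'lst');
file_put_contents($f, "a\nb\nc\nd\n");
$gone = echoAndRemoveLines($f, 2); // removes "a" and "b"
```

Unlike rewriting the whole array on every iteration, this writes the file exactly once, which matters as the list grows.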
I am reading a file and collecting the matching line whenever it contains a searched string. There are a bunch of strings to be searched, stored in an array. I don't want to open the file every time I loop through the array; instead I want to go back to the first line of the file and start searching again. The file contains around 15k lines. If I open the file every time (inside the loop) it works fine, but if I open the file outside the loop, only the lines matching the first string are returned.
$scheme_code =
    array("106212","112422","114239","104685","100122","118191","131666");
foreach ($scheme_code as $searchthis) {
    $handle = @fopen("myfile", "r");
    // DON'T WANT TO DO THE ABOVE LINE FOR EVERY ITERATION
    if ($handle) {
        //echo "handle open"."<br>";
        while (!feof($handle)) {
            $buffer = fgets($handle, 4096);
            if (strpos($buffer, $searchthis) !== FALSE) {
                $matches[] = $buffer;
            }
        }
    }
}
But want to do something like this
$handle = #fopen("Myfile", "r");
foreach(){
// inside foreach
//go to the first line of the file
}
fclose($handle);
EDIT - I tried rewind(). I got the notice "rewind(): stream does not support seeking"
Here you can use the file() function, which gives you the complete array of lines; after that you can match line by line without consuming an IO resource every time with fopen.
<?php
$linesArray = file("/path/to/your/file.txt");
foreach($linesArray as $line){
// do the stuff or matching you want to perform line by line on $line
}
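If loading a 15k-line file into memory is acceptable, file() works; but the rewind problem can also be avoided entirely by reading the file once and testing each line against every search string in an inner loop. A single-pass sketch (function and demo names are made up):

```php
<?php
// Scan the file once; for each line, check every code from the array.
// This needs no rewind(), so it also works on non-seekable streams.
function findMatches(string $fname, array $codes): array {
    $matches = [];
    $fp = fopen($fname, 'r');
    while (($buffer = fgets($fp)) !== false) {
        foreach ($codes as $code) {
            if (strpos($buffer, $code) !== false) {
                $matches[] = rtrim($buffer, "\n");
                break; // line already matched; no need to test the other codes
            }
        }
    }
    fclose($fp);
    return $matches;
}

// demo with a small generated file
$tmp = tempnam(sys_get_temp_dir(), 'sch');
file_put_contents($tmp, "fund 106212 open\nfund 999999 closed\nfund 131666 open\n");
$found = findMatches($tmp, ["106212", "131666"]);
```

This is one pass over 15k lines instead of one pass per search string, so it is also faster than re-reading the file for each code.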
I wrote some code below; at the moment I'm testing, so there are no database queries in the code.
In the code below, where it says if(filesize($filename) != 0), it always goes to else, even though the file is not 0 bytes and has 16 bytes of data in it. I am getting nowhere; it just always seems to think the file is 0 bytes.
I think it's easier to show my code (there could be other errors in there, but I'm checking each error as I go along, dealing with them one by one). I get no PHP errors or anything.
$filename = 'memberlist.txt';
$file_directory = dirname($filename);
$fopen = fopen($filename, 'w+');

// check if file exists and is writable
if (file_exists($filename) && is_writable($file_directory)) {
    // clear statcache else filesize could be incorrect
    clearstatcache();
    // for testing, shows 0 bytes even though file is 16 bytes
    // file has inside without quotes: '1487071595 ; 582'
    echo "The file size is actually ".filesize($filename)." bytes.\n";
    // check if file contains any data, also tried !==
    // always goes to else even though not 0 bytes in size
    if (filesize($filename) != 0) {
        // read file into an array
        $fread = file($filename);
        // get current time
        $current_time = time();
        foreach ($fread as $read) {
            $var = explode(';', $read);
            $oldtime = $var[0];
            $member_count = $var[1];
        }
        if ($current_time - $oldtime >= 86400) {
            // 24 hours or more so we query db and write new member count to file
            echo 'more than 24 hours has passed'; // for testing
        } else {
            // less than 24 hours so don't query db just read member count from file
            echo 'less than 24 hours has passed'; // for testing
        }
    } else { // WE ALWAYS END UP HERE
        // else file is empty so we add data
        $current_time = time().' ; ';
        $member_count = 582; // this value will come from a database
        fwrite($fopen, $current_time.$member_count);
        fclose($fopen);
        //echo "The file is empty so write new data to file. File size is actually ".filesize($filename)." bytes.\n";
    }
} else {
    // file either does not exist or can't be written to
    echo 'file does not exist or is not writeable'; // for testing
}
Basically the code will be on a memberlist page which currently retrieves all members and counts how many members are registered. The point of the script is: if less than 24 hours have passed, we read member_count from the file; if 24 hours or more have elapsed, we query the database, get the member count, and write the new figure to the file. It's to reduce queries on the memberlist page.
Update 1:
This code:
echo "The file size is actually ".filesize($filename)." bytes.\n";
always outputs the below even though it's not 0 bytes.
The file size is actually 0 bytes.
also tried
var_dump (filesize($filename));
Outputs:
int(0)
You are using:
fopen($filename, "w+")
According to the manual w+ means:
Open for reading and writing; place the file pointer at the beginning of the file and truncate the file to zero length. If the file does not exist, attempt to create it.
So the file size being 0 is correct.
You probably need r+
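A small demonstration of the difference, using a throwaway temp file in place of memberlist.txt: "w+" truncates on open, "r+" does not.

```php
<?php
// Write the same 16-byte payload the question describes, then compare modes.
$f = tempnam(sys_get_temp_dir(), 'mem');
file_put_contents($f, "1487071595 ; 582"); // 16 bytes

$fp = fopen($f, 'r+');            // read/write, pointer at start, NO truncation
clearstatcache(true, $f);         // make sure filesize() is not stale
$sizeAfterRplus = filesize($f);   // still 16
fclose($fp);

$fp = fopen($f, 'w+');            // truncates the file to zero length on open
fclose($fp);
clearstatcache(true, $f);
$sizeAfterWplus = filesize($f);   // now 0
```

So in the original code, the file is emptied by the fopen() call at the top, before filesize() is ever checked; opening with 'r+' (or delaying the fopen() until you actually need to write) fixes it.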
Sorry, I know this question is closed, but I am writing my own answer so it might be useful for someone else.
If you use c+ in the fopen function,
fopen($filePath, "c+");
then the filesize() function returns the size of the file,
and you can use clearstatcache(true, $filePath) to clear the stat cache for this file.
Notice: when we use c+ in fopen(), the existing file content is preserved (no truncation) and the file pointer is placed at the beginning, so a plain fwrite() overwrites from the start of the file unless you fseek() to the end first.
I have a script I'm writing. Here is what's happening: there is a while loop, and in the while loop is a variable X that stays constant. How do I make X change to line one, line two, etc. of a .txt file for each cycle of the while loop? Everything is in root. Thanks.
$f = fopen("some.txt", "r");
while (!feof($f) && $some_condition) {
$x = fgets($f);
// do something
}
fclose($f);
Would this be sufficient?
Here is the pseudo code captain Kirk:
//we assume current working directory is root
fileHandle = openFile("Read","some.txt");
X = pull("X",fileHandle);
while( X is constant )
{
XFactor = factor(X);
}
I can refine and improve this with more details about what universe you are from, the programming language you intend to use, and more specifics about what you want to happen.
//get the lines of the file into an array
$file_array = file($file_name);
//go through the array line by line
foreach ($file_array as $line_number => $line)
{
//you didn't tell us what you are doing with each line
//so you will need to change this to your liking
$X = $line; // Handle the line
}
Edit: Note that for very large files this may not be a good approach, because it loads the entire file into memory at once.
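For those large files, SplFileObject offers a middle ground: the same foreach style as the file() answer, but only one line is held in memory at a time. A sketch (file name and variable names are illustrative):

```php
<?php
// Stream a file line by line with SplFileObject; $X takes the value of
// each successive line, without loading the whole file into memory.
$tmp = tempnam(sys_get_temp_dir(), 'big');
file_put_contents($tmp, "first\nsecond\nthird\n");

$lines = [];
$file = new SplFileObject($tmp);
$file->setFlags(SplFileObject::DROP_NEW_LINE); // strip the trailing "\n"
foreach ($file as $lineNumber => $X) {
    if ($X === false || $X === '') {
        continue;                  // skip the empty entry after the last newline
    }
    $lines[] = $X;                 // $X changes on each cycle of the loop
}
```

The foreach body is where "do something with X" goes, just as in the file() version above.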