I have a few text files larger than 30MB.
How can I read such giant text files in PHP?
Unless you need to work with all of the data at the same moment, you can read the file in pieces. Example for binary files:
<?php
$handle = fopen("/foo/bar/somefile", "rb");
while (!feof($handle)) {
    // read the file in 8192-byte blocks
    $block = fread($handle, 8192);
    do_something_with_block($block);
}
fclose($handle);
?>
The above example might break multibyte encodings (if a multibyte character, e.g. วพ in UTF-8, straddles the 8192-byte boundary), so for files with meaningful line endings (e.g. text), try this instead:
<?php
$handle = fopen("/foo/bar/somefile", "rb");
while (!feof($handle)) {
    // read the file one line at a time
    $line = fgets($handle);
    do_something_with_line($line);
}
fclose($handle);
?>
You can open the file using fopen() and read it line by line using fgets().
$fh = fopen("file", "r"); // open the file for reading
while (!feof($fh)) {      // loop until no lines are left in the input file
    $buffer = fgets($fh); // read the input file line by line
    // ... process $buffer ...
}
fclose($fh);
Related
How can I have PHP return just some bytes of a file? Like, if I wanted to load bytes 7 through 15 into a string, without reading any other part of the file? It's important that I don't need to load the whole file into memory, as the file could be quite large.
You could use file_get_contents() with the offset and maxlen parameters.
$data = file_get_contents('somefile.txt', false, NULL, 6, 8); // start at offset 6 and read 8 bytes
Use fseek() and fread()
$fp = fopen('somefile.txt', 'r');
// move the file pointer to offset 7
fseek($fp, 7);
$data = fread($fp, 8); // read 8 bytes from that position
fclose($fp);
Using PEAR:
<?php
require_once 'File.php';
//read and output first 15 bytes of file myFile
echo File::read("/path/to/myFile", 15);
?>
Or:
<?php
// get contents of a file into a string
$filename = "/path/to/myFile";
$handle = fopen($filename, "r");
$contents = fread($handle, 15);
fclose($handle);
?>
With either method you can get at bytes 7-15 and do what you want with them. I don't think you can go after specific bytes without starting from the beginning of the file.
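For instance, a minimal sketch of that idea using the second approach (the path is a placeholder, and byte positions are taken as 1-based as in the question):
<?php
// read the first 15 bytes, then keep only bytes 7 through 15
$handle = fopen("/path/to/myFile", "r");
$contents = fread($handle, 15);   // bytes 1-15
fclose($handle);
$slice = substr($contents, 6);    // drop the first 6 bytes, leaving bytes 7-15
?>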
I would like to read a file up to X bytes, but the last line should not be cut off, as it is in my current code:
$file = fopen("test.txt", "r");
while (!feof($file)) {
    $contents = fread($file, 10000);
Right now, fread() reads until 10000 bytes are reached, then cuts the line off and starts a new file. The line is stored completely, but it ends up split across two files. I want to stop only at the end of a line.
Any solutions? Thanks!
I think I got it:
$file = fopen("test.txt", "r");
while (!feof($file)) {
    $contents = fread($file, $eachFileSize);
    $contents = $contents . fgets($file);
Can someone confirm this is the intended approach (e.g. what LawrenceCherone suggested)?
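For reference, here is a self-contained sketch of that idea (the 10000-byte chunk size and the part-file names are placeholders, not from the original code): read a chunk, then use fgets() to finish the current line before writing each part out.
<?php
$eachFileSize = 10000;                        // placeholder chunk size
$file = fopen("test.txt", "r");
$part = 1;
while (!feof($file)) {
    $contents = fread($file, $eachFileSize);  // read up to the chunk size
    $contents .= fgets($file);                // finish the line fread() may have cut
    file_put_contents("part" . $part . ".txt", $contents); // hypothetical output files
    $part++;
}
fclose($file);
?>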
I have a large file "file.txt"
I want to read one specific line from the file, change something and then write that line back into its place in the file.
Since it is a large file, I do not want to read the entire file during the reading or writing process; I only want to access that one line.
This is what I'm using to retrieve the desired line:
$myLine = 100;
$file = new SplFileObject('file.txt');
$file->seek($myLine - 1);      // seek() is zero-based, so line 100 is index 99
$oldline = $file->current();   // the line to change
$newline = str_replace('a', 'b', $oldline);
Now how do I write this $newline to replace the old line in the file?
You could use this function:
function injectData($file, $data, $position) {
    $temp = fopen('php://temp', "rw+");
    $fd = fopen($file, 'r+b');
    fseek($fd, $position);
    stream_copy_to_stream($fd, $temp); // copy the end of the file aside
    fseek($fd, $position);             // seek back
    fwrite($fd, $data);                // write the new data
    rewind($temp);
    stream_copy_to_stream($temp, $fd); // stitch the end on again
    fclose($temp);
    fclose($fd);
}
I got it from: PHP what is the best way to write data to middle of file without rewriting file
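Putting the two pieces together, a hedged, self-contained sketch of replacing one line in place. It splices the file manually rather than calling injectData(), because replacing a line means removing the old bytes as well as inserting new ones; the file name and line number are the ones from the question.
<?php
$myLine = 100;
$fd = fopen('file.txt', 'r+b');

// skip the lines before the one we want, remembering where it starts
for ($i = 1; $i < $myLine; $i++) {
    fgets($fd);
}
$start   = ftell($fd);             // byte offset where the target line begins
$oldline = fgets($fd);             // read (and move past) the old line
$newline = str_replace('a', 'b', $oldline);

// copy everything after the old line aside, rewrite from $start,
// then append the saved tail; ftruncate() discards the old content first
$temp = fopen('php://temp', 'w+b');
stream_copy_to_stream($fd, $temp);
ftruncate($fd, $start);
fseek($fd, $start);
fwrite($fd, $newline);
rewind($temp);
stream_copy_to_stream($temp, $fd);

fclose($temp);
fclose($fd);
?>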
I need to base64-encode a big file with PHP.
file() and file_get_contents() are not options, since they load the whole file into memory.
I got the idea to use this:
$handle = #fopen("/tmp/inputfile.txt", "r");
if ($handle) {
while (($buffer = fgets($handle, 4096)) !== false) {
echo $buffer;
}
fclose($handle);
}
SOURCE: Read and parse contents of very large file
This works well for reading, but is it possible to do it like this:
Read line -> base64 encode -> write back to file
And then repeat for each line in the file.
It would be nice if I could do it directly, without needing to write to a temporary file.
Base64 encodes 3 bytes of raw data into 4 bytes of 7-bit-safe text. If you feed it fewer than 3 bytes, padding will occur, and you can't have that happen in the middle of the string. However, as long as you read in multiples of 3 bytes, you're golden, so:
$base_unit = 4096;
$handle = @fopen("/tmp/inputfile.txt", "r");
if ($handle) {
    // read a multiple of 3 bytes so each chunk encodes without mid-stream padding
    while (($buffer = fread($handle, $base_unit * 3)) !== false && $buffer !== '') {
        echo base64_encode($buffer);
    }
    fclose($handle);
}
http://en.wikipedia.org/wiki/Base64#Examples
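If the encoded data needs to end up in a file rather than being echoed, a sketch along the same lines (the output path is a placeholder; it writes to a second file instead of modifying the input in place):
<?php
$in  = fopen("/tmp/inputfile.txt", "rb");
$out = fopen("/tmp/inputfile.b64", "wb");   // hypothetical output file
if ($in && $out) {
    while (!feof($in)) {
        // a multiple of 3 bytes per read, so no padding until the final chunk
        $buffer = fread($in, 4096 * 3);
        if ($buffer === false || $buffer === '') {
            break;
        }
        fwrite($out, base64_encode($buffer));
    }
    fclose($in);
    fclose($out);
}
?>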
I need to scan through a 30MB text file (a list of world cities). How can I access this file? I feel like file_get_contents() will give my server a stroke.
Just fopen() it and then read it line by line with fgets().
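A minimal sketch of that approach (the file name and the city being searched for are placeholders):
<?php
$handle = fopen("cities.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // do whatever scanning you need on each line, e.g. look for a city
        if (stripos($line, "London") !== false) {
            echo $line;
        }
    }
    fclose($handle);
}
?>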
Filesystem functions come in handy in this situation.
Example
$filename = "your_file_path";
// open the file for reading (use 'r+' to open it for reading and writing)
$fp = fopen($filename, 'r');
// output the entire file (note that this reads the whole file in one go)
echo fread($fp, filesize($filename));
// close the file
fclose($fp);
References
(some handy functions)
All Filesystem Functions
fopen() - open a file
fread() - read file content
fgets() - get a line
fwrite() - write content to a file
fseek() - change the file pointer's position
rewind() - rewind the file pointer to position 0
fclose() - close a file
...
<?php
$fh = @fopen("inputfile.txt", "r");
if ($fh) {
    while (($line = fgets($fh)) !== false) {
        echo $line;
        // do something with $line..
    }
    fclose($fh);
}
?>
More information and examples at http://pt.php.net/manual/en/function.fgets.php