I sense that I am almost there.
Here is a .txt file, about 60 KB in size and full of German words. Every word is on its own line.
I want to iterate through it with this code:
<?php
$file = "GermanWords.txt";
$f = fopen($file,"r");
$parts = explode("\n", $f);
foreach ($parts as &$v)
{
    echo $v;
}
?>
When I execute this code, I get: Resource id #2
The word "Resource" is not in the .txt file; I do not know where it comes from.
How can I display all the words in the file?
No need for fopen; just use file_get_contents:
$file = "GermanWords.txt";
$contents = file_get_contents($file);
$lines = explode("\n", $contents); // this is your array of words
foreach ($lines as $word) {
    echo $word;
}
fopen() just opens the file; it doesn't read it. In your code, $f contains a file handle, not the file contents. That is where the word "Resource" comes from: it's PHP's internal name for the file handle.
One answer would be to replace fopen() with file_get_contents(), which opens and reads the file in one action. That would solve the problem, but if the file is big, you probably don't want to read the whole thing into memory in one go.
So I would suggest using SplFileObject instead. The code would look like this:
<?php
$file = "GermanWords.txt";
$parts = new SplFileObject($file);
foreach ($parts as $line) {
    echo $line;
}
?>
It only reads one line into memory at a time, so you don't have to worry about the size of the file.
Hope that helps.
See the PHP manual for more info: http://php.net/manual/en/splfileobject.construct.php
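A side note: each line SplFileObject yields keeps its trailing newline. If you just want the bare words, the class can drop it for you; a small sketch using the SplFileObject::DROP_NEW_LINE flag:
$parts = new SplFileObject("GermanWords.txt");
$parts->setFlags(SplFileObject::DROP_NEW_LINE); // strip the newline from each line
foreach ($parts as $line) {
    echo $line;
}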
$f, the result of fopen is a resource, not the contents of the file. If you just want an array of the lines contained in the file, you can use file:
$parts = file('GermanWords.txt');
foreach ($parts as $v) {
    echo $v;
}
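Note that file() keeps the newline at the end of each array element. If you want the bare words, you can pass flags; a small sketch:
$parts = file('GermanWords.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);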
Alternatively, if you want to stick with fopen you can use fread to read the content:
$f = fopen('GermanWords.txt', 'r');
// read the entire file into $contents
$contents = fread($f, filesize('GermanWords.txt'));
fclose($f);
$parts = explode("\n", $contents);
The SplFileObject class provides a way to do that:
$file = new SplFileObject("file.txt");
while (!$file->eof()) {
    echo $file->fgets();
}
And if you prefer the foreach loop, you can create a generator function for that:
function lines($filename) {
    $file = new SplFileObject($filename);
    while (!$file->eof()) {
        yield $file->fgets();
    }
}
foreach (lines('German.txt') as $line) {
    echo $line;
}
Reading the entire content of the file (with file_get_contents) before processing it can consume a lot of memory.
If you want to process a file line by line, this class might help you.
It implements an Iterator (see the PHP documentation on iterators) that can be walked through in a foreach loop. Only the last line read is stored in memory.
class TxtFileIterator implements \Iterator {
    protected $fileHandler;
    protected $key;
    protected $current;
    protected $fileName;

    function __construct($fileName) {
        $this->fileHandler = fopen($fileName, "r") or die("Unable to open file!");
        $this->fileName = $fileName;
        $this->key = 0;
        $this->current = fgets($this->fileHandler); // read the first line up front
    }

    function __destruct() {
        fclose($this->fileHandler);
    }

    // Iterator interface
    public function current() {
        return $this->current;
    }

    public function key() {
        return $this->key;
    }

    public function next() {
        $this->current = fgets($this->fileHandler);
        $this->key++;
    }

    public function rewind() {
        // reopen the file and read the first line again
        $this->__destruct();
        $this->__construct($this->fileName);
    }

    public function valid() {
        // fgets returns false at end of file
        return $this->current !== false;
    }
}
Usage:
$iterator = new TxtFileIterator("German.txt");
foreach ($iterator as $line) {
    echo $line; // or do whatever you want with the line
}
I'm trying to make my PHP script open more than one text document and read them.
My current script is as follows:
<?php
//$searchthis = "ignore this";
$matches = array();
$FileW = fopen('result.txt', 'w');
$handle = @fopen("textfile1.txt", "r");
ini_set('memory_limit', '-1');
if ($handle)
{
    while (!feof($handle))
    {
        $buffer = fgets($handle);
        if (stripos($buffer, $_POST["search"]) !== FALSE)
            $matches[] = $buffer;
    }
    fwrite($FileW, print_r($matches, TRUE));
    fclose($handle);
}
?>
I'm trying to fopen a bunch of files, maybe 8 of them or fewer.
How would I open and read all of these files?
Any help is GREATLY appreciated!
Program defensively: check the returns from functions to ensure you are not making incorrect assumptions about your code.
There is a function in PHP to read the file and buffer it; see the PHP manual.
I don't know why you would want to open a lot of files; it will surely use a lot of memory. Anyway, you could use the file_get_contents function with a foreach:
$files = array("textfile1.txt", "textfile2.txt", "textfile3.txt");
$data = "";
foreach ($files as $file) {
    $data .= @file_get_contents($file);
}
echo $data;
There is a function in PHP called file() which reads an entire file into an array.
<?php
// The "file" function creates an array with each line of the file as one value
$fileOne = file('fileOne.txt');
$fileTwo = file('fileTwo.txt');
// Print an array or do all array magic with $fileOne and $fileTwo
foreach ($fileOne as $fo) {
    echo $fo;
}
foreach ($fileTwo as $ft) {
    echo $ft;
}
?>
Read more about the file() function in the PHP manual.
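Since the original goal was to search several files for a term, here is a minimal sketch combining the ideas above. It assumes the same result.txt output and $_POST['search'] input as the question, and a hypothetical file-name pattern matched with glob():
$matches = array();
foreach (glob('textfile*.txt') as $file) { // hypothetical pattern; adjust to your file names
    $handle = fopen($file, 'r');
    if (!$handle) {
        continue; // skip files that cannot be opened
    }
    while (($buffer = fgets($handle)) !== false) {
        if (stripos($buffer, $_POST['search']) !== false) {
            $matches[] = $buffer;
        }
    }
    fclose($handle);
}
file_put_contents('result.txt', print_r($matches, true));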
I am reading a file containing around 50k lines using the file() function in PHP. However, it's giving an out-of-memory error, since the contents of the file are stored in memory as an array. Is there any other way?
Also, the lengths of the stored lines are variable.
Here's the code. Also, the file is 700 kB, not MB.
private static function readScoreFile($scoreFile)
{
    $file = file($scoreFile);
    $relations = array();
    for ($i = 1; $i < count($file); $i++)
    {
        $relation = explode("\t", trim($file[$i]));
        $relation = array(
            'pwId_1' => $relation[0],
            'pwId_2' => $relation[1],
            'score' => $relation[2],
        );
        if ($relation['score'] > 0)
        {
            $relations[] = $relation;
        }
    }
    unset($file);
    return $relations;
}
Use fopen, fread and fclose to read a file sequentially:
$handle = fopen($filename, 'r');
if ($handle) {
    while (!feof($handle)) {
        echo fread($handle, 8192);
    }
    fclose($handle);
}
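Applied to readScoreFile() from the question, a line-by-line sketch that builds the same $relations array without holding the raw file in memory (it skips the first line, just as the original loop starting at $i = 1 does):
private static function readScoreFile($scoreFile)
{
    $relations = array();
    $handle = fopen($scoreFile, 'r');
    fgets($handle); // skip the first line, like the original $i = 1 start
    while (($line = fgets($handle)) !== false) {
        $relation = explode("\t", trim($line));
        if ($relation[2] > 0) {
            $relations[] = array(
                'pwId_1' => $relation[0],
                'pwId_2' => $relation[1],
                'score'  => $relation[2],
            );
        }
    }
    fclose($handle);
    return $relations;
}
Note that $relations itself can still grow large; the savings come from not keeping the raw file contents in memory at the same time.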
EDIT after the update of the question and the comments on fabjoa's answer:
There is definitely something fishy if a 700 kB file eats up 140 MB of memory with the code you gave (you could unset $relation at the end of each iteration, though). Consider using a debugger to step through it and see what happens. You might also want to consider rewriting the code to use SplFileObject's CSV functions (or their procedural cousins).
SplFileObject::setCsvControl example
$file = new SplFileObject("data.csv");
$file->setFlags(SplFileObject::READ_CSV);
$file->setCsvControl('|');
foreach ($file as $row) {
list ($fruit, $quantity) = $row;
// Do something with values
}
For an OOP approach to iterate over the file, try SplFileObject:
SplFileObject::fgets example
$file = new SplFileObject("file.txt");
while (!$file->eof()) {
    echo $file->fgets();
}
SplFileObject::next example
// Read through file line by line
$file = new SplFileObject("misc.txt");
while (!$file->eof()) {
    echo $file->current();
    $file->next();
}
or even
foreach (new SplFileObject("misc.txt") as $line) {
    echo $line;
}
Pretty much related (if not duplicate):
How to save memory when reading a file in Php?
If you don't know the maximum line length and you are not comfortable using a magic number for it, then you'll need to do an initial scan of the file to determine the maximum line length.
Other than that the following code should help you out:
// length is a large number or calculated from an initial file scan
while (!feof($handle)) {
    $buffer = fgets($handle, $length);
    echo $buffer;
}
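If you do want that initial scan, a minimal sketch that computes $length first (assuming $filename holds the file path):
$handle = fopen($filename, 'r');
$length = 0;
while (($line = fgets($handle)) !== false) {
    $length = max($length, strlen($line));
}
$length++;       // fgets($handle, $length) reads at most $length - 1 bytes
rewind($handle); // back to the start for the real read loop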
Old question, but since I haven't seen anyone mentioning it: PHP generators are a great way to reduce memory consumption.
For example:
function read($fileName)
{
    $fileHandler = fopen($fileName, 'rb');
    while (($line = fgets($fileHandler)) !== false) {
        yield rtrim($line, "\r\n");
    }
    fclose($fileHandler);
}
foreach (read(__DIR__ . '/filenameHere') as $line) {
    echo $line;
}
Allocate more memory during the operation, with something like ini_set('memory_limit', '16M'). Don't forget to go back to the initial memory allocation once the operation is done.
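A small sketch of that save-and-restore pattern, using ini_get() to remember the original value:
$originalLimit = ini_get('memory_limit');
ini_set('memory_limit', '16M');
// ... read and process the file here ...
ini_set('memory_limit', $originalLimit); // go back to the initial allocation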
I am using fopen to open my PHP file:
$readFd = #fopen($file, 'r+');
I would like to search this file for the function call parent::process(); and, if it exists, insert a new function call after it.
I have tried using preg_replace, but it does not seem to match parent::process();
For example, the result I need is this:
public function process() {
    parent::process();
    $this->newFunction();
}
Then to write to the file I am using:
fwrite($readFd, $content);
I guess I must be missing something important with regex.
Hopefully someone can point me in the right direction.
I would use the PHP function fgets to read every line in the file one by one until you reach the line you need. Then your pointer will be positioned after that line, where you can write your own line.
EDIT
I was wrong: when you write something to a file at a specific point, everything after that point gets overwritten. So I did a little testing and came up with this:
$handle = fopen($file, "r+");
$lines = array();
while (($line = fgets($handle)) !== false) {
    $lines[] = $line;
    if (strpos($line, 'parent::process()') !== false) {
        $lines[] = '$this->newFunction();' . "\n"; // insert the new call on its own line
    }
}
fseek($handle, 0); // reset pointer to the start
foreach ($lines as $line) {
    fwrite($handle, $line);
}
fclose($handle);
I hope this solves your problem.
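As a side note on the preg_replace attempt from the question: parent::process(); contains parentheses, which are regex metacharacters, and an unescaped pattern is one plausible reason the match never succeeded. A minimal sketch that escapes the literal with preg_quote() before matching (the inserted newFunction() call is the one from the question):
$pattern = '/' . preg_quote('parent::process();', '/') . '/';
$replacement = "parent::process();\n\t\t\$this->newFunction();";
$content = preg_replace($pattern, $replacement, $content, 1); // first match only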
I came up with a solution myself; however, your code seems much shorter, so I will try your solution tomorrow.
if (!($readFd = @fopen($file, "r+")))
    return FALSE;
$buffer = fread($readFd, 120000);
fclose($readFd);
$onDuplicate = FALSE;
$lines = explode("\n", $buffer);
foreach ($lines as $key => $line) {
    if (strpos($line, "newFunction()") !== FALSE) {
        $onDuplicate = TRUE;
    }
    if (strpos($line, "parent::process()") !== FALSE) {
        $lines[$key] = "\t\tparent::process();\n\t\t//\$this->newFunction();";
    }
}
if (!$onDuplicate) {
    $readFd = fopen($file, "w");
    $buffer = implode("\n", $lines) . "\n";
    fwrite($readFd, $buffer);
    fclose($readFd);
} else {
    var_dump('changes are already applied');
}
Thanks for all your help!
The code below splits my file every 10 lines, but I want it to split every time
</byebye>
occurs. That way, I will get multiple files, each containing:
<byebye>
*stuff here*
</byebye>
Code:
<?php
/**
 *
 * Split large files into smaller ones
 * @param string $source Source file
 * @param string $targetpath Target directory for saving files
 * @param int $lines Number of lines to split
 * @return void
 */
function split_file($source, $targetpath = 'files/', $lines = 10) {
    $i = 0;
    $j = 1;
    $date = date("m-d-y");
    $buffer = '';
    $handle = @fopen($source, "r");
    while (!feof($handle)) {
        $buffer .= @fgets($handle, 4096);
        $i++;
        if ($i >= $lines) {
            $fname = $targetpath . ".part_" . $date . $j . ".xml";
            if (!$fhandle = @fopen($fname, 'w')) {
                echo "Cannot open file ($fname)";
                exit;
            }
            if (!@fwrite($fhandle, $buffer)) {
                echo "Cannot write to file ($fname)";
                exit;
            }
            fclose($fhandle);
            $j++;
            $buffer = '';
            $i = 0;
            $lines += 10; // add 10 to $lines after each iteration. Modify this line as required
        }
    }
    fclose($handle);
}
split_file('testxml.xml');
?>
Any ideas?
If I understand you right, this should do it.
$content = file_get_contents($source);
$parts = explode('</byebye>', $content);
$parts = array_map('trim', $parts);
Then just write the parts to the different files
$dateString = date('m-d-y');
foreach ($parts as $index => $part) {
    file_put_contents("{$targetpath}part_{$dateString}{$index}.xml", $part);
}
But I assume (without knowing your source) that this will result in invalid XML. You should use one of the XML parsers (SimpleXML, DOM, ...) to handle XML files.
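For illustration, a minimal sketch of the parser route using SimpleXML, assuming the source has a single root element wrapping the <byebye> entries:
$xml = simplexml_load_file($source);
$dateString = date('m-d-y');
$i = 0;
foreach ($xml->byebye as $node) {
    // asXML() returns the <byebye> element including its tags
    file_put_contents("{$targetpath}part_{$dateString}{$i}.xml", $node->asXML());
    $i++;
}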
Side note: you use @ much too much.
If you are worried about memory usage, you can switch to a file resource and use fread or fgets to control the amount of memory you consume.
$f = fopen($source, "r");
$out = '';
while (!feof($f))
{
    $out .= fgets($f);
    $arr = explode('</byebye>', $out);
    if (count($arr) == 1)
        continue; // no complete </byebye> chunk buffered yet
    // every entry except the last is a complete chunk;
    // the last entry is the leftover after the final </byebye>
    $out = array_pop($arr);
    foreach ($arr as $chunk)
    {
        // file_put_contents here, writing $chunk . '</byebye>'
        // to the next part file, as necessary
    }
}
fclose($f);
You can also save more memory by opening the file for output and, as you parse, piping the contents to it. When you encounter a closing </byebye> tag, you would close the current file and open the next one.
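A minimal sketch of that streaming approach, assuming each </byebye> appears at the end of its own line:
$in = fopen($source, 'r');
$part = 1;
$outFile = fopen($targetpath . 'part' . $part . '.xml', 'w');
while (($line = fgets($in)) !== false) {
    fwrite($outFile, $line);
    if (strpos($line, '</byebye>') !== false) {
        // chunk complete: close this part and start the next one
        fclose($outFile);
        $part++;
        $outFile = fopen($targetpath . 'part' . $part . '.xml', 'w');
    }
}
fclose($outFile);
fclose($in);
Note that the last file may end up empty if the input ends with </byebye>.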