I have a PHP loop whereby I read and process the contents of several files.
<?php
foreach ($files as $file) {
    $f = fopen($file, 'r');
    $content = fread($f, filesize($file));
    import($content);
    fclose($f);
}
?>
However, after several iterations of the loop, I still get a memory-exhausted error on the fread() line. My understanding was that fclose() would free up the memory of the resource, but is this not the case?
I have read a number of other questions on the matter and they all referenced using fclose(), which I was already doing. I'm also dumping out memory_get_usage()/memory_get_peak_usage() and they just keep going up until it fails.
Am I misunderstanding how fclose() handles freeing up memory?
<?php
foreach ($files as $file) {
    $f = fopen($file, 'r');
    $content = fread($f, filesize($file));
    import($content);
    fclose($f);      // closes the file handle
    unset($content); // frees the memory holding the file's contents
}
?>
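If import() can accept partial content (an assumption about the questioner's code; process_chunk() below is a hypothetical stand-in for it), you can avoid the problem entirely by streaming each file in fixed-size chunks, so peak memory is bounded by the chunk size rather than the file size. A minimal sketch:

```php
<?php
// Hypothetical stand-in for the import() call in the question,
// assuming it can consume the file piece by piece.
function process_chunk(string $chunk): void {
    // ... handle one piece of the file ...
}

$file = tempnam(sys_get_temp_dir(), 'demo');       // demo input file
file_put_contents($file, str_repeat('x', 20000));  // 20 kB of data

$f = fopen($file, 'rb');
$total = 0;
while (!feof($f)) {
    $chunk = fread($f, 8192);   // read at most 8 kB per pass
    process_chunk($chunk);
    $total += strlen($chunk);
}
fclose($f);
unlink($file);
echo $total, "\n"; // 20000 bytes processed, ~8 kB resident at a time
```

With this shape, memory_get_peak_usage() stays roughly flat no matter how large the input files are.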
Related
Can we access some of the system's heap memory with PHP, as in C/C++? I keep getting Fatal error: Allowed memory size of 134217728 bytes exhausted while trying to do a large file operation in PHP.
I know we can tweak this limit in the Apache2/PHP config. But for processing large files of unknown size, can we have some kind of access to the heap to process and save the file? And if so, is there a mechanism to clear the memory after use?
Sample code
<?php
$filename = "a.csv";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
echo $contents;
fclose($handle);
?>
Here, a.csv is an 80 MB file. Can there be a heap operation using some sort of pointer?
Have you tried reading the file in chunks, e.g.:
<?php
$chunk = 256 * 256; // chunk size in bytes; tune to your liking
$filename = "a.csv";
$handle = fopen($filename, 'rb');
while (!feof($handle)) {
    $data = fread($handle, $chunk);
    echo $data;
    ob_flush();
    flush();
}
fclose($handle);
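If, as in the sample code, the only goal is to echo the file to the client, readfile() does the chunked streaming for you: it copies the file straight to the output buffer without ever holding the whole thing in PHP's memory. A small self-contained sketch (a temp file stands in for a.csv):

```php
<?php
// readfile() streams the file to output and returns the number of
// bytes read; PHP's memory use stays small regardless of file size.
$filename = tempnam(sys_get_temp_dir(), 'csv'); // stand-in for a.csv
file_put_contents($filename, "id,city\n1,Oslo\n");

$bytes = readfile($filename); // echoes the file's contents
unlink($filename);
```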
I have a script processing large text files.
I am limited however by the size of the files.
I've done some searching on this forum, and I've come to the conclusion that I must process the file line by line. However, this raises quite a few issues for me, as I need some detailed info from the file before I can start processing it.
I've tried adding each line to a variable as below:
$content = "";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 256);
        // Process buffer here..
        $content .= $buffer;
    }
    fclose($handle);
}
And just as expected, it did not work.
Could anyone help me out ?
Add this at the top of your file:
ini_set("memory_limit", -1);
Be careful: with this, the page can use all the RAM on the server.
$content = "";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
    while (($buffer = fgets($handle, 4096)) !== false) {
        // Process buffer here..
        $content .= $buffer;
    }
    fclose($handle);
}
OR
$content = "";
$target_File = "/tmp/uploadfile.txt";
$handle = fopen($target_File, "r") or die("Couldn't get handle");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        // Process buffer here..
        //$content .= $buffer;
    }
    fclose($handle);
}
NOTE:
If the problem is caused by hitting the memory limit, you can try setting it to a higher value (this may or may not work, depending on PHP's configuration).
This sets the memory limit to 32 MB:
ini_set("memory_limit", "32M");
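The fgets() loops above can also be written with SplFileObject, which gives the same line-by-line behavior with less boilerplate; only one line is held in memory at a time. A minimal sketch (a temp file stands in for $target_File):

```php
<?php
// SplFileObject iterates a file line by line without loading it all.
$path = tempnam(sys_get_temp_dir(), 'txt'); // stand-in for $target_File
file_put_contents($path, "line one\nline two\nline three\n");

$file = new SplFileObject($path, 'r');
// DROP_NEW_LINE strips trailing newlines; SKIP_EMPTY (which requires
// READ_AHEAD) skips blank lines, so the count below is exact.
$file->setFlags(
    SplFileObject::READ_AHEAD |
    SplFileObject::DROP_NEW_LINE |
    SplFileObject::SKIP_EMPTY
);

$count = 0;
foreach ($file as $line) {
    // Process $line here..
    $count++;
}
$file = null; // release the handle so the file can be deleted
unlink($path);
echo $count, "\n"; // 3
```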
I am trying to write to a file and then read the data back from the same file. But sometimes the reading starts even before the writing has finished. How can I solve this issue? How can I make the writing process finish before moving ahead?
// writing to the file
$string = <12 kB of specific data which I need>;
$filename .= "/ttc/";
$filename .= "datasave.html";
if ($fp = fopen($filename, 'w')) {
    fwrite($fp, $string);
    fclose($fp);
}
// reading from the file
$handle = fopen($filename, "r");
$datatnc = fread($handle, filesize($filename));
$datatnc = addslashes($datatnc);
fclose($handle);
The reason it does not work is that when you finish writing a string to the file, the file pointer points to the end of the file; so when you later try to read the same file with the same file pointer, there is nothing left to read. All you have to do is rewind the pointer to the beginning of the file. Here is an example:
<?php
$fileName = 'test_file';
$savePath = "tmp/tests/" . $fileName;
//create file pointer handle
$fp = fopen($savePath, 'r+');
fwrite($fp, "Writing and Reading with same fopen handle!");
//Now rewind file pointer to start reading
rewind($fp);
//this will output "Writing and Reading with same fopen handle!"
echo fread($fp, filesize($savePath));
fclose($fp);
?>
Here is more info on the rewind() function: http://php.net/manual/en/function.rewind.php
I have mentioned the URL through which I got the solution, and I implemented the same. If you want me to copy the text from that link, here it is:
$file = fopen("test.txt", "w+");
// exclusive lock
if (flock($file, LOCK_EX)) {
    fwrite($file, "Write something");
    // release lock
    flock($file, LOCK_UN);
} else {
    echo "Error locking file!";
}
fclose($file);
Alternatively, use fclose() after writing to close the file pointer, then fopen() again to reopen the file for reading.
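The two answers above can also be combined: one handle does both the write and the read, with rewind() between them and an exclusive lock held across both, so no other process can see a half-written file. A minimal sketch (a temp file stands in for datasave.html):

```php
<?php
// Write then read with a single handle, under one exclusive lock.
$path = tempnam(sys_get_temp_dir(), 'tnc'); // stand-in for datasave.html
$fp = fopen($path, 'w+');
$datatnc = '';
if (flock($fp, LOCK_EX)) {
    fwrite($fp, "12 kB of specific data");
    fflush($fp); // push buffered bytes to the file before reading
    rewind($fp); // move the pointer back to the start
    $datatnc = stream_get_contents($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);
unlink($path);
echo $datatnc, "\n";
```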
I want some PHP to do the following in this order:
Gain an exclusive lock on a file (waiting if it's already locked)
Read the contents of the file
Empty the file of all contents
Remove the lock
But any code I come up with always relinquishes the lock between the reading and the writing, one way or another.
$fp = fopen('status.txt', 'r+');
flock($fp, LOCK_EX);
$str = fread($fp,1000); // [another hack. I just want it to read everything]
unlink('status.txt');
touch('status.txt');
Any ideas? I don't trust anything I do with files.
I think ftruncate can do what you want, since it works on a file that you already have open.
http://www.php.net/manual/en/function.ftruncate.php
Here's their example:
<?php
$filename = 'lorem_ipsum.txt';
$handle = fopen($filename, 'r+');
ftruncate($handle, rand(1, filesize($filename)));
rewind($handle);
echo fread($handle, filesize($filename));
fclose($handle);
?>
So I think what you want then is something like:
$fp = fopen('status.txt', 'r+');
flock($fp, LOCK_EX);
$str = fread($fp, filesize('status.txt'));
ftruncate($fp, 0);
flock($fp, LOCK_UN);
fclose($fp);
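The same read-then-empty sequence can use stream_get_contents() instead of the fread($fp, 1000) hack, since it reads everything from the current position without needing a byte count. A self-contained sketch (a temp file stands in for status.txt):

```php
<?php
// Read the whole file, then empty it, all under one exclusive lock.
$path = tempnam(sys_get_temp_dir(), 'status'); // stand-in for status.txt
file_put_contents($path, "pending jobs");

$fp = fopen($path, 'r+');
$str = '';
if (flock($fp, LOCK_EX)) {
    $str = stream_get_contents($fp); // reads everything from position 0
    ftruncate($fp, 0);               // empty the file while still locked
    flock($fp, LOCK_UN);
}
fclose($fp);

clearstatcache(); // drop the cached size before checking it
echo $str, "|", filesize($path), "\n"; // pending jobs|0
unlink($path);
```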
I need to scan through a 30 MB text file (a list of world cities). How can I access this file? I feel like file_get_contents() would give my server a stroke.
Just fopen it and then use fgets.
Filesystem functions come handy in this situation.
Example
$filename = "your_file_path";
// open the file
$fp = fopen($filename, 'r'); // use 'r+' to open the file in read/write mode
// output the entire file
echo fread($fp, filesize($filename));
// close the file
fclose($fp);
References
(some handy functions)
All Filesystem Functions
fopen() - open a file
fread() - read file contents
fgets() - get a line
fwrite() - write content to a file
fseek() - change the file pointer's position
rewind() - rewind the file pointer to position 0
fclose() - close a file
...
<?php
$fh = @fopen("inputfile.txt", "r");
if ($fh) {
    while (($line = fgets($fh)) !== false) {
        echo $line;
        // do something with $line..
    }
    fclose($fh);
}
?>
More information and examples at http://pt.php.net/manual/en/function.fgets.php
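For the city-list use case specifically, scanning line by line keeps memory flat no matter how large the file is, since only one line is held at a time. A small sketch with demo data standing in for the 30 MB list:

```php
<?php
// Search a one-city-per-line file without loading the whole file.
$path = tempnam(sys_get_temp_dir(), 'cities'); // stand-in for the city list
file_put_contents($path, "Oslo\nParis\nTokyo\n");

$found = false;
$fh = fopen($path, 'r');
while (($line = fgets($fh)) !== false) {
    if (trim($line) === 'Tokyo') { // only one line in memory at a time
        $found = true;
        break;                     // stop as soon as we have a match
    }
}
fclose($fh);
unlink($path);
echo $found ? "found\n" : "not found\n";
```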