I wrote a PHP script that outputs HTML files to the screen using readfile($htmlFile);
however, on the web hosting I purchased, readfile() has been disabled for security reasons.
Is there any substitute (other PHP functions) for readfile(), or do I have no choice but to ask the admin to enable it for me?
Thanks
You can check which functions are disabled by using:
var_dump(ini_get('disable_functions'));
You can try to use fopen() and fread() instead:
http://nl2.php.net/manual/en/function.fopen.php
http://nl2.php.net/manual/en/function.fread.php
$file = fopen($filename, 'rb');
if ( $file !== false ) {
    while ( !feof($file) ) {
        echo fread($file, 4096);
    }
    fclose($file);
}
Or fopen() with fpassthru()
$file = fopen($filename, 'rb');
if ( $file !== false ) {
    fpassthru($file);
    fclose($file);
}
Alternatively, you can use fwrite() to write the content to the output stream.
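A minimal sketch of that idea, assuming $filename holds the path (writing to the php://output stream is equivalent to echo):
$file = fopen($filename, 'rb');
$out = fopen('php://output', 'wb');
if ( $file !== false && $out !== false ) {
    while ( !feof($file) ) {
        fwrite($out, fread($file, 4096));
    }
    fclose($file);
    fclose($out);
}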
You can also try to use file_get_contents()
http://nl2.php.net/file_get_contents
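For example, as a near one-liner (note that file_get_contents() loads the whole file into memory, so it suits smaller files):
$contents = file_get_contents($filename);
if ( $contents !== false ) {
    echo $contents;
}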
Or you can use file()
http://nl2.php.net/manual/en/function.file.php
I wouldn't recommend this method, though, since file() loads the whole file into memory as an array of lines, but if nothing else works...
$data = file($filename);
if ( $data !== false ) {
    echo implode('', $data);
}
If it's disabled, then you could do something like the following as an alternative:
$file = fopen($yourFileNameHere, 'rb');
if ( $file !== false ) {
    while ( !feof($file) ) {
        echo fread($file, 4096);
    }
    fclose($file);
}
//OR
$contents = file_get_contents($yourFileNameHere); // for smaller files
echo $contents;
Hope it helps
You can try:
$path = '/some/path/to/file.html';
$file_string = '';
$file_content = file($path);
// here is the loop
foreach ($file_content as $row) {
    $file_string .= $row;
}
// finally print it
echo $file_string;
Related
I am working on securing the content read from a file via the @fread() function.
private function readfile_chunked($file) {
    $chunksize = 1024 * 1024;
    // Open Resume
    $handle = @fopen($file, 'r');
    if (false === $handle) {
        return FALSE;
    }
    while (!@feof($handle)) {
        $content = @fread($handle, $chunksize);
        echo wp_kses_post($content);
        if (ob_get_length()) {
            ob_flush();
            flush();
        }
    }
    return @fclose($handle);
}
The aforementioned wp_kses_post($content) is suggested by the WP plugin review team to secure the file content, but this solution is not working for me: it downloads the file in a loop. Any help will be appreciated on "How can we escape the output of @fread() in WordPress?", or on any alternative function so we can skip echo.
Thanks
private function readfile_chunked($file) {
    $chunksize = 1024 * 1024;
    // Open Resume
    $handle = @fopen($file, 'r');
    if (false === $handle) {
        return FALSE;
    }
    $output_resource = fopen('php://output', 'w');
    while (!@feof($handle)) {
        $content = @fread($handle, $chunksize);
        fwrite($output_resource, $content);
        if (ob_get_length()) {
            ob_flush();
            flush();
        }
    }
    fclose($output_resource);
    return @fclose($handle);
}
You can write to the php://output stream with fwrite() instead of using echo.
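For context, a hypothetical way such a method might be invoked inside the plugin class, sending download headers first ($path and the file name are assumptions, not part of the original code):
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="resume.pdf"'); // hypothetical file name
$this->readfile_chunked($path); // $path is assumed to be validated elsewhere
exit;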
I want to read a file line by line, but without completely loading it into memory.
My file is too large to open in memory, and if I try to do so I always get out-of-memory errors.
The file size is 1 GB.
You can use the fgets() function to read the file line by line:
$handle = fopen("inputfile.txt", "r");
if ($handle) {
while (($line = fgets($handle)) !== false) {
// process the line read.
}
fclose($handle);
}
if ($file = fopen("file.txt", "r")) {
    while (!feof($file)) {
        $line = fgets($file);
        // do some stuff with the $line
    }
    fclose($file);
}
You can use an object-oriented interface class for a file - SplFileObject http://php.net/manual/en/splfileobject.fgets.php (PHP 5 >= 5.1.0):
<?php
$file = new SplFileObject("file.txt");
// Loop until we reach the end of the file.
while (!$file->eof()) {
    // Echo one line from the file.
    echo $file->fgets();
}
// Unset the file to call __destruct(), closing the file handle.
$file = null;
If you want to use foreach instead of while when opening a big file, you probably want to encapsulate the while loop inside a Generator to avoid loading the whole file into memory:
/**
 * @return Generator
 */
$fileData = function() {
    $file = fopen(__DIR__ . '/file.txt', 'r');
    if (!$file) {
        return; // die() is a bad practice, better to use return
    }
    while (($line = fgets($file)) !== false) {
        yield $line;
    }
    fclose($file);
};
Use it like this:
foreach ($fileData() as $line) {
    // $line contains current line
}
This way you can process individual file lines inside the foreach().
Note: generators require PHP >= 5.5.
There is a file() function that returns an array of the lines contained in the file.
foreach (file('myfile.txt') as $line) {
    echo $line . "\n";
}
The obvious answer wasn't among the responses: PHP has a neat streaming delimiter parser, stream_get_line(), made for exactly this purpose.
$fp = fopen("/path/to/the/file", "r");
while (($line = stream_get_line($fp, 1024 * 1024, "\n")) !== false) {
    echo $line;
}
fclose($fp);
Use buffering techniques to read the file.
$filename = "test.txt";
$source_file = fopen( $filename, "r" ) or die("Couldn't open $filename");
while (!feof($source_file)) {
$buffer = fread($source_file, 4096); // use a buffer of 4KB
$buffer = str_replace($old,$new,$buffer);
///
}
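Since SplFileObject is itself traversable, the shortest version is a bare foreach (here iterating over the current script file):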
foreach (new SplFileObject(__FILE__) as $line) {
    echo $line;
}
One of the popular solutions to this question will have issues with the newline character. It can be fixed pretty easily with a simple str_replace:
$handle = fopen("some_file.txt", "r");
if ($handle) {
while (($line = fgets($handle)) !== false) {
$line = str_replace("\n", "", $line);
}
fclose($handle);
}
This is how I manage very big files (tested with up to 100 GB), and it's faster than fgets():
$block = 1024 * 1024; // 1MB, or could be anything higher than HDD block_size * 2
if ($fh = fopen("file.txt", "r")) {
    $left = '';
    while (!feof($fh)) { // read the file block by block
        $temp = fread($fh, $block);
        $lines = explode("\n", $temp);
        $lines[0] = $left . $lines[0];
        // the block probably ends mid-line; save the partial last line for the next block
        if (!feof($fh)) $left = array_pop($lines);
        foreach ($lines as $k => $line) {
            // do something with $line
        }
    }
    fclose($fh);
}
Be careful with the 'while (!feof(...)) { fgets() }' pattern: fgets() can fail (returning false) and loop forever without reaching the end of the file. codaddict's answer was closest to correct, but when your 'while fgets' loop ends, check feof(); if it isn't true, you had an error.
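A minimal sketch of that check, with input.txt as a placeholder file name:
$handle = fopen("input.txt", "r");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        // process $line
    }
    if (!feof($handle)) {
        // fgets() returned false before the end of file: a read error occurred
        trigger_error("Unexpected fgets() failure", E_USER_WARNING);
    }
    fclose($handle);
}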
SplFileObject is useful when it comes to dealing with large files.
function parse_file($filename)
{
    try {
        $file = new SplFileObject($filename);
    } catch (LogicException $exception) {
        die('SplFileObject : ' . $exception->getMessage());
    }
    while ($file->valid()) {
        $line = $file->fgets();
        // do something with $line
    }
    // don't forget to free the file handle.
    $file = null;
}
<?php
echo '<meta charset="utf-8">';
$k = 1;
$f = 1;
$fp = fopen("texttranslate.txt", "r");
while (!feof($fp)) {
    $contents = '';
    // write the file out in chunks of 1500 lines
    for ($i = 1; $i <= 1500; $i++) {
        $line = fgets($fp); // read the line once so it can be both echoed and saved
        echo $k . ' -- ' . $line . '<br>'; $k++;
        $contents .= $line;
    }
    echo '<hr>';
    file_put_contents('Split/new_file_' . $f . '.txt', $contents); $f++;
}
?>
A function to read a file and return its contents as an array of 4KB chunks:
function read_file($filename = '') {
    $buffer = array();
    $source_file = fopen($filename, "r") or die("Couldn't open $filename");
    while (!feof($source_file)) {
        $buffer[] = fread($source_file, 4096); // use a buffer of 4KB
    }
    return $buffer;
}
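A quick usage sketch (note that each array element is a 4KB chunk, not a line; test.txt is a placeholder name):
$chunks = read_file('test.txt');
echo implode('', $chunks);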
I have a .txt file with millions of lines of text.
The code below deletes specific lines (those containing .com domains) from a .txt file, but it cannot handle large files :(
<?php
$fname = "test.txt";
$lines = file($fname);
$out = '';
foreach ($lines as $line) if (!strstr($line, ".com")) $out .= $line;
$f = fopen($fname, "w");
fwrite($f, $out);
fclose($f);
?>
I want to remove certain lines and put them in another file.
For example, given a list of site domain names, cut the .com domains and paste them into another file...
Here's an approach using http://php.net/manual/en/class.splfileobject.php and working with a temporary file.
$fileName = 'whatever.txt';
$linesToDelete = array(3, 5);

// Working File
$file = new SplFileObject($fileName, 'a+');
$file->flock(LOCK_EX);

// Temp File
$temp = new SplTempFileObject(0);
$temp->flock(LOCK_EX);

// Write the temp file without the lines
foreach ($file as $key => $line)
{
    if (in_array($key + 1, $linesToDelete) === false)
    {
        $temp->fwrite($line);
    }
}

// Write back to the main file
$file->ftruncate(0);
foreach ($temp as $line)
{
    $file->fwrite($line);
}

$file->flock(LOCK_UN);
$temp->flock(LOCK_UN);
This may be slow, though: a 40 MB file with 140,000 lines takes 2.3 seconds on my Windows XAMPP setup. It could be sped up by writing to a temp file and doing a file move, but I didn't want to step on file permissions in your environment.
Edit: solution using rename/move instead of a second write
$fileName = __DIR__ . DIRECTORY_SEPARATOR . 'whatever.txt';
$linesToDelete = array(3, 5);

// Working File
$file = new SplFileObject($fileName, 'a+');
$file->flock(LOCK_EX);

// Temp File
$tempFileName = tempnam(sys_get_temp_dir(), rand());
$temp = new SplFileObject($tempFileName, 'w+');
$temp->flock(LOCK_EX);

// Write the temp file without the lines
foreach ($file as $key => $line)
{
    if (in_array($key + 1, $linesToDelete) === false)
    {
        $temp->fwrite($line);
    }
}

// File Rename
$file->flock(LOCK_UN);
$temp->flock(LOCK_UN);
unset($file, $temp); // Kill the SPL objects, releasing further locks
unlink($fileName);
rename($tempFileName, $fileName);
It could be that, because of the file's large size, it is taking up too much memory.
When you do file('test.txt'), it reads the entire file into an array.
Instead, you can try using generators.
GeneratorsExample.php
<?php
class GeneratorsExample {
    // Yield the file line by line so the whole file never sits in memory
    function file_lines($filename) {
        $file = fopen($filename, 'r');
        while (($line = fgets($file)) !== false) {
            yield $line;
        }
        fclose($file);
    }

    function copyFile($srcFile, $destFile) {
        $f = fopen($destFile, "a"); // open the destination once instead of once per line
        foreach ($this->file_lines($srcFile) as $line) {
            if (!strstr($line, ".com")) {
                fwrite($f, $line);
            }
        }
        fclose($f);
    }
}
callingFile.php
<?php
include('GeneratorsExample.php');
$ob = new GeneratorsExample();
$ob->copyFile('file1.txt', 'file2.txt');
While you could use tens of lines of PHP code, one line of shell code will do.
$ grep Bar.com stuff.txt > stuff2.txt
or as PHP
system ("grep Bar.com stuff.txt > stuff2.txt");
Hi, I have a text file up to 30MB in size that I would like to read using a PHP loop script:
$lines = file('data.txt');
// loop through each line
foreach ($lines as $line) { /* some function */ }
Is there any way to do this? When I open it for reading, PHP doesn't let me open a 30MB file.
You could read it line by line like this:
$file = fopen("data.txt", "r") or exit("Unable to open file!");
while(!feof($file)) {
// do what you need to do with it - just echoing it out for this example
echo fgets($file). "<br />";
}
fclose($file);
Read line by line using:
$handle = fopen ( "data.txt", "r" );
while ( ( $buffer = fgets ( $handle, 4096 ) ) !== false ) {
// your function on line;
}
If it is suitable for you to read the file piece by piece, you can try something like this:
$fd = fopen("fileName", "r");
while (!feof($fd)) {
    $buffer = fread($fd, 16384); // change the buffer size according to the memory you can use
    // Process the buffer here, piece by piece
}
fclose($fd);