I am attempting to download fairly large files (up to, and possibly over, 1 GB) from a remote HTTP server through a PHP script. I am using fgets() to read the remote file line by line and write the contents into a local file created with tempnam(). However, downloads of very large files (several hundred MB) are failing. Is there any way I can rework the script to catch the errors that are occurring?
Because the download is only part of a larger overall process, I would like to be able to handle the downloads and deal with errors in the PHP script rather than having to go to wget or some other process.
This is the script I am using now:
$tempfile = fopen($inFilename, 'w');
$handle = @fopen("https://" . $server . ".domain.com/file/path.pl?keyID=" . $keyID . "&format=" . $format . "&zipped=true", "r");
$firstline = '';
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        if ($firstline == '') $firstline = $buffer;
        fwrite($tempfile, $buffer);
    }
    fclose($handle);
    fclose($tempfile);
    return $firstline;
} else {
    throw new Exception('Unable to open remote file.');
}
I'd say you're looking for stream_notification_callback (especially the STREAM_NOTIFY_FAILURE & STREAM_NOTIFY_COMPLETED constants)
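A minimal sketch of hooking one up, assuming the same kind of HTTP download as above (the URL is a placeholder, and what you do inside the callback is up to you):

<?php
// Hypothetical sketch: watch for failure/completion while downloading a remote file.
function download_notify($code, $severity, $message, $message_code, $bytes_transferred, $bytes_max)
{
    if ($code === STREAM_NOTIFY_FAILURE) {
        error_log("Download failed: $message ($message_code)");
    } elseif ($code === STREAM_NOTIFY_COMPLETED) {
        error_log("Download completed: $bytes_transferred bytes transferred");
    }
}

$context = stream_context_create([], ['notification' => 'download_notify']);
$handle  = fopen('https://example.com/file/path.pl', 'r', false, $context);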
I have a 197GB text file that I want to read and push the contents into a MySQL database. I know I can't load a file that big into PHP's memory and read it as a whole, so I want to read a few hundred lines at a time and keep reading until I've processed the whole file.
I am trying it with this, but the page returns nothing:
<?php
$i = 0;
$handle = fopen("./data/200gbfile.txt", "r") or die("Couldn't get handle");
if ($handle) {
    while (($line = fgets($handle)) !== false) {
        echo $line . "<br />";
        if ($i > 100) {
            exit;
        }
        $i++;
    }
    fclose($handle);
} else {
    echo "Error Opening File!";
}
?>
Is there a limit in the PHP settings on the maximum file size that can be handled?
EDIT: for the 197GB file in question, fopen is failing to return anything and the output page is just going blank.
You can read the file in chunks to save memory:
For example:
$fd = @fopen("./data/200gbfile.txt", "r");
while (!feof($fd)) {
    $data = fread($fd, 1024); // read the file in 1024-byte chunks
    // handle the current chunk (read line by line, for example)
}
fclose($fd);
But I have no idea whether that works with a file of 100+ GB.
Edit: @ with fopen is required, as suggested by Roman.
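Since the end goal is pushing the lines into MySQL, a rough sketch of reading line by line and inserting in batches could look like this (the table and column names are made up, and it assumes a PDO connection):

<?php
// Hypothetical sketch: stream the huge file line by line and insert in batches.
$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO lines (content) VALUES (?)');

$fd = fopen('./data/200gbfile.txt', 'r');
$count = 0;
$pdo->beginTransaction();
while (($line = fgets($fd)) !== false) {
    $stmt->execute([rtrim($line, "\r\n")]);
    if (++$count % 500 === 0) { // commit every 500 rows so memory and transactions stay small
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit();
fclose($fd);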
You can use ini_set('memory_limit', '16M'); to set the memory limit accordingly, but I don't know whether it will handle such a huge file. Never tested that.
foreach ($streams as $stream) {
    parse_str($stream, $data);
    if (stripos($data['type'], $format) !== false && stripos($data['quality'], 'small') !== false) {
        $video = fopen($data['url'] . '&signature=' . $data['sig'], 'r');
        $file = fopen($_GET['id'] . '.flv', 'w');
        stream_copy_to_stream($video, $file);
        fclose($video);
        fclose($file);
        echo echo_video($id);
        exit;
    }
}
I am making a YouTube downloader, and for some reason, even though the conversion (of the smallest quality) is so small, my server still times out. Is there a way to replace these fopen() calls with file_put_contents()?
You can act as a URL proxy instead of first downloading the file and then serving it. As you download and serve the data on the fly, you can also write the content to a file on your server or store it as a BLOB in a database. Moreover, you can increase the timeout limit to avoid timing out.
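A rough sketch of that approach (the remote URL is a placeholder, and error handling is omitted):

<?php
// Hypothetical sketch: serve the remote file to the client while saving a local copy.
set_time_limit(0);                  // don't let the script time out mid-transfer

$remote = fopen('https://example.com/video.flv', 'r');
$local  = fopen($_GET['id'] . '.flv', 'w');

header('Content-Type: video/x-flv');
while (!feof($remote)) {
    $chunk = fread($remote, 8192);
    echo $chunk;                    // stream to the client on the fly
    fwrite($local, $chunk);         // ...and keep a copy on the server
    flush();
}
fclose($remote);
fclose($local);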
I have an issue I can't seem to find the solution for. I am trying to write to a flat text file. I have echoed all the variables out on the screen, verified permissions for the user (www-data), and, just for grins, set everything in the whole folder to 777 - all to no avail. The worst part is I can call the same function from another file and it writes. I can't seem to find the common thread here...
function ReplaceAreaInFile($AreaStart, $AreaEnd, $File, $ReplaceWith) {
    $FileContents = GetFileAsString($File);
    $Section = GetAreaFromFile($AreaStart, $AreaEnd, $FileContents, TRUE);
    if (isset($Section)) {
        $SectionTop = $AreaStart . "\n";
        $SectionTop .= $ReplaceWith;
        $NewContents = str_replace($Section, $SectionTop, $FileContents);
        if (!$Handle = fopen($File, 'w')) {
            return "Cannot open file ($File)";
            exit;
        }
        /*
        if (!flock($Handle, LOCK_EX | LOCK_NB)) {
            echo 'Unable to obtain file lock';
            exit(-1);
        }
        */
        if (fwrite($Handle, $NewContents) === FALSE) {
            return "Cannot write to file ($File)";
            exit;
        } else {
            return $NewContents;
        }
    } else {
        return "<p align=\"center\">There was an issue saving your settings. Please try again. If the issue persists contact your provider.</p>";
    }
}
Try with...
$Handle = fopen($File, 'w');
if ($Handle === false) {
    die("Cannot open file ($File)");
}

$written = fwrite($Handle, $NewContents);
if ($written === false) {
    die("Invalid arguments - could not write to file ($File)");
}

if ((strlen($NewContents) > 0) && ($written < strlen($NewContents))) {
    die("There was a problem writing to $File - $written chars written");
}

fclose($Handle);
echo "Wrote $written bytes to $File\n"; // or log to a file

return $NewContents;
and also check for any problems in the error log. There should be something, assuming you've enabled error logging.
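If error logging isn't already on, something along these lines (the log path is just an example) enables it for the current script:

error_reporting(E_ALL);
ini_set('log_errors', '1');
ini_set('error_log', '/var/log/php/app-errors.log'); // example path; make sure it's writable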
You need to check the number of characters written, since in PHP fwrite() behaves like this:
After having problems with fwrite() returning 0 in cases where one would fully expect a return value of false, I took a look at the source code for PHP's fwrite() itself. The function will only return false if you pass in invalid arguments. Any other error, such as a broken pipe or closed connection, will result in a return value of less than strlen($string), in most cases 0.
Also, note that you might be writing to a file, just not the file you're expecting to write to. Absolute paths might help with tracking this down.
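For example, logging the fully resolved path right before the write makes it obvious which file is actually being touched (an illustrative snippet; the path here is hypothetical):

$file = 'config/settings.txt'; // hypothetical relative path
$absolute = realpath(dirname($file)) . '/' . basename($file);
error_log("Writing to $absolute (cwd: " . getcwd() . ')');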
The final solution I ended up using for this:
function ReplaceAreaInFile($AreaStart, $AreaEnd, $File, $ReplaceWith) {
    $FileContents = GetFileAsString($File);
    $Section = GetAreaFromFile($AreaStart, $AreaEnd, $FileContents, TRUE);
    if (isset($Section)) {
        $SectionTop = $AreaStart . "\n";
        $SectionTop .= $ReplaceWith;
        $NewContents = str_replace($Section, $SectionTop, $FileContents);
        return $NewContents;
    } else {
        return "<p align=\"center\">There was an issue saving your settings.</p>";
    }
}

function WriteNewConfigToFile($File2WriteName, $ContentsForFile) {
    file_put_contents($File2WriteName, $ContentsForFile, LOCK_EX);
}
I did end up using absolute file paths and had to check the permissions on the files. I had to make sure the www-data user in Apache was able to write to the files and was also the user running the script.
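A quick way to confirm that from within the script itself (the path is hypothetical, and the user lookup needs the posix extension):

$target = '/var/www/site/config/settings.txt'; // hypothetical path
if (!is_writable($target)) {
    $who = function_exists('posix_geteuid')
        ? posix_getpwuid(posix_geteuid())['name'] // effective user the script runs as
        : 'the current process';
    error_log("$target is not writable by $who");
}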
As I start the process of writing my site in PHP and MySQL, one of the first PHP scripts I've written is a script to initialize my database. Drop/create the database. Drop/create each of the tables. Then load the tables from literals in the script.
That's all working fine! Whoohoo :-)
But I would prefer to read the data from files rather than hard-code them in the PHP script.
I have a couple of books on PHP, but they're all oriented toward web development using MySQL. I can't find anything about reading and writing to ordinary files.
Yes, I know there's a gazillion questions here on stackoverflow about reading TXT files, but when I look at each one, they're for C or C# or VB or Perl. I'm beginning to think that PHP just can't read files :-(
All I need is a brief PHP example of how to open a TXT file on the server, read it sequentially, display the data on the screen, and close the file, as in this pseudo-code:
program readfile;
    handle = open('myfile.txt');
    data = read(handle);
    while (not eof(handle)) begin
        display data;
        data = read(handle);
    end;
    close(handle);
end;
I will also need to write files on the server when I get to the part of my site where people upload avatars, and save them as JPG or GIF files. But that's for later.
Thanks!
From the PHP manual for fread():
<?php
// get contents of a file into a string
$filename = "/usr/local/something.txt";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
?>
EDIT
Per the comment, you can read a file line by line with fgets():
<?php
$handle = @fopen("/tmp/inputfile.txt", "r");
if ($handle) {
    while (($buffer = fgets($handle, 4096)) !== false) {
        echo $buffer;
    }
    if (!feof($handle)) {
        echo "Error: unexpected fgets() fail\n";
    }
    fclose($handle);
}
?>
All I need is a brief PHP example of how to open a TXT file on the server, read it sequentially, display the data on the screen, and close the file, as in this pseudo-code:
echo file_get_contents('/path/to/file.txt');
Yes, it's that brief; see file_get_contents(). You normally don't need a loop, but if you want to read line by line:
$file = new SplFileObject('/path/to/file.txt');
foreach ($file as $line) {
    echo $line;
}
Well, since you're asking about resources on the subject, there's a whole book on it in the PHP.net docs.
A basic example:
<?php
// get contents of a file into a string
$filename = "/usr/local/something.txt";
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
?>
Why not read the PHP documentation about fopen?
$file = fopen("source/file.txt", "r");
if (!$file) {
    echo("ERROR: can't open file");
} else {
    $buff = fread($file, filesize("source/file.txt"));
    print $buff;
    fclose($file);
}
file_get_contents does all that for you and returns the text file in a string :)
You want to read line by line? Use fgets.
$handle = #fopen("myfile.txt", "r");
if ($handle) {
while (($content = fgets($handle, 4096)) !== false) {
//echo $content;
}
if (!feof($handle)) {
echo "Error: unexpected fgets() fail\n";
}
fclose($handle);
}
I'm using a simple unzip function (as seen below) for my files so I don't have to unzip files manually before they are processed further.
function uncompress($srcName, $dstName) {
    $string = implode("", gzfile($srcName));
    $fp = fopen($dstName, "w");
    fwrite($fp, $string, strlen($string));
    fclose($fp);
}
The problem is that if the gzip file is large (e.g. 50 MB), unzipping it takes a large amount of RAM.
The question: can I parse a gzipped file in chunks and still get the correct result? Or is there a better way to handle the issue of extracting large gzip files (even if it takes a few seconds more)?
gzfile() is a convenience method that calls gzopen, gzread, and gzclose.
So, yes, you can manually do the gzopen and gzread the file in chunks.
This will uncompress the file in 4kB chunks:
function uncompress($srcName, $dstName) {
    $sfp = gzopen($srcName, "rb");
    $fp = fopen($dstName, "w");
    while (!gzeof($sfp)) {
        $string = gzread($sfp, 4096);
        fwrite($fp, $string, strlen($string));
    }
    gzclose($sfp);
    fclose($fp);
}
Try with:
function uncompress($srcName, $dstName) {
    $fp = fopen($dstName, "w");
    fwrite($fp, implode("", gzfile($srcName)));
    fclose($fp);
}
The $length parameter of fwrite() is optional.
If you are on a Linux host, have the required privileges to run commands, and the gzip command is installed, you could try calling it with something like shell_exec.
Something a bit like this, I guess, would do:
shell_exec('gzip -d your_file.gz');
This way, the file wouldn't be unzipped by PHP.
As a sidenote:
Take care where the command is run from (or use a switch to tell it "decompress to that directory").
You might want to take a look at escapeshellarg too ;-)
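For example, something along these lines (the filename is a placeholder):

shell_exec('gzip -d ' . escapeshellarg('/path/to/your_file.gz'));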
As maliayas mentioned, it may lead to a bug. I experienced an unexpected drop out of the while loop, but the gz file had been decompressed successfully. The whole code looks like this and works better for me:
function gzDecompressFile($srcName, $dstName) {
    $error = false;
    if ($file = gzopen($srcName, 'rb')) { // open gz file
        $out_file = fopen($dstName, 'wb'); // open destination file
        while (($string = gzread($file, 4096)) != '') { // read 4kb at a time
            if (!fwrite($out_file, $string)) { // check if writing was successful
                $error = true;
            }
        }
        // close files
        fclose($out_file);
        gzclose($file);
    } else {
        $error = true;
    }
    if ($error) {
        return false;
    } else {
        return true;
    }
}