I want to handle downloads of big files (1-2 GB) with a PHP script and have found two ways to do it:
with file_get_contents()
with readfile()
using this implementation:
header('Content-type: ' . $string);
header('Content-disposition: attachment; filename=' . $info['filename']);
$file = file_get_contents($filename);
echo $file;
or
readfile($filename);
But it takes too long for the output to begin. I suppose the whole file has to be read before the output starts.
It is much quicker when I point directly to the file's location; then the output starts almost immediately.
I am looking for a solution that streams the file, or something similar. Any ideas?
You should consider using mod_xsendfile
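For example, a minimal sketch assuming mod_xsendfile is installed and configured to permit the file's directory (the path and filename here are illustrative):
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="bigfile.bin"');
// With X-Sendfile, Apache streams the file itself;
// no file data passes through PHP's memory at all.
header('X-Sendfile: /var/www/files/bigfile.bin');
exit;
If mod_xsendfile is not available, you can stream the file manually: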
$handle = fopen($this->_path, 'rb');
while (!feof($handle))
{
    // read and send the file in 4 KB chunks
    echo fread($handle, 4096);
    // flush PHP's output buffer, then the web server's
    ob_flush();
    flush();
}
fclose($handle);
from PHP Readfile() not working for me and I don't know why
I am using the code below to download a file. It uses the flush() and ob_flush() functions. I read brenns10's comment at this
link
It says that using flush() and ob_flush() causes the data to sit in memory until it is displayed, and as such is not good for a server with limited resources. I am on a shared server.
I need an explanation of this: should I keep flush() and ob_flush() as they are in the code below, or should I remove them? Thanks
$download_file = '10gb_file.zip';
$chunk = 1024; // chunk size in KB per read (not a rate limit; nothing here sleeps between reads)
if (file_exists($download_file) && is_file($download_file)) {
    header('Cache-control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($download_file));
    header('Content-Disposition: attachment; filename=' . $download_file);
    $file = fopen($download_file, 'rb');
    while (!feof($file)) {
        print fread($file, round($chunk * 1024));
        ob_flush();
        flush();
    }
    fclose($file);
}
The better option when sending large files is to turn off output buffering in PHP and let your web server or the underlying CGI layer handle the output, as they are better equipped to deal with large output streams, using techniques such as writing to temporary files or delegating directly to the socket.
If you have already started an output buffer elsewhere in the code, you would want to first close it using ob_end_clean().
// removes anything which was in the buffer, as this might corrupt your download
ob_end_clean();
// streams the file data to the client without copying it all into memory
$fp = fopen($download_file, 'rb');
fpassthru($fp);
fclose($fp);
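If more than one output buffer may be active, a small sketch (my addition, assuming nothing about the surrounding code) that closes them all before streaming:
// Close every active output buffer so PHP does not accumulate
// the file's contents in memory before sending it.
while (ob_get_level() > 0) {
    ob_end_clean();
}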
I am restricting files from being downloaded without authorization, so they can only be downloaded via a URL such as mysite.com/getfile.php?file=32tf2376r327yf. I've read that readfile() can work, but fills the buffer with the entire file before sending it. If 100 users all request files at the same time, that will exhaust memory.
Instead, I want to write out chunks of the file as they're read, then flush and clear the buffer so that minimal memory is used at any time. How would I do this?
PSEUDO-CODE
function sendFile($someFile)
{
    //Start the buffer
    ob_start();
    //Send the headers
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="test.pdf"');
    header('Content-Length: ' . (string) filesize($someFile));
    ...etc...
    //Get the file (Should I be using "b" mode?)
    $handle = fopen($someFile, "rb");
    if ($handle) {
        //Loop through the data, one chunk at a time
        while (!feof($handle)) {
            //fread, not fgets: fgets is line-oriented and wrong for binary data
            $buffer = fread($handle, 4096);
            if ($buffer === false) {
                //Error...
                echo "Error: unexpected fread() fail\n";
                break;
            }
            //Send data
            echo $buffer;
            //Flush PHP's buffer to the web server, then the server's to the client
            ob_flush();
            flush();
        }
        fclose($handle);
        //Make sure everything's sent and end the buffer
        ob_end_flush();
    }
}
What else do I need to do to achieve what I'm trying to do? Is it possible in PHP?
If you really care about memory, and you have a reason to do so, then use nginx as the web server and PHP-FPM as the PHP API.
With such a setup there is practically no reason to worry about memory problems, since the file can be handed off to the server.
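For instance, a minimal sketch assuming nginx has an internal location (here called /protected/, an illustrative name) aliased to the files' directory:
// PHP emits only headers; nginx serves the file via an internal redirect,
// so the file contents never pass through PHP at all.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="bigfile.bin"');
header('X-Accel-Redirect: /protected/bigfile.bin');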
I'm having no luck serving a .dmg from my online store. I stripped down the code to just the following to debug, but no matter what I get a zero-byte file served:
header('Content-Type: application/x-apple-diskimage'); // also tried octet-stream
header('Content-Disposition: attachment; filename="My Cool Image.dmg"');
$size = filesize('/var/www/mypath/My Cool Image.dmg');
header('Content-Length: '.$size);
readfile('/var/www/mypath/My Cool Image.dmg');
This same code works for a number of other file types that I serve: bin, zip, pdf. Any suggestions? Professor Google is not my friend.
Found the solution. The culprit was readfile(), and the problem may have been memory related. In place of the readfile() line, I'm using this:
$fd = fopen('/var/www/mypath/My Cool Image.dmg', 'rb');
while (!feof($fd)) {
    set_time_limit(30); // reset the time limit on each chunk so big files don't time out
    echo fread($fd, 4096);
    flush();
}
fclose($fd);
It's now serving all filetypes properly, DMG included.
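One possible explanation (an assumption on my part, not confirmed above) is that an active output buffer was accumulating readfile()'s output in memory. If so, readfile() can also work once the buffers are cleared first:
// Discard any active output buffers so readfile() streams to the client
// instead of accumulating the whole file in memory.
while (ob_get_level() > 0) {
    ob_end_clean();
}
readfile('/var/www/mypath/My Cool Image.dmg');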
You should not have spaces in the filename (spaces are best avoided in web-hosted files).
Try something like this, or rename your file so it has no spaces:
<?php
$path = '/var/www/mypath/';
$filename = 'My Cool Image.dmg';
// Replace anything that is not alphanumeric, a dot, or a dash with an underscore
$outfile = preg_replace('/[^a-zA-Z0-9.-]/s', '_', $filename);
header('Content-Type: application/x-apple-diskimage'); // also tried octet-stream
header('Content-Disposition: attachment; filename="'.$outfile.'"');
header('Content-Length: '.sprintf("%u", filesize($path.$filename)));
readfile($path.$filename); // This part uses the real name with spaces, so it still may not work
?>
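As a side note (my addition, not part of the original answer), modern browsers also accept an RFC 6266 filename* parameter carrying a percent-encoded UTF-8 name, which preserves the spaces safely:
// filename* holds a percent-encoded UTF-8 name, so spaces survive intact
header("Content-Disposition: attachment; filename*=UTF-8''" . rawurlencode('My Cool Image.dmg'));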
I'm having a problem downloading a CSV file using PHP. Ideally I want to visit a page and have it just prompt me to save the file.
I have tried this:
$filename = 'dump.csv';
header("Content-type: text/csv");
header("Content-Disposition: attachment; filename=\"".$filename."\"");
and it has been returning an empty file, even though when I do
echo filesize($filename);
it returns 19000?
You have to read the content of the file in and then output it.
Otherwise you are just sending the headers telling the browser to expect the file.
You would need to echo out the contents of the file. For example, after your header functions you would have something like:
$fh = fopen($filename, "r");
// compare against false explicitly: a line containing "0" is falsy but valid
while (($line = fgets($fh)) !== false) {
    echo $line;
}
fclose($fh);
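For a small file like this 19 KB CSV, readfile() alone would also do the job; a one-line alternative sketch:
// readfile() reads the file and writes it straight to the output
readfile($filename);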
I have an issue: I have a PHP script that compresses images into a zip file and then forces the zip file to download. My issue is that IE doesn't show how much has been downloaded so far, e.g. 2 MB out of 20 MB... 15 secs remaining. Firefox works perfectly.
My code:
header("Content-Disposition: attachment; filename=" . urlencode($zipfile));
header("Content-Type: application/x-zip-compressed");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($filename));
$fd = fopen($filename,'r');
$content = fread($fd, filesize($filename));
print $content;
The main issue, as noted in the comment, is that you are storing the whole file in memory prior to sending, which may cause a very long wait depending on file size. What you would prefer to do is output the data as segments are read into memory, in fixed-size blocks (8 KB in the example below).
You could try something more along the lines of:
if ($file = fopen($filename, 'rb')) {
    while (!feof($file)) {
        // Output the file in 8 KB chunks, flushing each one to the client
        print(fread($file, 1024*8));
        flush();
    }
    fclose($file);
}
This will not attempt to read the entire file before output begins; instead, it outputs blocks of the file as they are read, flushing each one to the client.
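One further note (my addition): keep the Content-Length header from your original code alongside this loop; the total size is what browsers, IE included, use to compute the progress percentage and time remaining.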