I am using the following code to control the bandwidth usage of a download link, with an implementation of QOS Bandwidth Throttle PHP:
// create new config
$config = new ThrottleConfig();
// enable burst rate for 30 seconds
$config->burstTimeout = 30;
// set burst transfer rate to 10000 bytes/second
$config->burstLimit = 10000;
// set standard transfer rate to 15000 bytes/second (after the initial 30 seconds of burst rate)
$config->rateLimit = 15000;
// enable module (this is a default value)
$config->enabled = true;
// start throttling
$x = new Throttle($config);
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-type: application/zip");
header("Content-Disposition: attachment; filename=\"".$zipname."\"");
header("Content-Transfer-Encoding: binary");
header("Content-type: application/force-download");
header("Content-Disposition: attachment; filename=\"".$zipname."\"");
header("Content-Length: ".filesize($directory_location . '/' . $zipname));
I am getting a corrupted file: instead of the actual size (4 MB), I get approximately 2 KB. And if I use the readfile() function, I could not get the throttle class to work with readfile() :(
Can anyone please tell me what I have done wrong here?
Try this:
$x = new Throttle($config);
$handle = fopen("yourfile.zip", "rb");
while (!feof($handle)) {
echo fread($handle, 8192);
flush();
}
fclose($handle);
You can also try ob_flush() instead of flush().
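If the Throttle class does not hook into readfile(), another option is to throttle manually inside the same fopen()/fread() loop by sleeping between chunks. This is a generic sketch, not the library's own mechanism; the function name, rate, and chunk size are illustrative:

```php
<?php
// Stream $path to the client at roughly $bytesPerSecond, reading
// $chunkSize bytes at a time and sleeping between chunks.
function throttled_passthru(string $path, int $bytesPerSecond, int $chunkSize = 8192): int
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return 0;
    }
    $sent = 0;
    // microseconds to sleep after each chunk to hit the target rate
    $delay = (int) (($chunkSize / $bytesPerSecond) * 1000000);
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false) {
            break;
        }
        echo $chunk;
        flush();
        $sent += strlen($chunk);
        usleep($delay);
    }
    fclose($handle);
    return $sent;
}
```

You would call this after sending the download headers, e.g. `throttled_passthru($directory_location . '/' . $zipname, 15000);`.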
Related
I am trying to download a 700 MB file using the following code
header("Pragma: public", true);
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=".basename($File));
header("Content-Type: application/download");
header("Content-Description: File Transfer");
header("Content-Length: " . filesize('movies/'.$File));
flush();
$fp = fopen('movies/'.$File, "r");
while (!feof($fp))
{
echo fread($fp, 65536);
flush();
}
fclose($fp);
but it takes too long to start downloading the file: about 2-3 minutes before the download begins. This happens only with this mkv file; other video files work perfectly with a Location header.
The file does not download if any of the Content-Type headers is missing.
I want the download to start quickly.
It seems I have a problem with file downloading. My logs show the error "Maximum execution time of 60 seconds exceeded", but the requested file is just a little CSS file of only 1.64 KB, so it shouldn't take 60 seconds to deliver. Unfortunately the error is not exactly reproducible: if I open the URL it works perfectly, but my error log shows the error occurring (randomly?) on other clients several times. Is there a bug in my code?
// this code is from: http://www.richnetapps.com/the-right-way-to-handle-file-downloads-in-php/
// fix for IE catching or PHP bug issue
header("Pragma: public");
header("Expires: -1"); // set expiration time
header("Cache-Control: must-revalidate, post-check=0, pre-check=0"); // browser must download file from server instead of cache
if(substr($filename, -4) == ".css")
{
$mimeType = "text/css";
}
header("Content-Type: ".$mimeType);
header("Content-length: ".$filesize);
$filehandle = fopen($filename, "rb");
// large file handling:
while(!feof($filehandle))
{
print(@fread($filehandle, 1024*8));
ob_flush();
flush();
if(connection_status() != 0)
{
@fclose($filehandle);
unlink($filename);
exit;
}
}
@fclose($filehandle);
unlink($filename);
exit;
The error line is always within the while loop, but it's not always the same line.
Thanks for the help! :)
Why all this OB fuss for simple CSS output? I am sure it's that connection status check which causes your request to hang sometimes.
if(connection_status() != 0) // that specifically
Why do you even need it? You can simply do
header("Pragma: public");
header("Expires: -1"); // set expiration time
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Content-Type: text/css");
header("Content-length: ".$filesize);
readfile($filename);
I am facing some problems while generating PDF reports from my application in Firefox (Ubuntu machine).
My code is:
<?php
$path = "path_to_file" ;
$file = "test.pdf";
header('Content-type: application/pdf');
header("Content-disposition: attachment; filename=$file");
readfile($path);
return new Response("Success");
code is working:
when the PDF-report contains data
when size is more than 1 kb
In windows machine
not working:
when the report contains no data, or its size is only a few bytes.
Instead of producing a blank PDF, it generates a binary string in a new tab in the browser.
Please help me fix this issue. Thanks in advance.
Got my answer. This may help others.
<?php
$path = "path_to_file" ;
$file = "test.pdf";
header("Content-disposition: attachment; filename= $file"); //Tell the filename to the browser
header("Content-type: application/pdf");//Get and show report format
header("Content-Transfer-Encoding: binary");
header("Accept-Ranges: bytes");
readfile($path); //Read and stream the file
I would do it in this way to avoid some problems with browser applications and caching. Give it a try:
<?php
$path = "path_to_file" ;
$file = "test.pdf";
$content = file_get_contents($path);
header("Content-disposition: attachment; filename=\"".$file."\"");
header("Content-type: application/force-download");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".strlen($content));
header("Pragma: no-cache");
header("Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0");
header("Expires: 0");
echo $content;
die();
EDIT:
If it's really necessary to use the browser's PDF plugin instead of downloading the file and opening it immediately with the associated default PDF reader, replace
header("Content-type: application/force-download");
with
header("Content-type: application/pdf");
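The same choice can also be expressed through the Content-Disposition header: `attachment` forces a save dialog, while `inline` lets the browser's PDF viewer render the file. A small helper makes the choice explicit (the function name is illustrative, not part of the original code):

```php
<?php
// Build the response headers for serving a PDF. When $inline is true the
// browser may render the file with its PDF plugin; otherwise it offers
// the file as a download.
function pdf_headers(string $filename, bool $inline = false): array
{
    $disposition = $inline ? 'inline' : 'attachment';
    return [
        'Content-Type: application/pdf',
        'Content-Disposition: ' . $disposition . '; filename="' . $filename . '"',
        'Content-Transfer-Encoding: binary',
        'Accept-Ranges: bytes',
    ];
}

// Usage: emit the headers before readfile($path)
// foreach (pdf_headers('test.pdf') as $h) { header($h); }
```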
I have a problem with a PHP-managed file download where the browser does not show the progress of the download. In fact, the browser appears to be waiting and waiting until the file is completely downloaded. The file will then appear in the download list (with Chrome and Firefox). I cannot even download the file with IE8. I would like the browser to show the actual file size and the progress of the download.
Strangely, the download is not even visible in Firebug (no line appears in the network tab if you paste the download URL).
I suspected a problem with compression/zlib, so I disabled both: no change. I disabled output buffering with the same result.
Live example can be found here: http://vps-1108994-11856.manage.myhosting.com/download.php
Phpinfo: http://vps-1108994-11856.manage.myhosting.com/phpinfo.php
The code is below, your help is appreciated.
<?php
$name = "bac.epub";
$publicname = "bac.epub";
@apache_setenv('no-gzip', 1);
ini_set("zlib.output_compression", "Off");
header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($name));
header("Content-disposition: attachment; filename=" . $publicname);
ob_end_flush();
flush();
// dump the file and stop the script
$chunksize = 1 * (128 * 1024); // how many bytes per chunk
$size = filesize($name);
if ($size > $chunksize) {
$handle = fopen($name, 'rb');
$buffer = '';
while (!feof($handle)) {
$buffer = fread($handle, $chunksize);
echo $buffer;
ob_flush();
flush();
sleep(1);
}
fclose($handle);
} else {
readfile($name);
}
exit;
The sleep in the code was to ensure that the download is long enough to see the progress.
Keep it really, really simple.
<?php
header("Content-Type: application/epub+zip");
header("Content-disposition: attachment; filename=" . $publicname);
if(!readfile($name))
echo 'Error!';
?>
It is all you really need.
header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($file_path));
header("Content-disposition: attachment; filename=" . $local_file_name);
// dump the file and stop the script
$chunksize = 128 * 1024; // how many bytes per chunk (128 KB)
$size = filesize($file_path);
if ($size > $chunksize)
{
$handle = fopen($file_path, 'rb');
$buffer = '';
while (!feof($handle))
{
$buffer = fread($handle, $chunksize);
echo $buffer;
flush();
sleep(1);
}
fclose($handle);
}
else
{
readfile($file_path);
}
I have modified your code, Francis, and now it works ... :)
This is likely caused by a firewall or some sort of proxy between you and the remote site. I was wrestling with the same problem - disabling gzip, flushing buffers etc. until I tried it under a web VPN and the progress indicator re-appeared.
I don't think that the progress indicator is buggy - it's just that the content is being embargoed before it gets to you, which appears as a waiting state in the download. Then when the content is scanned or approved, it may come down very quickly relative to the normal download speed of your site. For large enough files, maybe you could see a progress indicator at this stage.
There is nothing you can do about it except determine whether this is the real reason for the behaviour.
This question already has answers here:
Downloading large files reliably in PHP
(13 answers)
Closed 9 years ago.
Using PHP, I am trying to serve large files (up to possibly 200MB) which aren't in a web accessible directory due to authorization issues. Currently, I use a readfile() call along with some headers to serve the file, but it seems that PHP is loading it into memory before sending it. I intend to deploy on a shared hosting server, which won't allow me to use much memory or add my own Apache modules such as X-Sendfile.
I can't let my files be in a web accessible directory for security reasons. Does anybody know a method that is less memory intensive which I could deploy on a shared hosting server?
EDIT:
if(/* My authorization here */) {
$path = "/uploads/";
$name = $row[0]; //This is a MySQL reference with the filename
$fullname = $path . $name; //Create filename
$fd = fopen($fullname, "rb");
if ($fd) {
$fsize = filesize($fullname);
$path_parts = pathinfo($fullname);
$ext = strtolower($path_parts["extension"]);
switch ($ext) {
case "pdf":
header("Content-type: application/pdf");
break;
case "zip":
header("Content-type: application/zip");
break;
default:
header("Content-type: application/octet-stream");
break;
}
header("Content-Disposition: attachment; filename=\"".$path_parts["basename"]."\"");
header("Content-length: $fsize");
header("Cache-control: private"); //use this to open files directly
while(!feof($fd)) {
$buffer = fread($fd, 1*(1024*1024));
echo $buffer;
ob_flush();
flush(); //These two flush commands seem to have helped with performance
}
}
else {
echo "Error opening file";
}
fclose($fd);
}
If you use fopen and fread instead of readfile, that should solve your problem.
There's a solution in the PHP's readfile documentation showing how to use fread to do what you want.
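The approach in the readfile() documentation boils down to reading and echoing fixed-size chunks, so only one chunk is ever held in memory rather than the whole file. A minimal sketch along those lines (the function name and chunk size are illustrative):

```php
<?php
// Stream a file in fixed-size chunks; peak memory stays near one chunk
// instead of the whole file, unlike a single file_get_contents()/echo.
// Returns the number of bytes sent, or false if the file cannot be opened.
function readfile_chunked(string $path, int $chunkSize = 1048576)
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    $bytes = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false) {
            break;
        }
        echo $chunk;
        $bytes += strlen($chunk);
        flush(); // push each chunk to the client as it is read
    }
    fclose($handle);
    return $bytes;
}
```

If output buffering is active (ob_start() or output_buffering in php.ini), an ob_flush() before the flush() is also needed, otherwise the chunks accumulate in the buffer anyway.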
To download large files from server, I have changed the below settings in php.ini file:
upload_max_filesize = 1500M
max_input_time = 1000
memory_limit = 640M
max_execution_time = 1800
post_max_size = 2000M
Now I am able to upload and download a 175 MB video on the server.
Since I have a dedicated server, making these changes was easy.
Below is the PHP script to download the file. I have not made any changes in this code snippet for large file sizes.
// Begin writing headers
ob_clean(); // Clear any previously written headers in the output buffer
if($filetype=='application/zip')
{
if(ini_get('zlib.output_compression'))
ini_set('zlib.output_compression', 'Off');
$fp = #fopen($filepath, 'rb');
if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE"))
{
header("Content-Type: $content_type");
header('Content-Disposition: attachment; filename="'.$filename.'"');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header("Content-Transfer-Encoding: binary");
header('Pragma: public');
header("Content-Length: ".filesize(trim($filepath)));
}
else
{
header("Content-Type: $content_type");
header('Content-Disposition: attachment; filename="'.$filename.'"');
header("Content-Transfer-Encoding: binary");
header('Expires: 0');
header('Pragma: no-cache');
header("Content-Length: ".filesize(trim($filepath)));
}
fpassthru($fp);
fclose($fp);
}
elseif($filetype=='audio'|| $filetype=='video')
{
global $mosConfig_absolute_path,$my;
ob_clean();
header("Pragma: public");
header('Expires: 0');
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Cache-Control: pre-check=0, post-check=0, max-age=0');
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Type: application/force-download");
header("Content-Type: $content_type");
header("Content-Length: ".filesize(trim($filepath)));
header("Content-Disposition: attachment; filename=\"$filename\"");
// Force the download
header("Content-Transfer-Encoding: binary");
#readfile($filepath);
}
else{ // for all other types of files except zip,audio/video
ob_clean();
header("Pragma: public");
header('Expires: 0');
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Cache-Control: pre-check=0, post-check=0, max-age=0');
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Type: $content_type");
header("Content-Length: ".filesize(trim($filepath)));
header("Content-Disposition: attachment; filename=\"$filename\"");
// Force the download
header("Content-Transfer-Encoding: binary");
#readfile($filepath);
}
exit;
If you care about performance, there is X-Sendfile, available as a module for Apache, nginx and lighttpd. Check the user comments in the readfile() docs.
There are also modules for these web servers which accept a URL with an additional hash value that allows downloading the file for a short time period. This can also be used to solve authorization issues.
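The hash-based scheme those modules use can also be reproduced in plain PHP: sign the file path plus an expiry timestamp with a shared secret, and refuse the download when the signature is wrong or the deadline has passed. A sketch under assumed names (the secret and URL layout are made up for illustration):

```php
<?php
// Create and verify short-lived download tokens, in the spirit of
// nginx's secure_link style URLs. A valid link would look like:
//   /download.php?file=...&expires=...&token=...
function make_token(string $path, int $expires, string $secret): string
{
    return hash_hmac('sha256', $path . '|' . $expires, $secret);
}

function verify_token(string $path, int $expires, string $token, string $secret, ?int $now = null): bool
{
    $now = $now ?? time();
    if ($now > $expires) {
        return false; // link has expired
    }
    // constant-time comparison to avoid timing attacks
    return hash_equals(make_token($path, $expires, $secret), $token);
}
```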
You could also handle this in the style of the Gordian Knot - that is to say, sidestep the problem entirely. Keep the files in a non-accessible directory, and when a download is initiated you can simply
$tempstring = rand();
symlink('/filestore/filename.extension', '/www/downloads/' . $tempstring . '-filename.extension');
echo('Your download is available here: <a href="/downloads/' . $tempstring . '-filename.extension">download</a>');
and set up a cron job to unlink() any download links older than 10 minutes. Virtually no processing of your data is required, no massaging of HTTP headers, etc.
There are even a couple libraries out there for just this purpose.
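The cleanup job mentioned above can be a one-file PHP script run from cron. A sketch (directory name and maximum age are illustrative):

```php
<?php
// Delete entries in $dir whose own mtime is older than $maxAge seconds.
// lstat() is used so a symlink's age is checked, not its target's.
function purge_stale_links(string $dir, int $maxAge = 600, ?int $now = null): int
{
    $now = $now ?? time();
    $removed = 0;
    foreach (scandir($dir) ?: [] as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $path = $dir . '/' . $entry;
        $stat = lstat($path);
        if ($stat !== false && ($now - $stat['mtime']) > $maxAge) {
            unlink($path);
            $removed++;
        }
    }
    return $removed;
}

// e.g. run every few minutes from cron:
// purge_stale_links('/www/downloads', 600);
```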