I am having some major problems with a script I use to download files from S3.
Every time I try to download a file, the download starts perfectly, but about halfway through, the file just stops. Every time. On every file. These are video files, so a lot of them are quite large. I'm not sure what to do or how to approach this. Here's my script:
<?php
// other code exists; this is the main download logic
set_time_limit(0);
ignore_user_abort(false);
ini_set('output_buffering', 0);
ini_set('zlib.output_compression', 0);

$chunk = 10 * 1024 * 1024; // bytes per chunk (10 MB)
$fh = fopen($video->getMp4Source(), "rb");
if ($fh === false) {
    echo "Unable to open file";
    exit;
}

header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header('Expires: 0');
header('Pragma: public');
header('Content-Description: File Transfer');
header('Content-Type: video/mp4');
header('Content-Length: ' . $video->getFileSize());
header('Content-Disposition: attachment; filename="' . basename($video->getMp4Source()) . '"');

while (!feof($fh)) {
    echo fread($fh, $chunk);
    ob_flush(); // flush PHP's output buffer
    flush();    // then the web server's
}
fclose($fh);
exit;
I'm not sure what is wrong or why it keeps happening. No errors are being logged, and nothing goes wrong apart from the file failing to finish downloading. I have tried readfile(), but it used too many resources and the download still didn't complete.
Any help would be great.
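For comparison, here is a more defensive version of the same loop. It is only a sketch: it assumes $video->getMp4Source() returns a stream-openable URL and $video->getFileSize() the exact byte count. Note that output_buffering is a PHP_INI_PERDIR setting, so ini_set() cannot change it at runtime; closing the buffers with ob_end_clean() does work.

<?php
// Sketch only: defensive chunked streaming, assuming the $video object above.
set_time_limit(0);
while (ob_get_level() > 0) {
    ob_end_clean(); // discard every output buffer so chunks go straight to the client
}
$fh = fopen($video->getMp4Source(), 'rb');
if ($fh === false) {
    http_response_code(500);
    exit('Unable to open file');
}
header('Content-Type: video/mp4');
header('Content-Length: ' . $video->getFileSize());
header('Content-Disposition: attachment; filename="' . basename($video->getMp4Source()) . '"');
while (!feof($fh)) {
    echo fread($fh, 1024 * 1024); // 1 MB per read keeps memory use flat
    flush();
    if (connection_aborted()) {
        break; // the client went away; stop reading from S3
    }
}
fclose($fh);
exit;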
I am trying to download large zip files (800 MB to 1 GB) containing audio files to the browser. From what I have seen so far, chunking seems to be the most popular approach, but I am having zero luck. The code I have been working with is:
$filename = $_SERVER['DOCUMENT_ROOT'] . $filepath;
$download_rate = 5000; // KB sent per one-second loop iteration (~5 MB/s)
$progress = 0;
if (file_exists($filename)) {
    header('Cache-Control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($filename));
    header('Content-Disposition: filename=' . basename($filename));
    flush();
    $file = fopen($filename, "r");
    while (!feof($file)) {
        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));
        // flush the content to the browser
        flush();
        // sleep one second
        sleep(1);
    }
    fclose($file);
} else {
    echo 'File does not exist!';
}
This works for a wide variety of file types and sizes up to a certain point: I have no problem downloading typical PDFs, etc. But the browser just spins and spins when I try to download large zip files, and eventually dies (at around 2 minutes). I really need some help here. I've tried a number of code variants for chunking, but something always seems to die, and I'm not sure what I'm doing wrong. Perhaps there's a better approach to chunking?
Consider changing to readfile(): it will not present any memory issues, and download progress will show in the browser. You lose your in-script download rate limit, but you can limit the download rate in .htaccess instead.
$filename = $_SERVER['DOCUMENT_ROOT'] . $filepath;
if (file_exists($filename)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($filename));
    readfile($filename);
} else {
    echo 'File does not exist!';
}
https://www.php.net/manual/en/function.readfile.php
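For the .htaccess rate limit, one option is Apache's mod_ratelimit; this is an assumption on my part (Apache 2.4+ with the module enabled), not part of the original answer. The server then throttles the readfile() output instead of the script:

# .htaccess sketch: throttle responses to roughly 512 KiB/s
<IfModule mod_ratelimit.c>
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 512
</IfModule>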
I am working on a project where I receive a file stream and write the file to the server's local disk.
I then want PHP to serve it as a download, but instead it just dumps the file's data to the page.
Below is how I am writing the file and trying to tell PHP to download it:
$settingsManager = new SettingsManager();
$this->tempWriteLocation = $settingsManager->getSpecificSetting("hddFileWriterLocation");
$downloadUrl = $settingsManager->getSpecificSetting("tempFileUrlDownload") . "/$this->tempFileName";

if (!$this->checkIfDirectoryExists()) {
    throw new Exception("Failed to create temp write directory: $this->tempWriteLocation");
}

$filePathAndName = "$this->tempWriteLocation\\$this->tempFileName";
$fh = fopen($filePathAndName, "wb");
if (!$fh) {
    throw new Exception("Failed to open file handle for: $filePathAndName. " . print_r(error_get_last(), true));
}
fwrite($fh, $this->fileData);
fclose($fh);
//return $downloadUrl;

header('Content-Description: File Transfer');
header('Content-Type: audio/wav');
header('Content-Disposition: attachment; filename=' . basename($filePathAndName));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($filePathAndName));
ob_clean();
flush();
readfile($filePathAndName);
When the above code is run, I get the following output (only a snippet):
RIFF\tWAVELIST2INFOISFT%Aculab Media System Server V2.3.4b11fmt
##fact�sdata�sUU������UUUUUU�UUU��U���UU��UUU�UUUU���UU���UU�����UU
Just so you know, the diamonds are the actual output I get back, not Stack Overflow failing to display something properly.
I've tried setting the content type to force-download, but it doesn't make any difference.
Try these headers:
header('Content-type: audio/x-wav', true);
header('Content-Disposition: attachment;filename=wav-filename.wav');
and see if this works. From what I can see, your code is set up correctly; fixing the headers should make the file download automatically.
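If the raw data still renders in the page even with correct-looking headers, the usual cause is output sent before the header() calls (whitespace before <?php, a BOM, or an earlier echo). A quick check, added here as a sketch (headers_sent() is standard PHP; $filePathAndName is the variable from the question):

if (headers_sent($srcFile, $srcLine)) {
    // Headers can no longer be changed; locate and remove the earlier output.
    exit("Output already started at $srcFile:$srcLine");
}
header('Content-Type: audio/x-wav');
header('Content-Disposition: attachment; filename="' . basename($filePathAndName) . '"');
header('Content-Length: ' . filesize($filePathAndName));
readfile($filePathAndName);
exit; // stop so nothing else is appended to the WAV data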
I'm trying to serve an image while adding a MySQL row for each second the image was viewed.
I'm serving it in chunks of 1024 bytes (the total size of the image is 20 KB).
The problem is that if I load the page where the image is displayed and then close the window, or click a link that takes me to a different page, the script keeps running and does not die as it should.
ignore_user_abort(false);
$file = 'a.jpg';
header('Content-Type: image/jpeg');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();

$conn = mysql_connect("localhost", "user", "pass");
mysql_select_db("mydb", $conn);

$fp = fopen($file, "rb");
$i = 0;
while (!feof($fp)) {
    print(fread($fp, 1024));
    sleep(1);
    mysql_query("INSERT INTO table (VIEWTIME) VALUES ('$i')");
    $i++;
    flush();
    ob_flush();
    if (connection_aborted()) {
        die();
    }
}
I'm trying to find a server-side-only solution, since technical restrictions prevent me from using JS or any other client-side languages.
Perhaps
ignore_user_abort(false);
should be
ignore_user_abort(true);
With it set to true, the script is not killed mid-write when the client disconnects, so your connection_aborted() check and die() decide when it stops.
Docs
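A minimal sketch of that pattern, reusing the question's a.jpg setup (the MySQL logging is omitted for brevity):

ignore_user_abort(true); // survive the disconnect; we decide when to stop
$file = 'a.jpg';
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($file));
$fp = fopen($file, 'rb');
while (!feof($fp)) {
    print fread($fp, 1024);
    flush(); // the abort is only detected when output is actually pushed
    if (connection_aborted()) {
        fclose($fp);
        exit; // clean up as soon as the client is gone
    }
    sleep(1);
}
fclose($fp);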
I have a problem with a PHP-managed file download where the browser does not show the progress of the download. In fact, the browser appears to wait and wait until the file is completely downloaded; only then does the file appear in the download list (in Chrome and Firefox). I cannot download the file at all with IE8. I would like the browser to show the actual file size and the progress of the download.
Strangely, the download is not even visible in Firebug (no line appears in the network tab if you paste the download URL).
I suspected a problem with compression/zlib, so I disabled both: no change. I disabled output buffering with the same result.
A live example can be found here: http://vps-1108994-11856.manage.myhosting.com/download.php
Phpinfo: http://vps-1108994-11856.manage.myhosting.com/phpinfo.php
The code is below; your help is appreciated.
<?php
$name = "bac.epub";
$publicname = "bac.epub";
#apache_setenv('no-gzip', 1);
ini_set("zlib.output_compression", "Off");

header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($name));
header("Content-Disposition: attachment; filename=" . $publicname);
ob_end_flush();
flush();

// dump the file and stop the script
$chunksize = 1 * (128 * 1024); // how many bytes per chunk
$size = filesize($name);
if ($size > $chunksize) {
    $handle = fopen($name, 'rb');
    $buffer = '';
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        sleep(1);
    }
    fclose($handle);
} else {
    readfile($name);
}
exit;
The sleep in the code was to ensure that the download is long enough to see the progress.
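One way to verify which headers actually reach a client is a small diagnostic script (a sketch using standard get_headers(), pointed at the live URL above):

<?php
// Sketch: dump the response headers of the download endpoint.
// A progress bar needs Content-Length to arrive, and it is useless to the
// browser if the response is re-compressed (Content-Encoding) or chunked.
$headers = get_headers('http://vps-1108994-11856.manage.myhosting.com/download.php');
print_r($headers);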
Keep it really, really simple.
<?php
header("Content-Type: application/epub+zip");
header("Content-Disposition: attachment; filename=" . $publicname);
if (!readfile($name)) {
    echo 'Error!';
}
?>
It is all you really need: readfile() writes the file to output itself, so it will not present memory problems even with large files.
header("Content-Type: application/epub+zip");
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($file_path));
header("Content-disposition: attachment; filename=" . $local_file_name);
// dump the file and stop the script
$chunksize = 128 * 1024; // how many bytes per chunk (128 KB)
$size = filesize($file_path);
if ($size > $chunksize)
{
$handle = fopen($file_path, 'rb');
$buffer = '';
while (!feof($handle))
{
$buffer = fread($handle, $chunksize);
echo $buffer;
flush();
sleep(1);
}
fclose($handle);
}
else
{
readfile($file_path);
}
I have modified your code, Francis, and now it works. :)
This is likely caused by a firewall or some sort of proxy between you and the remote site. I was wrestling with the same problem, disabling gzip, flushing buffers, etc., until I tried it over a web VPN and the progress indicator reappeared.
I don't think the progress indicator is buggy; it's just that the content is being held back before it gets to you, which appears as a waiting state in the download. When the content has been scanned or approved, it may come down very quickly relative to the normal download speed of your site. For large enough files, you might see a progress indicator at that stage.
There is nothing you can do about it except determine whether this is the real reason for the behaviour.
I used this code for downloading files
$file = 'test.mp3';
$download_rate = 50; // 50 KB/s
if (file_exists($file) && is_file($file)) {
    header('Cache-Control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($file));
    header('Content-Disposition: filename=' . $file);
    flush();
    $file = fopen($file, "r");
    while (!feof($file)) {
        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));
        // flush the content to the browser
        flush();
        // sleep one second
        sleep(1);
    }
    fclose($file);
} else {
    echo 'File Not Found';
}
But while a file is downloading, I cannot browse the site until the download completes. This happens in both IE and Firefox.
Any answers?
The only time I know this happens is when you have a session which has not been written.
I can't see any sessions here, so I'm not sure what is causing it.
However, most PHP file-download scripts sit behind a login check, so I'm guessing that is the case here.
If you do have a session, try session_write_close();
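If that is the case, a sketch of the fix: finish with the session before streaming, so PHP releases the session file lock and other requests from the same browser can proceed (the file name and the check placement are placeholders):

session_start();
// ... login / permission checks that need $_SESSION go here ...
session_write_close(); // release the session lock before the slow download

$file = 'test.mp3';
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
readfile($file);
exit;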