PHP download / stream file to browser - max execution timeout error

Situation: we store files on Amazon S3 and record their info in our database. A user can then download a file via a link like www.myserver.de/downloadFile?id=1234
This works quite well, but we see several "Maximum execution time of 800 seconds exceeded" errors.
Our current download code:
header("Pragma: public");
header("Expires: -1");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
if ($mimeType) {
    header("Content-Type: ".$mimeType);
} else {
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=".$filename.";");
}
header("Content-Length: ".$filesize);
$s3Client = new s3Client();
// download the file from S3 to the local server
$s3Client->getFile($folder, $filename, $tempFolder.$filename);
$filehandle = fopen($tempFolder.$filename, "rb");
while (!feof($filehandle)) {
    print(@fread($filehandle, 1024*8));
    ob_flush();
    flush();
    if (connection_status() != 0) {
        @fclose($filehandle);
        unlink($tempFolder.$filename);
        exit;
    }
}
@fclose($filehandle);
unlink($tempFolder.$filename);
exit;
The max execution time error happens even with very small files (just a few KB). It looks like there is a problem in the while loop and it isn't being terminated.
The strange thing is: the download works anyway.
When we see the timeout error in our log and try the download ourselves, it works perfectly.
The source for this way of downloading files is: http://www.media-division.com/the-right-way-to-handle-file-downloads-in-php/
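One plausible cause is that the execution-time limit keeps ticking while the loop waits on a slow client. A minimal sketch of the same chunked loop with the limit lifted first; the function name, signature, and the assumption that the file is already copied from S3 to a local path are illustrative, not from the original code:

```php
<?php
// Sketch: chunked download loop with the script time limit disabled.
// Assumes the file was already fetched from S3 to $localPath.
function streamFile(string $localPath, int $chunkSize = 8192): int
{
    set_time_limit(0);        // a slow client no longer trips max_execution_time
    $bytes = 0;
    $fh = fopen($localPath, 'rb');
    if ($fh === false) {
        return 0;
    }
    while (!feof($fh)) {
        $chunk = fread($fh, $chunkSize);
        if ($chunk === false) {
            break;
        }
        echo $chunk;
        $bytes += strlen($chunk);
        flush();              // push the chunk toward the client
        if (connection_status() !== CONNECTION_NORMAL) {
            break;            // client disconnected: stop and let caller clean up
        }
    }
    fclose($fh);
    return $bytes;
}
```

`set_time_limit(0)` removes the per-request ceiling only for this script; an alternative worth considering is streaming straight from S3 to the client instead of copying to a temp file first.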

Related

How to enable download progress bar in a web browser during file download in PHP script?

I have the following PHP script that outputs a file for download to an end-user:
session_write_close();
ob_end_clean();
set_time_limit(0);
if ($file = fopen($path, 'rb')) {
    header("Cache-Control: no-cache");
    header("Pragma: no-cache");
    header("Accept-Ranges: bytes");
    header("Content-Length: ".(string)(filesize($path)));
    header("Content-Disposition: attachment; filename=\"".basename($path)."\"");
    header("Content-Type: application/octet-stream");
    header("Content-Transfer-Encoding: binary");
    header("Connection: close");
    fpassthru($file);
    fclose($file);
}
The user can download the file just fine, but when a large file is being downloaded the web browser doesn't show a progress bar with the percentage of the file remaining to download. It only displays the number of MB currently downloaded.
So I'm curious, what shall I change in the header to enable this download progress?
Assuming it is gzip-related and tied to the browser not knowing the actual content length... maybe add this header?
header("accept-encoding: gzip;q=0,deflate;q=0");
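A browser can only render a percentage if it knows the total size, i.e. if the response carries an accurate Content-Length that on-the-fly compression does not invalidate. A sketch with that in mind; the helper function name is hypothetical:

```php
<?php
// Hypothetical helper: build download headers with an accurate Content-Length.
// If zlib recompresses the body, the advertised length no longer matches the
// bytes on the wire and the browser falls back to showing only MB downloaded.
function progressFriendlyHeaders(string $path): array
{
    if (ini_get('zlib.output_compression')) {
        ini_set('zlib.output_compression', 'Off');
    }
    return [
        'Content-Type: application/octet-stream',
        'Content-Length: ' . filesize($path),
        'Content-Disposition: attachment; filename="' . basename($path) . '"',
    ];
}
```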

PHP force download of file - either getting download interruped or system says invalid/corrupt file

What I am trying to implement is force download of a file through PHP.
There are 2 issues that I am facing, and I've racked my brain thinking about what could possibly be going wrong here.
Whenever I try to download the file using IE, the download gets interrupted about midway, i.e. if my file is 1024 KB in size, the download stops at around 500 KB and I get an error message in IE saying 'Download was interrupted'.
The other issue I encounter frequently (but not always) is that the downloaded file (which is actually a zip file) sometimes gets corrupted. The file on the server is fine - if I download it directly from there, there are no issues at all. However, if I download using my PHP script and then try to unzip the file, Windows says 'Invalid or corrupt file...'
I really need some help on this.
Following is the block of code that downloads the file:
$fileName = "./SomeDir/Somefile.zip";
$fsize = filesize($fileName);
set_time_limit(0);
// required for IE, otherwise Content-Disposition may be ignored
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}
// set headers
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header('Content-Type: application/zip');
header("Content-Disposition: attachment; filename=\"".basename($fileName)."\";");
header("Content-Transfer-Encoding: binary");
header('Accept-Ranges: bytes');
header("Content-Length: " . $fsize);
// download
$file = @fopen($fileName, "rb");
if ($file) {
    while (!feof($file)) {
        print(fread($file, 1024*8));
        flush();
        if (connection_status() != 0) {
            @fclose($file);
            die();
        }
    }
    @fclose($file);
}
Try this:
<?php
$file = "./SomeDir/Somefile.zip";
set_time_limit(0);
$name = "Somefile.zip";
// Print headers and print to output.
header('Cache-Control:');
header('Pragma:');
header("Content-type: application/zip");
header("Content-Length: ". filesize($file));
header("Content-Disposition: attachment; filename=\"{$name}\"");
readfile($file);
?>
I think your Content-Length header reports more bytes than are actually transferred.
Check the server's log. The server may gzip the content, and when it closes the connection the client is still waiting for the remaining declared bytes.

PHP file download problem: Server freezes for several minutes when user cancels a download then retries

I'm using the following PHP script to download a 20 MB file:
($filepath and $filename are set earlier in the script)
$fullPath = $filepath.$filename;
if ($fd = fopen($fullPath, "r")) {
    // HTTP headers for zip downloads
    header("Pragma: public");
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: public");
    header("Content-Description: File Transfer");
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"".$filename."\"");
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: ".filesize($fullPath));
    while (!feof($fd)) {
        echo(fread($fd, 1024*8));
        flush();
        if (connection_status() != 0) {
            fclose($fd);
            die();
        }
    }
    fclose($fd);
}
exit();
It works fine if the download finishes, but if the download is canceled by the user and they click the link to re-download, the server is completely unresponsive for several minutes, and then the download begins. It seems like it is waiting for the script to time out... but isn't the "if (connection_status()!=0)..." check supposed to kill the script?
Any help is much appreciated! Thank you in advance!
I think you're over-engineering your solution somewhat. Try using readfile() instead of your fopen/fread loop.
Better yet, unless there's a compelling reason why you need PHP to mediate the file transfer, don't use a PHP script at all and simply provide a direct link to the file in question.
The best code is always the code you don't have to write.
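The readfile() suggestion can be sketched as follows; the wrapper function and its name are illustrative. readfile() streams the file and returns the number of bytes read, so the manual loop and its disconnect handling disappear:

```php
<?php
// Illustrative wrapper around readfile(), per the answer's suggestion.
function sendWithReadfile(string $fullPath)
{
    if (!is_readable($fullPath)) {
        return false;
    }
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($fullPath) . '"');
    header('Content-Length: ' . filesize($fullPath));
    return readfile($fullPath);   // returns bytes read, or false on failure
}
```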

how to increase download speed while downloading mp4 video through php script

I'm using a script to download video, but it takes a lot of time to download. Are there any settings or other scripts that could help me?
// set headers
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Type: $mtype");
header("Content-Disposition: attachment; filename=\"$asfname\"");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . $fsize);
// download
// @readfile($file_path);
$file = @fopen($file_path, "rb");
if ($file) {
    while (!feof($file)) {
        print(fread($file, 1024*100));
        flush();
        if (connection_status() != 0) {
            @fclose($file);
            die();
        }
    }
    @fclose($file);
}
Using the readfile() function (as you originally had) will spool directly from the file to output, rather than using a chunking loop and printing as you're doing. So why have you chosen the chunk loop?
As above, readfile() is one way.
The other, even more preferred method depends on your web server. Nginx and Lighttpd (and Apache, via a module) allow you to pass a header with a file path/name to the server, and it will then send the file directly itself, so no PHP resources are used for the transfer. If that's not possible, then readfile() is probably the best you have - unless you can simply give the user a direct URL to download.
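The server-offload idea above can be sketched like this. The function name and the /protected/ mapping are assumptions; Apache needs the mod_xsendfile module, and nginx needs a matching internal location configured:

```php
<?php
// Illustrative: emit the offload header and let the web server stream the
// file itself, outside PHP's time and memory limits.
function offloadHeaders(string $path, string $server): array
{
    $headers = ['Content-Disposition: attachment; filename="' . basename($path) . '"'];
    if ($server === 'apache') {
        $headers[] = 'X-Sendfile: ' . $path;                      // mod_xsendfile
    } elseif ($server === 'nginx') {
        // assumes an "internal" location mapping /protected/ to the real dir
        $headers[] = 'X-Accel-Redirect: /protected/' . basename($path);
    } elseif ($server === 'lighttpd') {
        $headers[] = 'X-LIGHTTPD-send-file: ' . $path;
    }
    return $headers;
}
```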

File Handler returning garbled files

For the past 3 months my site has been using a PHP file handler in combination with htaccess.
Users accessing the uploads folder of the site would be redirected to the handler as such:
RewriteRule ^(.+)\.*$ downloader.php?f=%{REQUEST_FILENAME} [L]
The purpose of the file handler is pseudo coded below, followed by actual code.
// Check if the file exists and the user is downloading from the uploads directory; true.
// Check against a file type whitelist and set the MIME type; $ctype = MIME type.
header("Pragma: public"); // required
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private",false); // required for certain browsers
header("Content-Type: $ctype");
header("Content-Disposition: attachment; filename=\"".basename($filename)."\";" );
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($filename));
readfile("$filename");
As of yesterday, the handler started returning garbled files, unreadable images, and had to be bypassed. I'm wondering what settings could have gone awry to cause this.
-EDIT-
Problem found, but not resolved. Including a PHP library I was using to integrate with WordPress was corrupting the files. Removing that block of code solves the corruption issue but leaves the files accessible without the desired authentication.
@include_once($_SERVER['DOCUMENT_ROOT'].'/wp-blog-header.php');
if (!is_user_logged_in()) {
    auth_redirect(); // kicks the user to a login page
}
// resume download script
Maybe more tests will reveal the problem...
if (!isset($filename)) {
    die('parameter "filename" not set');
} else if (!file_exists($filename)) {
    die('file does not exist');
} else if (!is_readable($filename)) {
    die('file not readable');
} else if (false === ($size = filesize($filename))) {
    die('stat failed');
} else if (headers_sent() || ob_get_length() > 0) {
    die('something already sent output.');
} else {
    $basename = basename($filename);
    header("Pragma: public");
    header("Expires: 0");
    header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
    header("Cache-Control: private", false); // required for certain browsers
    header("Content-Type: $ctype");
    header("Content-Disposition: attachment; filename=\"".$basename."\";");
    header("Content-Transfer-Encoding: binary");
    header("Content-Length: ".$size);
    readfile($filename);
}
How are the files corrupted? Truncated? 0-byte? Completely different content? Random sections replaced with garbage?
Is it possible the server's PHP memory limit has been lowered? With output buffering active, readfile() can end up holding the entire file in memory before outputting it, so a 40 MB file will fail if the effective limit is just under 40 MB, that kind of thing.
For streaming a file to the user, it can therefore be safer NOT to rely on PHP's "dump file to browser" functions while buffering is on, and instead do an fopen/fread/fclose loop that spits the file out in small, manageable chunks (4 KB, 16 KB, etc.).
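The chunk-safe approach the answer recommends can also be written with stream_copy_to_stream(), which copies in small internal chunks and so never needs the whole file in memory; the function name here is illustrative:

```php
<?php
// Illustrative chunk-safe spool: copies from the file to php://output in
// small internal chunks instead of loading the whole file at once.
function spoolFile(string $path)
{
    $in = fopen($path, 'rb');
    if ($in === false) {
        return false;
    }
    $out = fopen('php://output', 'wb');
    $bytes = stream_copy_to_stream($in, $out);   // returns bytes copied
    fclose($in);
    fclose($out);
    return $bytes;
}
```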
