I have a very simple script for downloading a PDF:
$path = __DIR__ . '/sample_file.pdf';
$pathinfo = pathinfo($path);
$fsize = filesize($path);
$filename = $pathinfo['basename'];
header('Content-Type: application/pdf');
header('Pragma: public');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . $fsize);
header('Pragma: no-cache'); // note: this replaces the Pragma: public header sent above
@readfile($path); // the @ suppresses any readfile() warnings
exit;
This works fine for files smaller than 10mb, but with anything over that the download fails with an error. I have tried a number of browsers and get similar results; in all cases the download fails.
Chrome: Failed: Network error
Firefox: [filename].pdf.part could not be saved, because the source file could not be read.
Opera: Interrupted: Network error
IE (10): [filename].pdf couldn't be downloaded.
I know the file paths are correct, or else it wouldn't work with the smaller files.
Reading the PHP docs on readfile (http://php.net/manual/en/function.readfile.php), some have suggested disabling output_buffering (currently set to 4096 in php.ini) before calling readfile. I have yet to try this, but I'm not convinced it is the solution.
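For reference, the suggested change would look something like this (a sketch I have not tried yet; it discards and disables every active PHP output buffer before streaming):

// discard all nested PHP output buffers so readfile() writes straight through
while (ob_get_level()) {
    ob_end_clean();
}
readfile($path);
exit;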
zlib.output_compression is disabled. I am not seeing any errors in my logs. I am also not seeing any errors in the network pane in Chrome inspector.
I have tried downloading with chunks but get the same result. I have researched similar answers but all seem to be browser specific, i.e. working in some browsers but not others.
Perhaps you are pausing the download, and the target file is being locked. It is possible that your security software is scanning and locking the file, which may still be in the system temporary folder after you pause the download.
Related
In a PHP 7.1 app I load files through a PHP file, "filecache.php"; it works fine in regards to returning a cachable file.
But the thing that drives me crazy is that the images trigger an insecure-content warning in Chrome and FF (perhaps others as well).
The files are loaded using an img tag. I tried a relative URL, which gives me the error; I've also tried using the complete URL https://example.com/filecache?f=0983490842 with the same error.
The server uses HSTS and LetsEncrypt cert - no http traffic allowed.
When I examine the network activity in Chrome (and FF) I can see that the browser tries to retrieve a file using https, finds it in the cache, which gives a 307 internal redirect, but to an http URL, finally ending up with the image loaded over https from the cache. Well, at least that's how I read the network log.
Any input or pointers will be greatly appreciated!
filecache.php
// $file is derived from $_GET["f"] earlier in the script
if (file_exists($file)) {
    if (substr($_GET["f"], -3) == "jpg") header("Content-Type: image/jpeg");
    if (substr($_GET["f"], -3) == "png") header("Content-Type: image/png");
    header('Cache-Control: max-age=' . (60 * 60 * 24 * 365));
    header('Expires: ' . gmdate(DATE_RFC1123, strtotime('+365 days'))); // HTTP dates must be GMT-formatted, not "Y-m-d H:i:s"
    header('Last-Modified: ' . gmdate(DATE_RFC1123, filemtime($file)));
    readfile($file);
} else {
    die("no such file");
}
I'm a novice, so I'll try and do my best to explain a problem I'm having. I apologize in advance if there's something I left out or is unclear.
I'm serving an 81MB zip file outside my root directory to people who are validated beforehand. I've been getting reports of corrupted downloads or an inability to complete the download. I've verified this happening on my machine if I simulate a slow connection.
I'm on shared hosting running Apache-Coyote/1.1.
I get a network timeout error. I think my host might be killing the downloads if they take too long, but they haven't confirmed either way.
I thought I was maybe running into a memory limit or time limit, so my host installed the Apache module XSendFile. My headers in the file that handles the download after validation are being set this way:
<?php
set_time_limit(0);
$file = '/absolute/path/to/myzip/myzip.zip';
header("X-Sendfile: $file");
header("Content-type: application/zip");
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
Any help or suggestions would be appreciated. Thanks!
I would suggest taking a look at this comment:
http://www.php.net/manual/en/function.readfile.php#99406
particularly if you are using Apache. If not, the code in the link above should still be helpful:
I started running into trouble when I had really large files being sent to clients with really slow download speeds. In those cases, the
script would time out and the download would terminate with an
incomplete file. I am dead-set against disabling script timeouts - any
time that is the solution to a programming problem, you are doing
something wrong - so I attempted to scale the timeout based on the
size of the file. That ultimately failed, though, because it was
impossible to predict the speed at which the end user would be
downloading the file; it was really just a best guess, so
inevitably we still got reports of script timeouts.
Then I stumbled across a fantastic Apache module called mod_xsendfile ( https://tn123.org/mod_xsendfile/ (binaries) or
https://github.com/nmaier/mod_xsendfile (source)). This module
basically monitors the output buffer for the presence of special
headers, and when it finds them it triggers apache to send the file on
its own, almost as if the user requested the file directly. PHP
processing is halted at that point, so no timeout errors regardless of
the size of the file or the download speed of the client. And the end
client gets the full benefits of Apache sending the file, such as an
accurate file size report and download status bar.
The code I finally ended up with is too long to post here, but in general it uses the mod_xsendfile module if it is present, and if not
the script falls back to using the code I originally posted. You can
find some example code at https://gist.github.com/854168
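In outline, the detect-and-fallback approach might look like this (my sketch, not the gist's actual code; the helper name send_download is mine). Note that mod_xsendfile must be enabled with XSendFile On, and XSendFilePath must whitelist any directory outside the document root:

<?php
function send_download($file) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    // Use mod_xsendfile when Apache reports it as loaded.
    if (function_exists('apache_get_modules')
            && in_array('mod_xsendfile', apache_get_modules())) {
        header('X-Sendfile: ' . $file);
    } else {
        // Fall back to streaming the file from PHP.
        header('Content-Length: ' . filesize($file));
        readfile($file);
    }
    exit;
}
?>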
EDIT
Just to have a reference for code that does the "chunking" (link to the original code):
<?php
function readfile_chunked($filename, $type = 'array') {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $lines = ($type == 'array') ? array() : '';
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        switch ($type) {
            case 'array':
                // Returns a lines array, like file()
                $lines[] = fgets($handle, $chunksize);
                break;
            case 'string':
                // Returns one string, like file_get_contents()
                $lines .= fread($handle, $chunksize);
                break;
        }
    }
    fclose($handle);
    return $lines;
}
?>
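Usage would then look like this (the file name is just an example):

$contents = readfile_chunked('/path/to/large_file.bin', 'string');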
When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to really execute /sermon.php?filename={filename} and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming out the file with a readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine is in use until the file is fully downloaded, but I cannot find any references which confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world, and the .NET equivalent of readfile() pushes the whole file to the IIS output buffer immediately so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client. Is there an equivalent to this behavior with PHP + Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way, the only remaining output buffer should be Apache's output buffer; see SendBufferSize for Apache tweaks.
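For example, in httpd.conf (the value is only illustrative):

# Size in bytes of the TCP send buffer Apache asks the OS for
SendBufferSize 1048576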
EDIT
You can also have a look at mod_xsendfile (an SO post on such usage: PHP + apache + x-sendfile), so that you simply tell the web server you have done the security check and that it can now deliver the file.
A few things you can do (I am not repeating all the headers you need to send; they are probably the same ones you currently have in your script):
set_time_limit(0); //as already mention
readfile($filename);
exit(0);
or
passthru('/bin/cat ' . escapeshellarg($filename)); // escape the path before handing it to the shell
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
// mainly useful for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // Stop sending if the user closed the connection mid-download.
    if (connection_aborted()) {
        break;
    }
    echo fread($handle, 8192);
    flush();
}
fclose($handle);
The script will be running until the user finishes downloading the file. The simplest, most efficient and surely working solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea to password-protect files that may not be downloaded by everyone with an .htaccess file anyway, but perhaps you use a database to determine access, and then this is not an option.
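For reference, such .htaccess protection might look like this (all paths are illustrative):

AuthType Basic
AuthName "Protected downloads"
AuthUserFile /path/outside/webroot/.htpasswd
Require valid-user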
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also, with output buffering active, PHP may pull the file into memory first; it then goes through Apache's output buffer and finally makes it to the network. Making users download the file directly is much more efficient, and avoids PHP's limitations such as the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'w');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'rb'); // binary mode, to be safe on Windows
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution not involving PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download with the following contents:
<Files *.*>
ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
See the example at the URL below:
http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($file));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
readfile($file);
exit;
}
?>
A PHP application is offering binary data as a download:
header("Content-Type: application/octet-stream");
header("Pragma: public");
header("Cache-Control: private");
header("Content-Disposition: attachment; filename=\"$filename\"");
header("expires: 0");
set_time_limit(0);
ob_clean();
flush();
@readfile($completefilename); exit;
$completefilename is a stream URL like "ftp://user:pwd@..."
The size of the data can be several MB. It works fine, but sporadically I get an error.
It's most likely that the remote stream is occasionally down, or times out.
Also as #fab says it could be that the file you are trying to load is larger than your script's memory.
You should start logging the errors readfile() returns, e.g. using the error_log php.ini directive.
If this needs to be completely foolproof, I think you'll have to use something more refined than readfile() that allows setting a timeout (like curl, or readfile() with stream context options).
You could then catch any errors that occur while downloading, and serve a locally hosted fallback document instead. That document could e.g. be a text file containing the message "Resource xyz could not be loaded".
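A sketch of that idea, assuming the stream is reachable over the http wrapper (timeout is a real stream context option; the fallback file name is made up):

$context = stream_context_create(array(
    'http' => array('timeout' => 30), // seconds before the read gives up
));
if (@readfile($completefilename, false, $context) === false) {
    // Serve a locally hosted fallback document instead.
    readfile('/path/to/fallback-message.txt');
}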
Do you have anything in your error logs?
Maybe PHP is running out of memory because readfile() can end up pulling the whole file into memory when output buffering is active. Make sure memory_limit is larger than the largest file you work on with readfile(). Another option is to output the file in chunks using fread(), as sketched below.
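A minimal sketch of the chunked approach (assuming $file holds the path and the headers have already been sent):

$handle = fopen($file, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192); // 8 KB per chunk keeps memory usage flat
    flush();
}
fclose($handle);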
I wonder if you guys can come up with some good solutions for my problem; the normal way is not working!
I'm trying to force-download any file on any website with the following PHP script. I just pass ?p=http://www.whatever.com/images/flowers/rose.jpg to my URL and the download prompt appears.
<?php
error_reporting(E_ALL);
if (isset($_GET['p'])) {
    $path = $_GET['p'];
} else {
    die("no path set!"); // stop here instead of continuing with an undefined $path
}
$file = $path;
//header('Location: ' . $file); // works perfectly, opens the file in the browser
header("Cache-Control: no-cache");
header("Expires: -1");
header("Content-Type: application/octet-stream;");
header("Content-Disposition: attachment; filename=\"" . basename($file) . "\";");
header("Content-Transfer-Encoding: binary");
header("Content-Length: " . filesize($file));
readfile($file); // readfile() already echoes the file; echoing its return value would append the byte count
?>
However, as I found out today, filesize() just works with local files on my server, not with an HTTP request. The same applies to readfile():
Warning: filesize() [function.filesize]: stat failed for pathtofiles…/downloader/d.php on line 15
Warning: readfile(sphotos.ak.fbcdn.net/hphotos-ak-snc4/hs211.snc4/…) [function.readfile]: failed to open stream: HTTP request failed! HTTP/1.0 403 Forbidden in pathtofiles…/downloader/d.php on line 16
I wonder if there are creative coders out there who can help me out here. Is there any chance of making this script work? I just want to force-download whatever file and URL you pass along.
Thank you in advance,
regards Matt
filesize() and readfile() can work with some protocols, if your PHP settings allow it.
The problem at hand is a more fundamental one, though:
HTTP request failed! HTTP/1.0 403 Forbidden in pathtofiles....
Remember that when fetching a file using PHP, the script does not have the user's login permissions available. The script instance is acting like an independent browser. If a resource is password-protected, you need to make your PHP script log in first.
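If the remote server uses HTTP Basic authentication, for example, the log-in could be done through a stream context (a sketch; $user, $pass and $url are placeholders):

$context = stream_context_create(array(
    'http' => array(
        // send the credentials the remote server expects
        'header' => 'Authorization: Basic ' . base64_encode("$user:$pass"),
    ),
));
readfile($url, false, $context);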
You cannot 'force-download' a file that is NOT under your control (such as a remote file). The 'force-download' headers tell the browser to download the file that is about to be transmitted. Location: some-path tells the browser to look at the new location, thus loading the new location, not your current page.
One option, but not an optimal one, would be to make a local copy of the file (at your server), then deliver it to the user. If the file is big enough, this will give the impression of a frozen page. To avoid this, you can read chunks of the remote file and deliver them after each read, as sketched below.
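A sketch of that chunked proxying (assuming allow_url_fopen is enabled and $file has already been validated against a whitelist, see the next point):

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
$in = fopen($file, 'rb');
while (!feof($in)) {
    echo fread($in, 8192); // send each chunk to the client as it arrives
    flush();
}
fclose($in);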
Be aware that your code does not restrict what $file can point to, thus allowing users to download virtually any readable file on the server; this is a security flaw.
The best way is to store the file size in your database, if you have control over the remote files, and then use the readfile() function.
From the PHP manual:
readfile(): You can use a URL as a filename with this function if the fopen wrappers have been enabled.
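So a guarded call might look like this (a sketch; the URL is the example from the question):

if (ini_get('allow_url_fopen')) {
    readfile('http://www.whatever.com/images/flowers/rose.jpg');
}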