A PHP application is offering binary data as a download:
header("Content-Type: application/octet-stream");
header("Pragma: public");
header("Cache-Control: private");
header("Content-Disposition: attachment; filename=\"$filename\"");
header("expires: 0");
set_time_limit(0);
ob_clean();
flush();
readfile($completefilename); exit;
$completefilename is a stream like "ftp://user:pwd#..."
The size of the data can be several megabytes. It works fine, but sporadically I get the following error:
It's most likely that the remote stream is occasionally down, or times out.
Also, as #fab says, it could be that the file you are trying to load is larger than your script's memory limit.
You should start logging the errors readfile() returns, e.g. using the error_log php.ini directive.
If this needs to be completely foolproof, I think you'll have to use something more refined than a bare readfile() call, something that allows you to set a timeout (like curl, or readfile() with stream context options).
You could then catch any errors that occur while downloading, and serve a locally hosted fallback document instead. That document could e.g. be a text file containing the message "Resource xyz could not be loaded".
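For example, a minimal sketch of that idea (fallback.txt is a hypothetical locally hosted placeholder and the timeout values are assumptions; for http:// streams a per-request timeout can go in a stream context, while for other wrappers like ftp:// the default_socket_timeout setting applies):
ini_set('default_socket_timeout', 30); // connect/read timeout in seconds
$ctx = stream_context_create(array('http' => array('timeout' => 30)));
if (@readfile($completefilename, false, $ctx) === false) {
    // remote stream failed: log it and serve the local fallback document instead
    error_log("Could not read $completefilename");
    readfile(__DIR__ . '/fallback.txt');
}
exit;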
Do you have anything in your error logs?
Maybe PHP is running out of memory because readfile() ends up pulling the whole file into memory (for example when output buffering is active). Make sure memory_limit is larger than the largest file you serve with readfile(). Another option is to output the file in chunks using fread().
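A rough sketch of that chunked approach (assuming $file holds the path of the file being served and the headers have already been sent):
$handle = fopen($file, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192); // send the file 8 KB at a time
    flush();                   // push each chunk to the client instead of buffering it
}
fclose($handle);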
Related
So my company stores all PDF files in Amazon S3 privately.
When the user requests one, our system pulls it from Amazon S3 and then serves it to the user with the following code:
header("Cache-Control: public");
header("Pragma: public");
header("Expires: 0");
header("Content-Description: File Transfer");
header('Content-Disposition: attachment; filename="'.$fileName.'"');
header('Content-Length: ' . strlen($res->body));
header("Content-type: application/pdf");
header("Content-Transfer-Encoding: binary");
header('Connection: close');
echo $res->body;
$res is the response returned from Amazon, with the content in $res->body.
I see random slow download speeds when users try to download the PDF files, especially when the PDF is large (~5 MB) compared to the rest, which are only 800 KB-1.5 MB.
Solution tried:
1) Removing the Content-Length header doesn't help.
2) Removing EnableSendfile off from httpd.conf doesn't help either.
I also checked the server to make sure it wasn't the workload of the server that's causing this.
The speed test of both the server and user's workstation looks good too.
Does anyone have any idea what is causing this slowness?
It might be that multiple people from your company are using the internet connection at the same time.
You should do a full scan of the network, but even then you're not alone on the internet, and their "virtual" server environments are shared as well.
From my understanding, the issue is that it simply takes time to fetch the file from S3 in order to return it to the user.
Do yourself a favor:
create a signed URL valid for a short period of time
redirect the user to that URL
don't worry: creating signed URLs doesn't expose any private information that compromises your security (if you do it correctly)
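A minimal sketch of that approach, assuming the AWS SDK for PHP v3, with $s3 being a configured Aws\S3\S3Client and $bucket/$key pointing at the private PDF:
$cmd = $s3->getCommand('GetObject', array(
    'Bucket' => $bucket,
    'Key'    => $key,
    'ResponseContentDisposition' => 'attachment; filename="' . $fileName . '"',
));
$request = $s3->createPresignedRequest($cmd, '+10 minutes');
// redirect the user straight to S3; the signed URL expires after 10 minutes
header('Location: ' . (string) $request->getUri());
exit;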
At the current time I'm using this code to output a file to a user.
header('Content-type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$or.'"');
readfile($file);
The code, however, doesn't tell the browser how large the file is, and it can't output large files like 1 GB. I want the code to tell the browser the actual size of the file and to be able to output large files.
For large files, you need to use chunked transfer. In most cases the underlying web server (Apache/Nginx/whatever) will have facilities to do that; I recommend you use them.
That way your code does not tie up a worker thread for ages, and the runaway timer will not kick in in the middle of the download (which would upset your users).
By the way, you are talking about a file download, not an upload (that would be user to server), so your tag is wrong.
readfile() will not present any memory issues on its own, even when sending large files. If you encounter an out of memory error, ensure that output buffering is off by checking ob_get_level().
To tell the browser the actual file size, send a Content-Length header:
header("Content-Length: $value");
When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to really execute /sermon.php?filename={filename} and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming out the file with a readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine is in use until the file is fully downloaded, but I cannot find any references which confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world, and the .NET equivalent of readfile() pushes the whole file to the IIS output buffer immediately so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client... is there an equivalent to this behavior with PHP + Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way the only remaining output buffer should be Apache's output buffer; see SendBufferSize for Apache tweaks.
EDIT
You can also have a look at mod_xsendfile (there is an SO post on such usage, PHP + Apache + X-Sendfile), so that you simply tell the web server you have done the security check and that it can now deliver the file.
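The Apache side of mod_xsendfile is just a couple of directives (a sketch; the path is a placeholder for wherever the mp3 archive actually lives). The PHP script then only performs the access check and sends the X-Sendfile header, as shown in the snippets below:
# httpd.conf or vhost config, with mod_xsendfile installed
XSendFile On
XSendFilePath /var/www/sermons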
A few things you can do (I am not repeating all the headers you need to send; they are probably the same ones you currently have in your script):
set_time_limit(0); // as already mentioned
readfile($filename);
exit(0);
or
passthru('/bin/cat ' . escapeshellarg($filename));
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
// mainly useful for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // check whether the user is still downloading or has closed the connection
    if (connection_aborted()) {
        break;
    }
    echo fread($handle, 8192);
    flush();
}
fclose($handle);
The script will be running until the user finishes downloading the file. The simplest, most efficient and surely working solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea to password-protect the files that may not be downloaded by everyone with an .htaccess file anyway, but perhaps you use a database to determine access, and then this is not an option.
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also, PHP reads the file into memory first, then it goes through Apache's output buffer, and only then does it make it to the network. Making users download the file directly is much more efficient, and does not have PHP's limitations like the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'w');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'rb');
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution not involving PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download with the following contents:
<Files *.*>
    ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
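If you also want the browser to receive an explicit attachment disposition (so it always shows a save dialog), something like the following should work too, assuming mod_headers is enabled:
<Files *.*>
    ForceType application/octet-stream
    Header set Content-Disposition attachment
</Files>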
See the example on the PHP manual page for readfile():
http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
I am currently trying to develop a PHP application in which my server downloads a file and the user can do the same almost simultaneously. I have already thought about the problem "What if the user downloads faster than the server...", but it's not a problem at this moment.
To do so, I used the header() and readfile() functions of PHP. Here is my code:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
I need to use the Content-Length header to set the proper size of the file, not the amount that has already been downloaded when the user clicks on the link. However, after some seconds or minutes, the download stops and I need to restart it...
If you can think of a solution, even one that doesn't use the header() function, please tell me.
Thank you in advance...
In my experience, this is directly related to maximum runtime settings, which are enforced upon you if you run with safe_mode on.
If you have the option, try calling set_time_limit(0) and see if that makes it work.
If you have your own server, you should look into the mod_xsendfile module for Apache, since it is built specifically to send large files to the user.
Oh, and it's stupidly easy to use:
header("X-Sendfile: $path_to_somefile");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$somefile\"");
exit;
I wrote a script to force downloading of MP3 files from a site. The code works fine, but the problem is that it can't download large files. I tried it with a 9.21 MB file and it downloaded correctly, but whenever I try to use the code to download a 25 MB file, it simply gives me a "cannot find server" page or "The website cannot display the page". So I now know it has problems downloading large files. Below is the code snippet that does the downloading of files.
header("Pragma: public");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private",false);
header("Content-type: application/force-download");
header("Content-Disposition: attachment; filename=\"".$dname.".mp3\";" );
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($secretfile));
$downloaded=readfile($secretfile);
The displayed error is: HTTP 500 Internal Server Error
Thank you very much for your time, guys.
It could be memory limits, but usually PHP will output an error saying that the memory limit has been reached.
Also, before all of that you should disable output compression if it's enabled:
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}
Sometimes IE can screw up if output compression is enabled.
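A small sketch of that check, which also asks mod_deflate to skip the request (apache_setenv() only exists when PHP runs as an Apache module, so treat that part as an assumption about your setup):
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1'); // tell mod_deflate not to compress this response
}
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}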
Watch your PHP configuration for memory limits and timeouts
In php.ini :
memory_limit = 32M
max_execution_time = 300
Note that if you want to go really high in execution time you also need to change your web server timeout.
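For Apache, for example, that would be the Timeout directive in httpd.conf (value in seconds; 600 here is just an illustrative figure):
Timeout 600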
"it simply gives me a cannot find server page or The website cannot display the page"
Is this the error as displayed by Internet Explorer? Do you get any server-side errors? Did you check your server logs?
Try this:
// empty output buffer
while (ob_get_level()) {
    ob_end_clean();
}
if (ini_get('output_buffering')) {
    ini_set('output_buffering', 'Off');
}
// function to encode quoted-string tokens
function rfc2822_quoteString($string) {
    return '"'.preg_replace('/[^\x00-\x0C\x0E-\x21\x23-\x5B\x5D-\x7F]/', '\\\$0', $string).'"';
}
// HTTP headers
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.rfc2822_quoteString($dname.'.mp3'));
header('Content-Length: '.filesize($secretfile));
// send file
readfile($secretfile);
exit;