I wrote a small script and it works fine. However, when I try to download a large file (more than 1GB), the download stops at around ~880MB.
Am I doing something wrong? Or is there a better solution for downloading big files with PHP?
This is my code
<?php
set_time_limit(0);
ini_set('memory_limit', '5000M');
//This file is 4GB in size
$url = 'example.com/files/file_name.zip';
$headers = get_headers($url, TRUE); //collecting header information from the file URL
$filesize = $headers['Content-Length']; //File size
while (ob_get_level()) ob_end_clean();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . basename($url) . '"');
header('Content-length: ' . $filesize);
header('Cache-Control: no-cache');
header("Content-Transfer-Encoding: chunked");
ob_flush();
flush();
readfile($url);
exit;
I suspect that you're hitting a memory limit error. Normally readfile() avoids using much memory, but depending on your settings it can consume RAM in the process. Check your error logs to confirm that you're seeing a memory allocation error. If you are, try getting rid of the while loop, ob_flush(), and flush(), and just add something like this after the headers:
if (ob_get_level()) {
    ob_end_clean();
}
readfile($url);
Also, don't pump your memory limit up to 5GB... if this is working correctly, it should hardly require any memory at all.
If that's not it, can you confirm that you're serving a remotely hosted ZIP file (i.e. you don't have direct access to the ZIP file on the current server)?
I'm using the following code to force the download of some mp3 files that are stored on my server. This works, but it takes over a minute to download one mp3 file, even a file that is only 2.5MB. Something seems wrong for it to take that long. Any ideas what I can do to make this download a lot faster?
$fullfilename = $_GET['file']; // NB: unvalidated user input -- sanitize this in production
$filename=basename($fullfilename);
header('Content-type: audio/mpeg');
header("Content-Disposition: attachment; filename=\"{$filename}\"");
header("Content-Length: " . filesize($filename));
header('Content-Transfer-Encoding: binary');
ob_clean();
flush();
readfile($fullfilename);
exit;
It depends on the internet connection between the server and your browser. PHP cannot do anything about it.
I'm using jQuery with PHP.
I've written a simple download function with PHP:
function downloadFile($sFile) {
    // Send download headers, then stream the file to the client
    header('Content-Type: ' . mime_content_type($sFile));
    header('Content-Description: File Transfer');
    header('Content-Length: ' . filesize($sFile));
    header('Content-Disposition: attachment; filename="' . basename($sFile) . '"');
    readfile($sFile);
}
I can download a file through this script, but with large files (like 1GB) the readfile() function takes its time before the download starts, so I have to wait a minute or so until the download really begins.
Any idea how to optimize my script so that the download starts immediately?
You could configure Apache to set the proper headers in your .htaccess file. Then, you could link directly to the file instead of the PHP page. This will also reduce server load.
Of course, if the PHP script performs functions other than just setting headers (such as authentication), then this is not an option. You will have to pass the file through PHP in chunks, as N.B. mentions in his comment; a sketch of that approach follows.
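For illustration, a minimal chunked pass-through might look like this (a sketch only; $file is a placeholder for your already-validated path, and the 8192-byte chunk size is an arbitrary choice):
<?php
// Minimal chunked pass-through sketch -- assumes $file has already been
// validated by your authentication/authorization logic.
while (ob_get_level()) ob_end_clean(); // disable PHP output buffering

$handle = fopen($file, 'rb');
if ($handle === false) {
    http_response_code(404);
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
while (!feof($handle)) {
    echo fread($handle, 8192); // send a small chunk...
    flush();                   // ...and push it to the client immediately
}
fclose($handle);
exit;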
When using readfile() -- using PHP on Apache -- is the file immediately read into Apache's output buffer and the PHP script execution completed, or does the PHP script execution wait until the client finishes downloading the file (or the server times out, whichever happens first)?
The longer back-story:
I have a website with lots of large mp3 files (sermons for a local church). Not all files in the audio archive are allowed to be downloaded, so the /sermon/{filename}.mp3 path is rewritten to really execute /sermon.php?filename={filename} and if the file is allowed to be downloaded then the content type is set to "audio/mpeg" and the file streamed out using readfile(). I've been getting complaints (almost exclusively from iPhone users who are streaming the downloads over 3G) that the files don't fully download, or that they cut off after about 10 or 15 minutes. When I switched from streaming out the file with a readfile() to simply redirecting to the file -- header("Location: $file_url"); -- all of the complaints went away (I even checked with a few users who could reliably reproduce the problem on demand previously).
This leads me to suspect that when using readfile() the PHP script engine is in use until the file is fully downloaded, but I cannot find any references that confirm or deny this theory. I'll admit I'm more at home in the ASP.NET world, where the equivalent of readfile() pushes the whole file to the IIS output buffer immediately so the ASP.NET execution pipeline can complete independently of the delivery of the file to the end client. Is there an equivalent to this behavior with PHP+Apache?
You may still have PHP output buffering active while performing the readfile(). Check that with:
if (ob_get_level()) ob_end_clean();
or
while (ob_get_level()) ob_end_clean();
This way, the only remaining output buffer should be Apache's output buffer; see the SendBufferSize directive for Apache tweaks.
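For example (a sketch only; the 1MB value is an arbitrary illustration, and this directive belongs in the server or virtual host configuration, not in .htaccess):
# httpd.conf: enlarge the TCP send buffer so Apache can accept more of
# the response at once and release the PHP process sooner
SendBufferSize 1048576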
EDIT
You can also have a look at mod_xsendfile (see this SO post on such usage: PHP + apache + x-sendfile), so that you simply tell the web server that you have done the security check and that it can now deliver the file.
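The pattern is roughly the following (a sketch assuming mod_xsendfile is installed and configured; user_may_download() is a hypothetical stand-in for your own access check):
<?php
if (!user_may_download($file)) { // hypothetical check -- replace with your own logic
    http_response_code(403);
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
// Hand the transfer off to Apache; the PHP process finishes immediately.
header('X-Sendfile: ' . $file);
exit;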
A few things you can do (I am not listing all the headers you need to send; those are probably the same ones you currently have in your script):
set_time_limit(0); // as already mentioned
readfile($filename);
exit(0);
or
passthru('/bin/cat ' . escapeshellarg($filename)); // escape the path for the shell
exit(0);
or
//When you enable mod_xsendfile in Apache
header("X-Sendfile: $filename");
or
// mainly useful for remote files
$handle = fopen($filename, "rb");
echo stream_get_contents($handle);
fclose($handle);
or
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // stop sending if the client has closed the connection
    if (connection_aborted()) {
        break;
    }
    echo fread($handle, 8192);
    flush();
}
fclose($handle);
The script will keep running until the user finishes downloading the file. The simplest, most efficient, and most reliable solution is to redirect the user:
header("Location: /real/path/to/file");
exit;
But this may reveal the location of the files. It's a good idea to password-protect files that may not be downloaded by everyone with an .htaccess file anyway (a minimal example follows), but perhaps you use a database to determine access, and then this is not an option.
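For completeness, a minimal Basic-auth setup could look like this (the paths and realm name are placeholders):
# .htaccess in the protected directory
AuthType Basic
AuthName "Protected downloads"
AuthUserFile /path/to/.htpasswd
Require valid-user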
Another possible solution is setting the maximum execution time of PHP to 0, which disables the limit:
set_time_limit(0);
Your host may disallow this, though. Also, PHP reads the file into memory first, then it goes through Apache's output buffer, and only then does it make it to the network. Letting users download the file directly is much more efficient, and it does not suffer from PHP's limitations like the maximum execution time.
Edit: The reason you get this complaint a lot from iPhone users is probably that they have a slower connection (e.g. 3G).
Downloading files through PHP isn't very efficient; using a redirect is the way to go. If you don't want to expose the location of the file, or the file isn't in a public location, then look into internal redirects. Here is a post that talks about it a bit: Can I tell Apache to do an internal redirect from PHP?
Try using stream_copy_to_stream() instead. I find it has fewer problems than readfile().
set_time_limit(0);
$stdout = fopen('php://output', 'w');
$bfname = basename($fname);
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$bfname\"");
$filein = fopen($fname, 'rb'); // binary mode for portability
stream_copy_to_stream($filein, $stdout);
fclose($filein);
fclose($stdout);
Under Apache, there is a nice, elegant solution not involving PHP at all:
Just place an .htaccess config file into the folder containing the files to be offered for download, with the following contents:
<Files *.*>
    ForceType application/octet-stream
</Files>
This tells Apache to offer all files in this folder (and all its subfolders) for download, instead of displaying them directly in the browser.
See the example on the readfile() manual page:
http://php.net/manual/en/function.readfile.php
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>
I am currently trying to develop a PHP application in which my server downloads a file and the user can download it almost simultaneously. I have already thought about the problem of the user downloading faster than the server, but that's not an issue at the moment.
To do so, I used the header() and readfile() functions of PHP. Here is my code:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
I have to use the Content-Length header to set the proper size of the file, not the size that has already been downloaded when the user clicks the link. However, after some seconds or minutes, the download stops and I need to restart it...
If you can think of a solution, even one that doesn't use the header() function, please tell me.
Thank you in advance...
I have experienced that this is directly related to maximum runtime settings, which are enforced on you if you run with safe_mode on.
If you have the option, try setting set_time_limit(0) and see if that makes it work.
If you have your own server, you should look into the mod_xsendfile module for Apache, since it is built specifically to send large files to the user.
Oh, and it's stupidly easy to use:
header("X-Sendfile: $path_to_somefile");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"$somefile\"");
exit;