I'm trying to serve a zip file for download in PHP.
Code:
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="file.zip"');
The downloaded file is only a few bytes long. It contains an error message:
Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 41908867 bytes) in /var/www/common_index/main.php on line 217
I do not wish to increase memory_limit in php.ini. What are alternative ways to serve large zip files properly without tinkering with global settings?
Stream the download, so it doesn't choke on memory.
Tiny example:
// read the file in small chunks so it never sits in memory all at once
$handle = fopen("example.zip", "rb");
while (!feof($handle)) {
    echo fread($handle, 1024);
    flush();
}
fclose($handle);
Add the correct output headers for the download (a full sketch follows below), and that should solve the problem.
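For completeness, a minimal end-to-end sketch combining the loop with the headers from the question; the path and the 8 KB chunk size are illustrative assumptions, not fixed values:

$file = 'example.zip'; // hypothetical path to the archive on disk
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

$handle = fopen($file, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192); // 8 KB per chunk; only one chunk is in memory at a time
    flush();                   // hand each chunk to the web server immediately
}
fclose($handle);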
PHP actually provides an easy way to output a binary file directly to Apache without stashing it in memory first: the readfile() function:
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="file.zip"');
readfile($file);
Related
I wrote a small script and it works fine, but when I try to download a large file (more than 1 GB), the download stops at around ~880 MB.
Am I doing something wrong? Or is there a better solution for downloading big files with PHP?
This is my code:
<?php
set_time_limit(0);
ini_set('memory_limit', '5000M');
// the file being served is 4 GB
$url = 'http://example.com/files/file_name.zip';
$headers = get_headers($url, TRUE); // collect header information from the file URL
$filesize = $headers['Content-Length']; // file size
while (ob_get_level()) ob_end_clean();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=' . basename($url));
header('Content-length: ' . $filesize);
header('Cache-Control: no-cache');
header("Content-Transfer-Encoding: chunked");
ob_flush();
flush();
readfile($url);
exit;
I suspect that you're hitting a memory limit error. Normally readfile() avoids using much memory, but depending on your settings it can consume RAM in the process. Check your error logs to confirm that you're seeing a memory allocation error. If you are, try getting rid of the while loop, the ob_flush(), and the flush(), and just add something like this after the headers:
if (ob_get_level()) {
    ob_end_clean(); // discard the active output buffer so readfile() streams directly
}
readfile($url);
Also, don't pump your memory limit up to 5 GB... if this is working correctly, it should need hardly any memory at all.
If that's not it, can you confirm that you're serving a remotely hosted ZIP file (i.e. you don't have direct access to the ZIP file on the current server)?
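If the ZIP is remote, a rough sketch of relaying it in chunks so your server never holds more than one chunk in memory (this assumes allow_url_fopen is enabled; the URL and chunk size are illustrative):

$url = 'http://example.com/files/file_name.zip';
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=' . basename($url));

$src = fopen($url, 'rb'); // http:// stream wrapper; requires allow_url_fopen
while (!feof($src)) {
    echo fread($src, 8192);
    flush();
}
fclose($src);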
I'm using the following code to force download some mp3 files that are stored on my server. This works fine, but it takes over a minute to download one mp3 file, even for a file that is 2.5 MB. Something seems wrong for it to take that long. Any ideas what I can do to make this download a lot faster?
$fullfilename = $_GET['file']; // note: user input is used directly as a file path
$filename = basename($fullfilename);
header('Content-type: audio/mpeg');
header("Content-Disposition: attachment; filename=\"{$filename}\"");
header("Content-Length: " . filesize($fullfilename));
header('Content-Transfer-Encoding: binary');
ob_clean();
flush();
readfile($fullfilename);
exit;
It depends on the internet connection between the server and your browser; PHP cannot do anything about it. For scale: 2.5 MB in 60 seconds is roughly 42 KB/s, which points at bandwidth or throttling rather than PHP overhead.
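To check where the time actually goes, a rough diagnostic sketch: time the readfile() call on the server and compare it with the total download time the client sees (logging to the error log is just an assumption):

$start = microtime(true);
readfile($fullfilename);
$elapsed = microtime(true) - $start;
// if this is close to the client's total download time, the bottleneck
// is the network path, not PHP
error_log(sprintf('served %s in %.1f s', basename($fullfilename), $elapsed));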
I'm using this script:
http://www.webvamp.co.uk/blog/coding/creating-one-time-download-links/
to allow users to download files (one time). Everything works fine with small files. Now I'm trying to do the same with a larger file, 1.2 GB. Instead of forcing the user to download the file, the script just shows the relative path to the file! Is there a way to modify the script, or is it a fault of the server configuration?
Thanks for the help!
Looking at the code, I think it fails on large files because of the memory limit: the script reads the whole file into memory via file_get_contents() before sending it, so files over 1 GB will exhaust it.
Try replacing the following lines in the download.php script:
//get the file content
$strFile = file_get_contents($strDownload);
//set the headers to force a download
header("Content-type: application/force-download");
header("Content-Disposition: attachment;
filename=\"".str_replace(" ", "_", $arrCheck['file'])."\"");
//echo the file to the user
echo $strFile;
to:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($strDownload));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($strDownload));
ob_clean();
flush();
readfile($strDownload);
This may help.
Note from the manual: readfile() will not present any memory issues, even when sending large files, on its own.
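If readfile() happens to be disabled on the host, a chunked loop is an equivalent fallback; a sketch under that assumption (for a 1.2 GB transfer you may also need set_time_limit(0) so the script isn't killed mid-download):

set_time_limit(0); // assumption: the transfer outlasts max_execution_time
$fp = fopen($strDownload, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192); // 8 KB at a time; memory use stays flat
    flush();
}
fclose($fp);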
I have used the following code to push a generated zip for download:
// push to download the zip
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="'.$zip_name.'"');
readfile($zip_name);
This code normally works fine, but for unknown reasons it was not working until I tried:
// push to download the zip
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="'.$zip_name.'"');
echo file_get_contents($zip_name);
I am curious to find out what is happening in each of the two cases.
readfile() reads the file directly into the output buffer, while file_get_contents() loads the file into memory first; when you echo the result, the data is copied from memory into the output buffer, effectively using twice the memory of readfile().
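A quick way to see the difference for yourself, as a sketch (large.zip is a hypothetical test file; run each variant separately and compare the logged peaks):

$file = 'large.zip'; // hypothetical test file

// variant A: streams straight to the output buffer
readfile($file);

// variant B: builds the whole file as a PHP string first; echo then copies
// it into the output buffer — roughly twice variant A's peak memory
// echo file_get_contents($file);

error_log('peak memory: ' . memory_get_peak_usage(true) . ' bytes');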
I have this code:
$file = $tempDir . "/download.zip";
// there's some omitted code here that creates the file that's to be downloaded
if (file_exists($file) && is_readable($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
} else {
    return "Error: Failed to retrieve file.";
}
The code that generates the file works fine, and after hitting the download button, I see the file appear in its appropriate place, at 1 KB. The file is also usable. When I download it, the browser even says it's "973 bytes". But when the file actually downloads, it's suddenly 9.1 KB and completely corrupted. Why?
Does the omitted code run a background program? If you run a command-line program (exec etc.) to create the zip file, you may be serving an incomplete file.
Upload a zip file that you know works, remove the code that creates the zip, and see if that one is served correctly. If it isn't, let me know and I'll take a look.
Is this Windows or Linux?
Also, the size of the file may vary slightly: Windows adds metadata (created, added, modified, etc.) that can affect the file size.
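One common cause of exactly this symptom, offered only as a guess since the creation code is omitted: if the archive is built with ZipArchive, it is not fully written to disk until close() is called, so serving it too early yields a truncated, corrupted file. A minimal sketch ($sourcePath is hypothetical):

$zip = new ZipArchive();
if ($zip->open($file, ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
    $zip->addFile($sourcePath, basename($sourcePath)); // $sourcePath: file to pack
    $zip->close(); // the zip is only complete on disk after this returns
}
// only now is it safe to readfile($file)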