I'm working on a project that should create a link to download an XPI file.
This is my PHP code to download the file:
$fileName = 'file.xpi';
$size = filesize($fileName);
$fp = fopen($fileName, "rb");
$content = fread($fp, $size);
fclose($fp);
header("Content-length: ".$size);
header("Content-type: application/x-xpinstall");
header("Content-disposition: attachment; filename=".$fileName.";" );
echo $content;
This code works fine, but the downloaded file is the problem. When I run it, Firefox shows this error message:
This add-on could not be installed because it appears to be corrupt.
Both files (the one on the server and the downloaded one) report the same information.
File on server
Size: 14.7 MB (15,509,809 bytes)
Size on disk: 14.7 MB (15,511,552 bytes)
Downloaded file
Size: 14.7 MB (15,509,809 bytes)
Size on disk: 14.7 MB (15,511,552 bytes)
What should I do?
Open the downloaded file in a text editor such as Notepad and check whether any plain-text PHP errors appear at the beginning of the file.
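If you prefer to check this programmatically, here is a minimal sketch (the path is just a placeholder): an XPI is a ZIP archive, so a valid download should start with the "PK" signature rather than with HTML or a PHP warning.
// Sketch only - 'downloaded.xpi' is a placeholder for the file you saved from the browser.
$fp = fopen('downloaded.xpi', 'rb');
$magic = fread($fp, 2);
fclose($fp);

if ($magic === "PK") {
    echo "Looks like a valid ZIP/XPI header\n";
} else {
    echo "Unexpected leading bytes - probably a notice, warning or whitespace emitted before the headers\n";
}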
Related
I have generated a large CSV file (100 MB+) and uploaded it to AWS S3. The file is compressed. Now I need to serve the file to users as a download, but I'm running into an out-of-memory problem.
Please see my code below:
header('Content-Type: text/csv; charset=utf-8');
header("Content-Disposition: attachment; filename=example.csv");
$result = $this->loadFromS3(); //This returns the file from AWS S3
echo bzdecompress($result['Body']);
How can I adjust my code to avoid exhausting memory?
Simply put, don't store the whole body in $result; stream the file instead. Use the Amazon S3 stream wrapper:
$client->registerStreamWrapper();

// Open a stream in read-only mode
if ($stream = fopen('s3://bucket/test.bz2', 'r')) {
    stream_filter_append($stream, 'bzip2.decompress', STREAM_FILTER_READ);
    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream
        echo fread($stream, 1024);
    }
    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}
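For context, the wrapper comes from the AWS SDK for PHP; a rough sketch of the setup around that loop (region, bucket and key are placeholders for your own values, and the headers are the ones from the question) might look like this:
require 'vendor/autoload.php';

// Assumed setup - replace region, bucket and key with your own values.
$client = new Aws\S3\S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);
$client->registerStreamWrapper();

// Send the download headers before echoing any file data.
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename=example.csv');

if ($stream = fopen('s3://my-bucket/example.csv.bz2', 'r')) {
    stream_filter_append($stream, 'bzip2.decompress', STREAM_FILTER_READ);
    while (!feof($stream)) {
        echo fread($stream, 8192); // stream to the client in small chunks
    }
    fclose($stream);
}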
What I'm trying to do is iterate through a list of links and zip all of the files into a single archive. The files include images, PDFs, audio, and video. The issue is that some of the files are large and I get the following error:
FatalErrorException: Error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 46512453 bytes)...
Here's what I have right now:
$zip = new ZipArchive();
$tmp_file = tempnam('.','');
$zip->open($tmp_file, ZipArchive::CREATE);
foreach ($mediaFiles as $file) {
    $downloadFile = file_get_contents($file);
    $zip->addFromString(basename($file), $downloadFile);
}
$zip->close();
header('Content-disposition: attachment; filename=download.zip');
header('Content-type: application/zip');
readfile($tmp_file);
Check this library; it allows creating ZIP files and returning them as a stream:
http://www.phpclasses.org/package/6110-PHP-Create-archives-of-compressed-files-in-ZIP-format.html
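If you'd rather stay with ZipArchive, another sketch (reusing $mediaFiles from the question; the temp-file handling is my assumption) is to stream each remote file to a temporary file and add it with addFile(), so nothing is held fully in memory by file_get_contents():
$zip = new ZipArchive();
$tmp_zip = tempnam(sys_get_temp_dir(), 'zip');
$zip->open($tmp_zip, ZipArchive::CREATE);

$tmp_files = [];
foreach ($mediaFiles as $file) {
    // Copy the remote file to disk in chunks instead of loading it into memory.
    $tmp = tempnam(sys_get_temp_dir(), 'dl');
    $src = fopen($file, 'rb');
    $dst = fopen($tmp, 'wb');
    stream_copy_to_stream($src, $dst);
    fclose($src);
    fclose($dst);

    $zip->addFile($tmp, basename($file));
    $tmp_files[] = $tmp; // must stay on disk until the archive is closed
}

$zip->close();

header('Content-disposition: attachment; filename=download.zip');
header('Content-type: application/zip');
readfile($tmp_zip);

// Clean up the temporary files.
foreach ($tmp_files as $tmp) {
    unlink($tmp);
}
unlink($tmp_zip);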
How can I figure out the issue when creating a zip archive from a 2 GB file?
Error
file_get_contents(): content truncated from 2147483648 to 2147483647 bytes
Fatal error: Out of memory (allocated 2151677952) (tried to allocate 18446744071562067968 bytes) in
I am using a dedicated server and have already set memory_limit, max_execution_time, upload_max_filesize, and post_max_size, but it is not working for me. Please check my code and let me know what I am doing wrong:
# create new zip object
$zip = new ZipArchive();
# create a temp file & open it
$tmp_file = tempnam('.','');
$zip->open($tmp_file, ZipArchive::CREATE);
# loop through each file
foreach ($files as $file) {
    # download file
    $download_file = file_get_contents($file_path.'/'.$file);
    # add it to the zip
    $zip->addFromString(basename($file_path.'/'.$file), $download_file);
}
# close zip
$zip->close();
$zip_name = $last_seg.'.zip';
# send the file to the browser as a download
header("Content-disposition: attachment; filename=$zip_name");
header('Content-type: application/zip');
readfile($tmp_file);
I changed $zip->addFromString() to $zip->addFile(), because you don't need to read a file's contents just to add it to the archive. I tested your code with 3 movies and it didn't work (I got the same error), but when I used $zip->addFile() everything went fine and I could download a 3 GB zip file.
I also needed to use set_time_limit(0);
If you want to test this code, just change the values of:
$files // Array of file names
$file_path // Path where your files ($files) are located
$last_seg // The name of your zip file
<?php
set_time_limit(0);
$files = array('Exodus.mp4', 'the-expert.webm', 'what-virgin-means.webm');
$file_path = 'zip';
$last_seg = 'test';
$zip = new ZipArchive();
# create a temp file & open it
$tmp_file = tempnam('.','');
$zip->open($tmp_file, ZipArchive::CREATE);
# loop through each file
foreach ($files as $file) {
    $zip->addFile($file_path.'/'.$file, $file);
}
# close zip
$zip->close();
$zip_name = $last_seg.'.zip';
# send the file to the browser as a download
header("Content-disposition: attachment; filename=$zip_name");
header('Content-type: application/zip');
readfile($tmp_file);
?>
You can read more at:
http://php.net/manual/en/ziparchive.addfile.php
You'll never be able to allocate more memory than PHP_INT_MAX allows. The 64-bit Linux builds of PHP might handle this if file_get_contents() isn't internally limited to a signed 32-bit int, but on Windows or on a 32-bit system you have no chance of achieving this without streaming.
Something like this might work (not tested yet):
$fr = fopen("http://...", "r");
$fw = fopen("zip://c:\\test.zip#test", "w");

while (!feof($fr)) {
    $buffer = fread($fr, 8192);
    fwrite($fw, $buffer, strlen($buffer));
}

fclose($fr);
fclose($fw);
OK, my bad: apparently PHP does not provide a write mode for the zip:// stream wrapper... Your remaining options are then to write the whole file to a temp file (by streaming it as I did, no file_get_contents()) before handing it to an external program (with a system() or popen() call...), to use another compression format (apparently PHP supports write stream operations for zlib and bzip2), or to use an external library for PHP.
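For the zlib route, a rough sketch (URL and output path are placeholders) using PHP's compress.zlib:// wrapper, which does support write streams:
// Stream a remote file straight into a gzip archive without buffering it in memory.
$fr = fopen('http://example.com/bigfile.bin', 'rb');
$fw = fopen('compress.zlib://bigfile.bin.gz', 'wb');

while (!feof($fr)) {
    fwrite($fw, fread($fr, 8192));
}

fclose($fr);
fclose($fw);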
Try putting this line at the beginning of your code:
ini_set("memory_limit", -1);
Refer to this question
Fatal error: Out of memory (allocated 1134559232) (tried to allocate 32768 bytes) in X:\wamp\www\xxx
It seems I successfully create an on-the-fly zip archive (filesize and file_exists both return the expected values) but when I attempt to actually download it, I receive an empty ZIP file.
Curiously enough, this error occurs with both readfile and fread. This is my code:
$filename = $zip;
$handle = fopen($filename, 'r');
if ($handle === false)
{
    die('Could not read file "' . $filename . '"');
}
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="fsdownload.zip"');
header('Cache-Control: private');
while (!feof($handle))
{
    echo fread($handle, 8192);
}
fclose($handle);
This works fine for zip files < 10 MB. Any thoughts on what the problem might be?
To avoid consuming too much memory, you can use ZipStream or PHPZip, which will send zipped files on the fly to the browser, divided into chunks, instead of loading the entire content in PHP and then sending the zip file.
Both libraries are nice and useful pieces of code. A few details:
ZipStream "works" only with memory, but cannot be easily ported to PHP 4 if necessary (uses hash_file())
PHPZip writes temporary files on disk (consumes as much disk space as the biggest file to add in the zip), but can be easily adapted for PHP 4 if necessary.
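As an illustration, a minimal ZipStream sketch (written against an older release of ZipStream-PHP, so treat the exact constructor and option handling as an assumption; file names and paths are placeholders):
require 'vendor/autoload.php';

// Depending on the library version, these headers may be sent for you automatically.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="fsdownload.zip"');

$zip = new ZipStream\ZipStream('fsdownload.zip');

// The file is read and flushed to the client in chunks, never fully buffered in PHP.
$zip->addFileFromPath('bigfile.iso', '/path/to/bigfile.iso');

$zip->finish();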
Related SO questions:
Generating ZIP files with PHP + Apache on-the-fly in high speed?
Create a zip file using PHP class ZipArchive without writing the file to disk?
The problem is a PHP limit. Check for memory and execution time limits.
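For reference, a quick sketch of how to inspect those limits at runtime, and raise them for the current script only if you really must (the values are just examples):
// Show the current limits.
var_dump(ini_get('memory_limit'), ini_get('max_execution_time'));

// Example values, not a recommendation - raising them only hides the underlying problem.
ini_set('memory_limit', '512M');
set_time_limit(0);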
I am trying to serve a zip file in PHP.
Code:
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="file.zip"');
The downloaded file is only a few bytes; it is an error message:
<br />
<b>Fatal error</b>: Allowed memory size of 16777216 bytes exhausted (tried to allocate 41908867 bytes) in <b>/var/www/common_index/main.php</b> on line <b>217</b><br />
I do not wish to increase memory_limit in php.ini. What are alternative ways to serve large zip files properly without tinkering with global settings?
Stream the download, so it doesn't choke on memory.
Tiny example:
$handle = fopen("exampe.zip", "rb");
while (!feof($handle)) {
echo fread($handle, 1024);
flush();
}
fclose($handle);
Add correct output headers for downloading, and you should solve the problem.
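Put together with the headers from the question, a complete sketch (the filename is only an example) might look like this:
header('Content-Type: application/zip');
header('Content-Length: ' . filesize('example.zip'));
header('Content-Disposition: attachment; filename="example.zip"');

$handle = fopen('example.zip', 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush();
}
fclose($handle);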
PHP actually provides an easy way to output a binary file directly to Apache, without stashing it in memory first, via the readfile() function:
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="file.zip"');
readfile($file);