How can I find out what the issue is when I create a zip file from a 2 GB file?
Error
file_get_contents(): content truncated from 2147483648 to 2147483647 bytes
Fatal error: Out of memory (allocated 2151677952) (tried to allocate 18446744071562067968 bytes) in
I am using a dedicated server and have already set memory_limit, max_execution_time, upload_max_filesize, and post_max_size, but it is not working for me. Please check my code and let me know what I am doing wrong:
# create new zip object
$zip = new ZipArchive();

# create a temp file & open it
$tmp_file = tempnam('.', '');
$zip->open($tmp_file, ZipArchive::CREATE);

# loop through each file
foreach ($files as $file) {
    # download file
    $download_file = file_get_contents($file_path.'/'.$file);

    # add it to the zip
    $zip->addFromString(basename($file_path.'/'.$file), $download_file);
}

# close zip
$zip->close();

$zip_name = $last_seg.'.zip';

# send the file to the browser as a download
header("Content-disposition: attachment; filename=$zip_name");
header('Content-type: application/zip');
readfile($tmp_file);
I changed $zip->addFromString() to $zip->addFile(), because you don't need to read a file's contents into memory just to add it: addFile() only records the path, and the data is read from disk when the archive is closed. I tested your code with 3 films and it didn't work (I had the same error), but when I used $zip->addFile() everything went fine and I could download a 3 GB zip file.
I also needed to use set_time_limit(0);
If you want to test this code, just change the values of:
$files     // Array of file names
$file_path // Path where your files ($files) are placed
$last_seg  // The name of your zip file
<?php
set_time_limit(0);

$files = array('Exodus.mp4', 'the-expert.webm', 'what-virgin-means.webm');
$file_path = 'zip';
$last_seg = 'test';

$zip = new ZipArchive();

# create a temp file & open it
$tmp_file = tempnam('.', '');
$zip->open($tmp_file, ZipArchive::CREATE);

# loop through each file
foreach ($files as $file) {
    $zip->addFile($file_path.'/'.$file, $file);
}

# close zip
$zip->close();

$zip_name = $last_seg.'.zip';

# send the file to the browser as a download
header("Content-disposition: attachment; filename=$zip_name");
header('Content-type: application/zip');
readfile($tmp_file);
?>
You can read more at:
http://php.net/manual/en/ziparchive.addfile.php
You'll never be able to allocate more memory than PHP_INT_MAX. So maybe the Linux x64 builds of PHP can handle this, if file_get_contents() isn't internally limited to a signed 32-bit int, but on Windows or on a 32-bit system you have no chance of achieving this without streaming.
Something like this might work (not tested yet):
$fr = fopen("http://...", "r");
$fw = fopen("zip://c:\\test.zip#test", "w");

// fread() returns "" at EOF (not false), so test feof() to end the loop
while (!feof($fr)) {
    $buffer = fread($fr, 8192);
    fwrite($fw, $buffer, strlen($buffer));
}

fclose($fr);
fclose($fw);
OK, my bad: apparently PHP does not provide a write mode ("w") for a zip stream... Your remaining options are then: write the whole file to a temp file (by streaming it like I did, not with file_get_contents()) before handing it to an external program (with a system() or popen() call...), use another compression format (apparently PHP supports write stream operations for zlib and bzip2; a minimal sketch of that route follows), or use an external PHP library.
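For example, here is a sketch of the bzip2 route, assuming PHP was built with the bz2 extension (the compress.bzip2:// wrapper, unlike zip://, does support write mode) and using a hypothetical source URL and output path:
$fr = fopen('http://example.com/bigfile.mp4', 'r');
$fw = fopen('compress.bzip2:///tmp/bigfile.mp4.bz2', 'w');

while (!feof($fr)) {
    // copy in 8 KB chunks so memory use stays constant
    fwrite($fw, fread($fr, 8192));
}

fclose($fr);
fclose($fw);
The result is a .bz2 file rather than a .zip, so the client needs a tool that understands that format.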
Try putting this line at the beginning of your code (-1 removes the memory limit entirely):
ini_set("memory_limit", -1);
Refer to this question:
Fatal error: Out of memory (allocated 1134559232) (tried to allocate 32768 bytes) in X:\wamp\www\xxx
I'm using the PHP Flysystem package to stream content from my Amazon S3 bucket. In particular, I'm using $filesystem->readStream.
My Question
When I stream a file, it ends up in myzip.zip and the size is correct, but when I unzip it, it becomes myzip.zip.cpgz. Here is my prototype:
header('Pragma: no-cache');
header('Content-Description: File Download');
header('Content-disposition: attachment; filename="myZip.zip"');
header('Content-Type: application/octet-stream');
header('Content-Transfer-Encoding: binary');
$s3 = Storage::disk('s3'); // Laravel Syntax
echo $s3->readStream('directory/file.jpg');
What am I doing wrong?
Side Question
When I stream a file like this, does it:
1. get fully downloaded into my server's RAM and then get transferred to the client, or
2. get saved, in chunks, in a buffer and then get transferred to the client?
Basically, is my server being burdened if I have dozens of GBs of data being streamed?
You are currently dumping the raw contents of directory/file.jpg as the zip (and a jpg is not a zip). You need to create a zip file with those contents.
Instead of
echo $s3->readStream('directory/file.jpg');
Try the following in its place using the Zip extension:
// use a temporary file to store the Zip file
$zipFile = tmpfile();
$zipPath = stream_get_meta_data($zipFile)['uri'];
$jpgFile = tmpfile();
$jpgPath = stream_get_meta_data($jpgFile)['uri'];
// Download the file to disk
stream_copy_to_stream($s3->readStream('directory/file.jpg'), $jpgFile);
// Create the zip file with the file and its contents
$zip = new ZipArchive();
// tmpfile() already created an empty file, which is not a valid zip yet,
// so open it with OVERWRITE to start a fresh archive at that path
$zip->open($zipPath, ZipArchive::OVERWRITE);
$zip->addFile($jpgPath, 'file.jpg');
$zip->close();
// export the contents of the zip
readfile($zipPath);
Using tmpfile() and stream_copy_to_stream(), the file is downloaded in chunks to a temporary file on disk, not into RAM.
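One cleanup detail worth knowing: files created by tmpfile() are deleted automatically when their handle is closed (or when the script ends), so closing both handles after the download keeps the disk tidy:
readfile($zipPath);
// closing the tmpfile() handles removes both temporary files from disk
fclose($zipFile);
fclose($jpgFile);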
I have a large CSV file (100 MB+) that I generated and uploaded to AWS S3. The file is compressed. Now I need to serve the file to users for download, and it runs into an out-of-memory problem.
Please see below my code:
header('Content-Type: text/csv; charset=utf-8');
header("Content-Disposition: attachment; filename=example.csv");
$result = $this->loadFromS3(); //This returns the file from AWS S3
echo bzdecompress($result['Body']);
How can I adjust my code to avoid exhausting memory?
Simply put: don't store the file in $result, stream it. Use the Amazon S3 stream wrapper:
$client->registerStreamWrapper();

// Open a stream in read-only mode
if ($stream = fopen('s3://bucket/test.bz2', 'r')) {
    // Decompress on the fly as the stream is read
    stream_filter_append($stream, 'bzip2.decompress', STREAM_FILTER_READ);

    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream
        echo fread($stream, 1024);
    }

    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}
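For completeness, a minimal sketch of the wrapper registration, assuming version 3 of the AWS SDK for PHP and a placeholder region (credentials are picked up from the environment or an instance profile):
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // placeholder, use your bucket's region
]);

// after this call, s3://bucket/key paths work with fopen(), fread(), etc.
$client->registerStreamWrapper();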
I have an Amazon S3 bucket with files and want to generate a forced-download zip file on Amazon EC2 (with Amazon Linux, similar to CentOS/Red Hat Linux).
I have all the signed URLs in a (POST) array, and right now I'm using this code.
<?php
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
ini_set('memory_limit', '8192M');

$imageNumber = 0;

# create new zip object
$zip = new ZipArchive();

# create a temp file & open it
$tmp_file = tempnam('.', '');
$zip->open($tmp_file, ZipArchive::CREATE);

foreach ($_POST['text'] as $key => $value) {
    // Get extension
    $ext = pathinfo(parse_url($value, PHP_URL_PATH), PATHINFO_EXTENSION);

    # download file
    $download_file = file_get_contents($value);

    # add it to the zip
    $zip->addFromString(basename($imageNumber.'-image.'.$ext), $download_file);
    $imageNumber++;
}

# close zip
$zip->close();

# send the file to the browser as a download
header('Content-disposition: attachment; filename=download.zip');
header('Content-type: application/zip');
readfile($tmp_file);
?>
My goal is to keep the costs on EC2 as low as possible, and right now I just have 4 GB on my instance. As you can see, I have increased the memory limit, and the problem is that I have 50-100 files of about 5 MB each, which means a few hundred MB per zip file, which of course is a problem if several people generate a zip at the same time.
Is there a better solution (less memory use and as low a cost as possible on EC2)?
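One possible approach, untested and assuming allow_url_fopen is enabled so the signed URLs are readable with fopen(): stream each URL to a temporary file in chunks, then let addFile() pull it into the archive from disk, so no image is ever held in memory whole.
<?php
set_time_limit(0);

$zip = new ZipArchive();
$tmp_file = tempnam(sys_get_temp_dir(), 'zip');
# OVERWRITE starts a fresh archive at the (empty) file tempnam() created
$zip->open($tmp_file, ZipArchive::OVERWRITE);

$imageNumber = 0;
$tmp_parts = array();
foreach ($_POST['text'] as $value) {
    $ext = pathinfo(parse_url($value, PHP_URL_PATH), PATHINFO_EXTENSION);

    # stream the signed URL to a temp file instead of file_get_contents()
    $part = tempnam(sys_get_temp_dir(), 'img');
    $in = fopen($value, 'r');
    $out = fopen($part, 'w');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    # addFile() only records the path; data is read from disk at close()
    $zip->addFile($part, $imageNumber.'-image.'.$ext);
    $tmp_parts[] = $part;
    $imageNumber++;
}
$zip->close();

header('Content-disposition: attachment; filename=download.zip');
header('Content-type: application/zip');
readfile($tmp_file);

# clean up the temp files once the archive has been sent
foreach ($tmp_parts as $part) {
    unlink($part);
}
unlink($tmp_file);
?>
Peak memory then stays around a small copy buffer instead of the full size of every image, at the cost of temporary disk space for one request's files.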
What I'm trying to do is iterate through a list of links and zip them all into the same directory. The files include images, PDFs, audio, and video. The issue is that some of the files are large and I get the following error:
FatalErrorException: Error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 46512453 bytes)...
Here's what I have right now:
$zip = new ZipArchive();
$tmp_file = tempnam('.', '');
$zip->open($tmp_file, ZipArchive::CREATE);

foreach ($mediaFiles as $file) {
    $downloadFile = file_get_contents($file);
    $zip->addFromString(basename($file), $downloadFile);
}

$zip->close();

header('Content-disposition: attachment; filename=download.zip');
header('Content-type: application/zip');
readfile($tmp_file);
Check this library; it allows creating zip files and returns them as a stream:
http://www.phpclasses.org/package/6110-PHP-Create-archives-of-compressed-files-in-ZIP-format.html
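As a point of comparison, and purely as an assumption on my part (this is a different package from the one linked above), the same streamed-zip idea looks like this with the ZipStream-PHP library, per its v2 API:
<?php
require 'vendor/autoload.php';

use ZipStream\Option\Archive;
use ZipStream\ZipStream;

$options = new Archive();
$options->setSendHttpHeaders(true); // emit the download headers for us

// entries are sent to the client as they are added, so neither a temp
// file nor a whole-archive memory buffer is needed
$zip = new ZipStream('download.zip', $options);

foreach ($mediaFiles as $file) {
    // stream each source straight into its archive entry
    $stream = fopen($file, 'r');
    $zip->addFileFromStream(basename($file), $stream);
    fclose($stream);
}

$zip->finish();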
I'm working on a project that should create a link to download an xpi file.
This is my PHP code to download the file:
$fileName = 'file.xpi';
$size = filesize($fileName);
$fp = fopen($fileName, "rb");
$content = fread($fp, $size);
fclose($fp);
header("Content-length: ".$size);
header("Content-type: application/x-xpinstall");
header("Content-disposition: attachment; filename=".$fileName.";" );
echo $content;
This code works fine, but the downloaded file is the problem: when I run it, Firefox shows this error message:
This add-on could not be installed because it appears to be corrupt.
Both files (the one on the server and the one downloaded) have the same information.
File on server:
Size: 14.7 MB (15,509,809 bytes)
Size on disk: 14.7 MB (15,511,552 bytes)
Downloaded file:
Size: 14.7 MB (15,509,809 bytes)
Size on disk: 14.7 MB (15,511,552 bytes)
What should I do?
Open the downloaded file in some kind of text editor, like Notepad, and check whether there are any plain-text PHP errors at the beginning of the file.
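If the culprit does turn out to be stray output (a notice, whitespace, or a BOM emitted before the headers), discarding any buffered output before sending the file and exiting right after it usually cures the corruption. A sketch of that version, reusing the question's file name and headers:
<?php
$fileName = 'file.xpi';

// discard anything already in the output buffer (stray whitespace,
// notices) so the download starts with clean bytes
if (ob_get_level()) {
    ob_end_clean();
}

header('Content-length: '.filesize($fileName));
header('Content-type: application/x-xpinstall');
header('Content-disposition: attachment; filename='.$fileName);

// readfile() streams the file without loading it all into memory
readfile($fileName);
exit; // ensure nothing (e.g. a template footer) is appended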