I'm using the PHP Flysystem package to stream content from my Amazon S3 bucket. In particular, I'm using $filesystem->readStream.
My Question
When I stream a file, it ends up in myzip.zip and the size is correct, but when I unzip it, it becomes myzip.zip.cpgz. Here is my prototype:
header('Pragma: no-cache');
header('Content-Description: File Download');
header('Content-disposition: attachment; filename="myZip.zip"');
header('Content-Type: application/octet-stream');
header('Content-Transfer-Encoding: binary');
$s3 = Storage::disk('s3'); // Laravel Syntax
echo $s3->readStream('directory/file.jpg');
What am I doing wrong?
Side Question
When I stream a file like this, does it:
get fully downloaded into my server's RAM, then get transferred to the client, or
does it get saved - in chunks - in the buffer, and then get transferred to the client?
Basically, is my server being burdened if I have dozens of GBs of data being streamed?
You are currently dumping the raw contents of directory/file.jpg as the zip, but a JPG is not a ZIP. You need to create an actual zip file containing those contents.
Instead of
echo $s3->readStream('directory/file.jpg');
Try the following in its place using the Zip extension:
// use a temporary file to store the Zip file
$zipFile = tmpfile();
$zipPath = stream_get_meta_data($zipFile)['uri'];
$jpgFile = tmpfile();
$jpgPath = stream_get_meta_data($jpgFile)['uri'];
// Download the file to disk
stream_copy_to_stream($s3->readStream('directory/file.jpg'), $jpgFile);
// Create the zip file with the file and its contents
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::OVERWRITE); // tmpfile() already created an empty (non-zip) file, so overwrite it
$zip->addFile($jpgPath, 'file.jpg');
$zip->close();
// export the contents of the zip
readfile($zipPath);
Using tmpfile and stream_copy_to_stream, the download is written in chunks to a temporary file on disk rather than being held in RAM.
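On the side question: readStream gives you an ordinary PHP stream resource, so you can relay it to the client in fixed-size chunks, holding at most one chunk in RAM at a time. A minimal sketch, where relayStream is a hypothetical helper and the in-memory streams stand in for the S3 read stream and php://output:

```php
<?php
// Relay a readable stream to a writable one in fixed-size chunks,
// so at most $chunkSize bytes are held in memory at any moment.
function relayStream($source, $sink, int $chunkSize = 8192): int
{
    $sent = 0;
    while (!feof($source)) {
        $chunk = fread($source, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        fwrite($sink, $chunk);
        $sent += strlen($chunk);
    }
    return $sent;
}

// Demo with in-memory streams instead of a real S3 stream:
$source = fopen('php://memory', 'r+');
fwrite($source, str_repeat('x', 20000));
rewind($source);
$sink = fopen('php://memory', 'r+');

$sent = relayStream($source, $sink); // moves the 20000 bytes ~8 KB at a time
```

In the real handler you would pass $s3->readStream('directory/file.jpg') as $source and fopen('php://output', 'wb') as $sink, after sending the headers.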
Related
I am trying to create a zip file and download it on a button click in WordPress, but when I try to download the file it always throws this error:
Cannot modify header information - headers already sent by
Here is my code -
function downloadZipFile($file_array){
    # create new zip object
    $zip = new ZipArchive();
    # create a temp file & open it
    $tmp_file = tempnam('.', '');
    $zip->open($tmp_file, ZipArchive::CREATE);
    # loop through each file
    foreach ($file_array as $file) {
        //echo $file;
        # download file
        $download_file = file_get_contents($file);
        # add it to the zip
        $zip->addFromString(basename($file), $download_file);
    }
    # close zip
    $zip->close();
    # send the file to the browser as a download
    echo ABSPATH.'downloadZip.php';
    header('Content-disposition: attachment; filename="my file.zip"');
    header('Content-type: application/zip');
    readfile($tmp_file);
    unlink($tmp_file);
}
In the above function I am passing a file array which comes from the Box API. I can see all files are coming through fine from the Box API endpoint.
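For reference, the "headers already sent" error above is triggered by the echo ABSPATH line: it sends output to the browser before header() runs, and headers cannot be modified once any output has gone out. A sketch of the fix under that assumption (echo removed, temp file moved to the system temp dir; the download filename is illustrative):

```php
<?php
function downloadZipFile(array $file_array): void
{
    $zip = new ZipArchive();
    // tempnam() has already created the file empty (not a valid zip),
    // so open it with OVERWRITE
    $tmp_file = tempnam(sys_get_temp_dir(), 'zip');
    $zip->open($tmp_file, ZipArchive::OVERWRITE);
    foreach ($file_array as $file) {
        // do NOT echo anything here: any output sent before header()
        // causes "Cannot modify header information"
        $zip->addFromString(basename($file), file_get_contents($file));
    }
    $zip->close();
    header('Content-disposition: attachment; filename="myfile.zip"');
    header('Content-type: application/zip');
    readfile($tmp_file);
    unlink($tmp_file);
}
```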
How can I find out the issue when I create a zip of a 2 GB file?
Error
file_get_contents(): content truncated from 2147483648 to 2147483647
bytes
Fatal error: Out of memory (allocated 2151677952) (tried to allocate
18446744071562067968 bytes) in
I am using a dedicated server and have already set memory_limit, max_execution_time, upload_max_filesize, and post_max_size, but it is not working for me. Please check my code and let me know what I am doing wrong:
# create new zip object
$zip = new ZipArchive();
# create a temp file & open it
$tmp_file = tempnam('.', '');
$zip->open($tmp_file, ZipArchive::CREATE);
# loop through each file
foreach ($files as $file) {
    # download file
    $download_file = file_get_contents($file_path.'/'.$file);
    # add it to the zip
    $zip->addFromString(basename($file_path.'/'.$file), $download_file);
}
# close zip
$zip->close();
$zip_name = $last_seg.'.zip';
# send the file to the browser as a download
header("Content-disposition: attachment; filename=$zip_name");
header('Content-type: application/zip');
readfile($tmp_file);
I changed $zip->addFromString() to $zip->addFile() because you don't need to read the file's contents just to add the file. I tested your code with 3 films and it didn't work (I got the same error), but when I used $zip->addFile() everything went fine and I could download a 3 GB zip file.
I also needed to use set_time_limit(0);
If you want to test this code, just change the values of:
$files //Array of files name
$file_path //Path where your files ($files) are placed
$last_seg //The name of your zip file
<?php
set_time_limit(0);
$files = array('Exodus.mp4', 'the-expert.webm', 'what-virgin-means.webm');
$file_path = 'zip';
$last_seg = 'test';
$zip = new ZipArchive();
# create a temp file & open it
$tmp_file = tempnam('.', '');
$zip->open($tmp_file, ZipArchive::CREATE);
# loop through each file
foreach ($files as $file) {
    $zip->addFile($file_path.'/'.$file, $file);
}
# close zip
$zip->close();
$zip_name = $last_seg.'.zip';
# send the file to the browser as a download
header("Content-disposition: attachment; filename=$zip_name");
header('Content-type: application/zip');
readfile($tmp_file);
?>
You can read more at:
http://php.net/manual/en/ziparchive.addfile.php
You'll never be able to allocate more memory than PHP_INT_MAX. So maybe the Linux x64 builds of PHP can handle this, if file_get_contents isn't internally limited to a signed 32-bit int, but on Windows or on a 32-bit system you have no chance of achieving this without streaming.
Something like this might work: (not tested yet)
$fr = fopen("http://...", "r");
$fw = fopen("zip://c:\\test.zip#test", "w");
// Note: fread() returns '' (not false) at EOF, so a `false !==` test loops forever
while (!feof($fr))
{
    $buffer = fread($fr, 8192);
    if ($buffer === false || $buffer === '') {
        break;
    }
    fwrite($fw, $buffer, strlen($buffer));
}
fclose($fr);
fclose($fw);
OK, my bad: apparently PHP does not provide write mode for a zip:// stream... Your remaining options are then: write the whole file to a temp file (by streaming it like I did, no file_get_contents) before handing it to an external program (via a system() or popen() call), use another compression format (PHP supports write streams for zlib and bzip2), or use an external library for PHP.
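The zlib route mentioned above can be sketched with PHP's compress.zlib:// wrapper, which does support write mode, so a large file can be gzip-compressed chunk by chunk without ever being fully in RAM (the file paths here are illustrative temp files):

```php
<?php
// Stand-in for the big source file
$srcPath = tempnam(sys_get_temp_dir(), 'src');
file_put_contents($srcPath, str_repeat("some data\n", 1000));

// Open a *write-mode* gzip stream: bytes are compressed as they are written
$gzPath = tempnam(sys_get_temp_dir(), 'gz') . '.gz';
$fr = fopen($srcPath, 'rb');
$fw = fopen('compress.zlib://' . $gzPath, 'wb');
while (!feof($fr)) {
    $buffer = fread($fr, 8192);
    if ($buffer === false || $buffer === '') {
        break;
    }
    fwrite($fw, $buffer);
}
fclose($fr);
fclose($fw);

// Reading back through the same wrapper decompresses transparently
$roundTrip = file_get_contents('compress.zlib://' . $gzPath);
```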
Try putting this line at the beginning of your code:
ini_set("memory_limit", -1);
Refer to this question
Fatal error: Out of memory (allocated 1134559232) (tried to allocate 32768 bytes) in X:\wamp\www\xxx
I am trying to download an Android APK file using PHP in my PC browser using Chrome.
My app is located at a particular path on the server. If I manually FTP the file from the server and transfer it to my Android mobile, it installs perfectly. But when I download it using PHP and transfer the downloaded file to the mobile, installing it throws 'there was a problem while parsing the package'.
Here is the PHP code I use to download:
header('Content-Type: application/vnd.android.package-archive');
header('Content-Disposition: attachment; filename="' . $file_name . '"');
readfile($file_path);
return true;
fyi...
$file_name is the APK file name, 'myfile.apk'
$file_path is the full absolute path of the file on the server ('d:\abcd\xyz\xampp\htdocs\apkstore\myfile.apk')
I made one observation while trying to open the APK file using 7-Zip:
it throws the error 'Cannot open the file xxx as archive'.
Then I added the PHP code below:
header("Content-length: " . filesize($file_path));
Now 7-Zip opens the file, but the size of the downloaded file is greater than the original file, and when I open this file on the mobile I get the same error, 'there was a problem while parsing the package'.
To cut my long story short: I was trying to download an APK file from the server to localhost using PHP and was unable to make it work.
I managed to make it work by adding ob_end_flush():
header('Content-Type: application/vnd.android.package-archive');
header("Content-length: " . filesize($file_path));
header('Content-Disposition: attachment; filename="' . $file_name . '"');
ob_end_flush();
readfile($file_path);
return true;
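A common variant of the same fix, in case ob_end_flush() still sends stray bytes (whitespace before the opening PHP tag, debug echoes) that corrupt the binary: discard every open output buffer instead of flushing it. A sketch, with the echoes simulating the accidental output:

```php
<?php
$base = ob_get_level();          // buffers already open around this code
ob_start();
echo "stray debug output";       // simulates output sent before header()
ob_start();
echo "more buffered noise";

// Discard (not flush!) everything buffered so far, so the file bytes
// are the only thing the client ever receives
while (ob_get_level() > $base) {
    ob_end_clean();
}

// From here it is safe to emit the headers and the file:
// header('Content-Type: application/vnd.android.package-archive');
// header('Content-Length: ' . filesize($file_path));
// readfile($file_path);
```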
I'm in the middle of developing a Safari extension for imageboard-type websites and one of the bigger features I'm hoping to implement is the ability to download all of the images (the posted ones, not the global page-level images) that had been posted.
There are similar questions here already, but mine differs a bit in that the images in question are hosted on an entirely different server. I've been brainstorming a bit and figured that gathering all of the image URLs in a JS array then sending it to my server to be turned into a zip file (forcing the download, not just a link to the file) would be the best way to go. I also want the zip to be deleted after the user downloads it.
I've already finished the majority of the extension features but this one is stumping me. Any help would be greatly appreciated.
How would I go about doing this?
You want an extension to contact your server for downloads? That's a terrible idea! Make the zip file locally - it's not regular JavaScript, it's an extension - you have full access.
Anyway, assuming you want to do this regardless: you get a list of URLs, send them to your server, your server downloads them, zips them, and sends the result to the user. (The "your server downloads them" part should worry you!)
What problem are you having?
You can use PHP's ZipArchive class to make a ZIP, then stream it to the browser.
<?php
// Create temp zip file
$zip = new ZipArchive;
$temp = tempnam(sys_get_temp_dir(), 'zip');
$zip->open($temp, ZipArchive::OVERWRITE); // tempnam() created an empty file, so overwrite it
// Add files
$zip->addFromString('file.jpg', file_get_contents('http://path/to/file.jpg'));
$zip->addFile('/this/is/my/file.txt');
// Write temp file
$zip->close();
// Stream file to browser
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=myFile.zip');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($temp));
readfile($temp);
unlink($temp);
exit;
It seems I successfully create an on-the-fly zip archive (filesize and file_exists both return the expected values), but when I attempt to actually download it, I receive an empty ZIP file.
Curiously enough, this error occurs with both readfile and fread. This is my code:
$filename = $zip;
$handle = fopen($filename, 'r');
if ($handle === false)
{
die('Could not read file "' . $filename . '"');
}
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="fsdownload.zip"');
header('Cache-Control: private');
while (!feof($handle))
{
    echo fread($handle, 8192);
}
fclose($handle);
This works fine for zip files < 10 MB. Any thoughts on what the problem might be?
To avoid consuming too much memory, you can use ZipStream or PHPZip, which send the zipped file to the browser on the fly, in chunks, instead of building the entire archive in PHP and then sending it.
Both libraries are nice and useful pieces of code. A few details:
ZipStream "works" only with memory, but cannot be easily ported to PHP 4 if necessary (uses hash_file())
PHPZip writes temporary files on disk (consumes as much disk space as the biggest file to add in the zip), but can be easily adapted for PHP 4 if necessary.
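To make the idea behind these libraries concrete, here is a toy (not production-grade) sketch of how a zip can be emitted as a stream: for "stored" (uncompressed) entries the size and CRC are known up front, so each header and each entry's bytes can be written to the output the moment they are ready, with no temp file and only one entry in memory at a time:

```php
<?php
// Toy streaming writer for a zip of "stored" (method 0) entries.
// $entries maps entry names to their contents; $out is any writable stream.
function emitStoredZip(array $entries, $out): void
{
    $central = '';
    $offset  = 0;
    foreach ($entries as $name => $data) {
        $crc = crc32($data);
        $len = strlen($data);
        // local file header + entry data go straight to the output stream
        $local = pack('VvvvvvVVVvv',
            0x04034b50, 20, 0, 0, 0, 0, $crc, $len, $len, strlen($name), 0
        ) . $name;
        fwrite($out, $local);
        fwrite($out, $data);
        // accumulate the central-directory record for the trailer
        $central .= pack('VvvvvvvVVVvvvvvVV',
            0x02014b50, 20, 20, 0, 0, 0, 0, $crc, $len, $len,
            strlen($name), 0, 0, 0, 0, 0, $offset
        ) . $name;
        $offset += strlen($local) + $len;
    }
    // central directory + end-of-central-directory trailer
    fwrite($out, $central);
    fwrite($out, pack('VvvvvVVv',
        0x06054b50, 0, 0, count($entries), count($entries),
        strlen($central), $offset, 0
    ));
}
```

In a real download, $out would be fopen('php://output', 'wb') after the Content-Type headers; the real libraries additionally handle compression, large files, and ZIP64.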
Related SO questions:
Generating ZIP files with PHP + Apache on-the-fly in high speed?
Create a zip file using PHP class ZipArchive without writing the file to disk?
The problem is a PHP limit. Check for memory and execution time limits.