How do I upload a gzip object to s3? - php

I am creating a gzipped string and uploading it as an object to S3. However, when I download the same file from S3 and decompress it locally with gunzip, I get this error:
gunzip: 111.gz: not in gzip format
When I check the mime_content_type of the file downloaded from S3, it is reported as application/zlib.
Here is the code I am running to generate the gzip file and push it to s3:
$content = '';
for ($i = 0; $i <= 100; $i++) {
    $content .= $i . "\n";
}
$result = $this->s3->putObject(array(
    'Bucket' => 'my-bucket-name',
    'Key' => '111.gz',
    'Body' => gzcompress($content),
    'ACL' => 'authenticated-read',
    'Metadata' => [
        'ContentType' => 'text/plain',
        'ContentEncoding' => 'gzip'
    ]
));
The strange thing is that if I inspect the gzipped content locally before sending it to S3, I am able to decompress it and see the original string. So I must be uploading the file incorrectly. Any thoughts?

According to http://docs.aws.amazon.com/aws-sdk-php/v3/api/api-s3-2006-03-01.html#putobject the ContentType and ContentEncoding parameters belong at the top level of the argument array, not under Metadata (anything under Metadata is stored as an x-amz-meta-* header). Note the Body below also switches from gzcompress() to gzencode(): gzcompress() produces a raw zlib stream, which is why S3 reports application/zlib, while gzencode() produces the gzip format that gunzip expects. So your call should look like:
$result = $this->s3->putObject(array(
    'Bucket' => 'my-bucket-name',
    'Key' => '111.gz',
    'Body' => gzencode($content),
    'ACL' => 'authenticated-read',
    'ContentType' => 'text/plain',
    'ContentEncoding' => 'gzip'
));
Also, it's possible that by setting ContentType to text/plain your file might be truncated whenever a null byte occurs. I would try 'application/gzip' if you still have problems unzipping the file.
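As a quick local sanity check (a minimal sketch, not from the original thread), you can inspect the leading bytes of each encoder's output: gzip data starts with the magic bytes 1f 8b, while a zlib stream typically starts with 0x78.
$content = "hello world\n";
// gzencode() emits gzip (RFC 1952): first two bytes are 1f 8b
var_dump(bin2hex(substr(gzencode($content), 0, 2)));   // string(4) "1f8b"
// gzcompress() emits zlib (RFC 1950): first byte is 0x78
var_dump(bin2hex(substr(gzcompress($content), 0, 2))); // e.g. string(4) "789c"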

I had a very similar issue, and the only way to make it work for our file was with code like this (slightly adjusted to match your example):
$this->s3->putObject(array(
    'Bucket' => 'my-bucket-name',
    'Key' => '111.gz',
    'Body' => gzcompress($content, 9, ZLIB_ENCODING_GZIP),
    'ACL' => 'public-read',
    'ContentType' => 'text/javascript',
    'ContentEncoding' => 'gzip'
));
The relevant part is gzcompress($content, 9, ZLIB_ENCODING_GZIP): without the final ZLIB_ENCODING_GZIP argument, AWS S3 would neither recognize the file nor serve it in the right format.
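For what it's worth (a hedged sketch, not from the thread), you can verify locally that this call produces data gunzip will accept, since PHP's gzdecode() only understands the gzip container:
$content = "hello world\n";
$gz = gzcompress($content, 9, ZLIB_ENCODING_GZIP);
// gzdecode() only accepts gzip-formatted input, so a successful
// round trip confirms the output is real gzip, not raw zlib
var_dump(gzdecode($gz) === $content); // bool(true)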

Related

AWS s3Client->copyObject succeeds but the copied object is garbled

Please assist.
I am successfully uploading objects to S3 using the following code snippet:
// Send a PutObject request and get the result object.
$result = $this->s3EncryptionClient->putObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions' => $this->cipherOptions,
    'Bucket' => $this->s3BucketName,
    'Key' => $key,
    'ContentType' => $mimeType,
    'ContentLength' => filesize($filePath),
    'ContentDisposition' => "attachment; filename='" . $fileName . "'",
    'Body' => fopen($filePath, 'r')
]);
And I can successfully download the object using the following snippet:
// Download the contents of the object.
$result = $this->s3EncryptionClient->getObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions' => $this->cipherOptions,
    'Bucket' => $this->s3BucketName,
    'Key' => $key
]);
The problem comes in when I try to copy the object using the following code snippet:
$result = $this->s3Client->copyObject([
    'Bucket' => $this->s3BucketName,
    'CopySource' => $CopySource,
    'Key' => $dstKey,
]);
The object seems to be copied incorrectly, and when I try to download the new object using the download code I pasted earlier in this post, the object is not found and an AwsException with a 404 Not Found response is thrown.
Any idea how I can resolve the issue?
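The excerpt ends without an answer. Purely as a hedged workaround sketch (an assumption, not the thread's accepted fix): since the encryption client stores its envelope in object metadata and transparently decrypts on read, copying through the same encryption client (get, then put) sidesteps whatever the plain s3Client mishandles:
// Hypothetical workaround: copy via get + put on the encryption client
// so the destination object gets a valid encryption envelope of its own.
$object = $this->s3EncryptionClient->getObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions' => $this->cipherOptions,
    'Bucket' => $this->s3BucketName,
    'Key' => $key, // source key
]);
$this->s3EncryptionClient->putObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions' => $this->cipherOptions,
    'Bucket' => $this->s3BucketName,
    'Key' => $dstKey, // destination key
    'Body' => $object['Body'],
]);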

SourceFile Amazon AWS S3

I'm trying to upload a file to my bucket. I am able to upload with Body but not SourceFile. Here's my method:
$pathToFile = '/explicit/path/to/file.jpg';
// Upload an object by streaming the contents of a file
$result = $s3Client->putObject(array(
    'Bucket' => $bucket,
    'Key' => 'test.jpg',
    'SourceFile' => $pathToFile,
    'ACL' => 'public-read',
    'ContentType' => 'image/jpeg'
));
but I get this error:
You must specify a non-null value for the Body or SourceFile parameters.
I've tried different types of files and keep getting this error.
The issue had to do with not giving a valid path: the SDK runs file_exists() on SourceFile, and that check happens on the machine executing the script, so the path must exist locally (in my case, within localhost's index). With a correct path, SourceFile works as-is. Alternatively, pass an open stream instead:
Change
'SourceFile' => $pathToFile,
to
'Body' => fopen($pathToFile, 'r'),
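A minimal sketch (assumed paths, not from the thread) that fails fast when the path is wrong, instead of surfacing the SDK's less obvious error:
$pathToFile = '/explicit/path/to/file.jpg'; // assumed path
if (!file_exists($pathToFile)) {
    // Same check the SDK performs; failing here makes the cause obvious
    throw new RuntimeException("No such file: $pathToFile");
}
$result = $s3Client->putObject(array(
    'Bucket' => $bucket,
    'Key' => 'test.jpg',
    'SourceFile' => $pathToFile,
    'ACL' => 'public-read',
    'ContentType' => 'image/jpeg'
));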

Amazon S3 PHP SDK: uploaded images have wrong mime types

I tried the following two functions and neither of them works. They upload the file to S3, but if you visit the uploaded file in a browser it is served as application/octet-stream, and that's not right...
$s3->upload('mybucket',                        // bucket
    $filename,                                 // key
    $imagebinarydata,                          // body
    'public-read',                             // acl
    array('contentType' => 'image/jpeg'));     // options
And
$s3->putObject(array(
    'Bucket' => 'mybucket',
    'Key' => $filename,
    'ACL' => 'public-read',
    'contentType' => 'image/jpeg',
    'Body' => $imagebinarydata));
I'm using the latest AWS.PHAR
You need to use uppercase ContentType with putObject, as described in the docs:
'ContentType' => 'image/jpeg'
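For the upload() helper, the extra parameters go under a 'params' key in the options array (a hedged sketch based on the SDK's upload() signature; adjust to your version):
$s3->upload(
    'mybucket',
    $filename,
    $imagebinarydata,
    'public-read',
    // 'params' forwards extra parameters to the underlying PutObject call
    array('params' => array('ContentType' => 'image/jpeg'))
);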

Set Cache-Control HTTP Header for S3 Objects from PHP AWS SDK

I'm using the Amazon SDK for PHP and trying to set a Cache-Control header on an image. When I try to add it via 'Metadata' => array('Cache-Control' => ...), it shows up as x-amz-meta-cache-control when I log in to the S3 bucket, and when I download the file no Cache-Control header is set. But if I change the setting manually in the console, Cache-Control works perfectly. Is there some parameter I'm missing that lets me set HTTP response headers programmatically on upload? I'm using the putObject method. I believe the AWS SDK is from 2013.
Cache-Control isn't set via the "Metadata" index; "CacheControl" is a top-level key at the same level as "Metadata", not contained within it.
http://docs.aws.amazon.com/aws-sdk-php-2/latest/class-Aws.S3.S3Client.html#_putObject
You'd use something like this as your configuration array for the putObject() method...
$s3client->putObject(array(
    'Bucket' => '...',
    'Key' => '...',
    'Body' => '...',
    'CacheControl' => 'max-age=172800',
    'Metadata' => array(
        'metaKey1' => 'metaValue',
        'metaKey2' => 'metaValue'
    )));
For the upload() method...
$s3client->upload(
    'bucket',
    'key',
    fopen('sourcefile', 'r'),
    'public-read',
    array('params' => array(
        'CacheControl' => 'max-age=172800',
        'Metadata' => array(
            'metaKey1' => 'metaValue',
            'metaKey2' => 'metaValue'
        ))));
Also, it's worth pointing out that upload() wraps putObject() for files up to 5MB in size; for anything larger it initiates a multipart upload request.
If you want to add the CacheControl header to an object already in your bucket, use the SDK's copyObject method with the MetadataDirective param set to REPLACE, so the object overwrites itself.
I noticed one odd thing: I had to set the ContentType header too, even though it was already set. Otherwise the image would not display inline in the browser but would be offered as a download.
$result = $s3->copyObject(array(
    'ACL' => 'public-read',
    'Bucket' => $bucket, // target bucket
    'CacheControl' => 'public, max-age=86400',
    'ContentType' => 'image/jpeg', // !!
    'CopySource' => urlencode($bucket . '/' . $key),
    'Key' => $key, // target file name
    'MetadataDirective' => 'REPLACE'
));

AWS PHP SDK Version 2 S3 putObject Error

So the AWS PHP SDK 2.x library was released recently, and I've taken a turkey-day plunge into upgrading from 1.5.x. My first task was to upgrade my S3 backup class, and I've quickly run into an error:
Fatal error: Class 'EntityBody' not found in /usr/share/php/....my file here
when trying to upload a zipped file to an S3 bucket. I wrote a class that abstracts the writing a bit to allow for multi-region backup, so the references to $this in the code below refer to that class.
$response1 = $s3->create_object(
    $this->bucket_standard,
    $this->filename,
    array(
        'fileUpload' => $this->filename,
        'encryption' => 'AES256',
        //'acl' => AmazonS3::ACL_PRIVATE,
        'contentType' => 'text/plain',
        'storage' => AmazonS3::STORAGE_REDUCED,
        'headers' => array( // raw headers
            'Cache-Control' => 'max-age',
            //'Content-Encoding' => 'gzip',
            'Content-Language' => 'en-US'
            //'Expires' => 'Thu, 01 Nov 2012 16:00:00 GMT'
        ),
        'meta' => array(
            'param1' => $this->backupDateTime->format('Y-m-d H:i:s'), // put some info on the file in meta tags
            'param2' => $this->hostOrigin
        )
    )
);
The above worked fine on 1.5.x.
Now, in 2.x, I'm looking into their docs and they've changed just about everything (great...maximum sarcasm)
$s3opts = array('key' => $this->accessKey, 'secret' => $this->secretKey, 'region' => 'us-east-1');
$s3 = Aws\S3\S3Client::factory($s3opts);
so now I've got a new S3 object. And here is my 2.x syntax to do the same exact thing. My problem arises where they've (sinisterly) changed the old "fileUpload" to "Body" and made attaching a file more abstract! I've tried both EntityBody and Stream, and I suspect it has to do with the dependencies (Guzzle or Symfony, etc.), but I get the error above (or its Stream equivalent) whenever I try to execute this.
I've tried using Composer with composer.json, and the aws.phar, but before I get into that: is there something dumb I'm missing?
$response1 = $s3->putObject(array(
    'Bucket' => $this->bucket_standard,
    'Key' => $this->filename,
    'ServerSideEncryption' => 'AES256',
    'StorageClass' => 'REDUCED_REDUNDANCY',
    'Body' => EntityBody::factory(fopen($this->filename, 'r')),
    //'Body' => new Stream(fopen($fullPath, 'r')),
    'Metadata' => array(
        'BackupTime' => $this->backupDateTime->format('Y-m-d H:i:s'), // put some info on the file in meta tags
        'HostOrigin' => $this->hostOrigin
    )
));
Thanks as always,
R
Did you import EntityBody into your namespace?
use Guzzle\Http\EntityBody;
Otherwise, you'd have to use the fully qualified name:
'Body' => \Guzzle\Http\EntityBody::factory(fopen($this->filename, 'r')),
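As an aside (a hedged sketch, not part of the original answer): SDK 2.x also accepts a plain stream resource for Body and wraps it in an EntityBody internally, which avoids the import altogether:
// The SDK wraps a raw stream resource itself, so no use statement is needed
'Body' => fopen($this->filename, 'r'),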
