Please assist.
I am successfully uploading objects to S3 using the following code snippet:
// Send a PutObject request and get the result object.
$result = $this->s3EncryptionClient->putObject([
'@MaterialsProvider' => $this->materialsProvider,
'@CipherOptions' => $this->cipherOptions,
'Bucket' => $this->s3BucketName,
'Key' => $key,
'ContentType' => $mimeType,
'ContentLength' => filesize($filePath),
'ContentDisposition' => "attachment; filename='" . $fileName . "'",
'Body' => fopen($filePath, 'r')
]);
And I can successfully download the object using the following snippet:
// Download the contents of the object.
$result = $this->s3EncryptionClient->getObject([
'@MaterialsProvider' => $this->materialsProvider,
'@CipherOptions' => $this->cipherOptions,
'Bucket' => $this->s3BucketName,
'Key' => $key
]);
The problem comes in when I try to copy the object using the following code snippet:
$result = $this->s3Client->copyObject([
'Bucket' => $this->s3BucketName,
'CopySource' => $CopySource,
'Key' => $dstKey,
]);
The object seems to be copied incorrectly, and when I try to download the new object using the download code I pasted earlier in this post, the object is not found and an AwsException is thrown with a 404 Not Found response.
Any idea how I can resolve the issue?
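One thing worth checking: CopySource must be the source bucket and the source key as a single URL-encoded string, and copyObject copies the source object's metadata by default (MetadataDirective defaults to COPY), which the encryption client needs in order to decrypt the copy later. A sketch, assuming $srcKey holds the key of the original object (a hypothetical variable name):
$result = $this->s3Client->copyObject([
    'Bucket' => $this->s3BucketName,
    'CopySource' => $this->s3BucketName . '/' . rawurlencode($srcKey),
    'Key' => $dstKey,
]);
If $CopySource is missing the bucket prefix or is not URL-encoded, the copy fails or lands on the wrong key, so the destination object never exists at $dstKey, which would explain the 404 on download.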
I am sending a file to AWS S3 using the PHP SDK. I installed the SDK using
composer require aws/aws-sdk-php
I am using the following code:
require_once('vendor/autoload.php');
$s3 = new Aws\S3\S3Client([
'region' => AWS_REGION,
'version' => 'latest',
'credentials' => [
'key' => AWS_ACCESS_KEY_ID,
'secret' => AWS_SECRET_ACCESS_KEY,
]
]);
$result = $s3->putObject([
'Bucket' => AWS_BUCKET,
'Key' => $filename,
'SourceFile' => $fileFullPath
]);
I am getting a response back, and I am trying to read the status code from it. I have tried different ways, but I could not get the status code.
You get back a Result object that holds a private array called "data", but you can also read the data with array syntax, so with your example $result['@metadata']['statusCode'] works just fine:
$result['@metadata']['statusCode'] == 200
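For example, a minimal sketch reusing the $s3 client and the putObject call from the question:
$result = $s3->putObject([
    'Bucket' => AWS_BUCKET,
    'Key' => $filename,
    'SourceFile' => $fileFullPath,
]);
if ($result['@metadata']['statusCode'] === 200) {
    // the upload succeeded
}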
Worked fine for me:
$result['@metadata']['statusCode'] == 200
I have a little problem with the AWS S3 service. I'm trying to delete a whole bucket, and I would like to use deleteBucketAsync().
My code:
$result = $this->s3->listObjects(array(
'Bucket' => $bucket_name,
'Prefix' => ''
));
foreach($result['Contents'] as $file){
$this->s3->deleteObjectAsync(array(
'Bucket' => $bucket_name,
'Key' => $file['Key']
));
}
$result = $this->s3->deleteBucketAsync(
[
'Bucket' => $bucket_name,
]
);
Sometimes this code works and deletes the whole bucket in seconds, but sometimes it doesn't.
Can someone please explain to me how exactly the S3 async functions work?
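The Async methods return promises immediately, without waiting for the HTTP requests to finish, so deleteBucketAsync can reach S3 while some of the objects still exist; that is why the code only works sometimes. One way to sequence it is to collect the promises and wait on them before deleting the bucket. A sketch, using the Guzzle promise functions the SDK is built on:
use GuzzleHttp\Promise;

$result = $this->s3->listObjects(array(
    'Bucket' => $bucket_name,
));
$promises = array();
foreach ($result['Contents'] as $file) {
    // keep each promise instead of discarding it
    $promises[] = $this->s3->deleteObjectAsync(array(
        'Bucket' => $bucket_name,
        'Key' => $file['Key'],
    ));
}
// block until every delete has completed
Promise\all($promises)->wait();
// only now is the bucket guaranteed to be empty
$this->s3->deleteBucketAsync(array(
    'Bucket' => $bucket_name,
))->wait();
Keep in mind that listObjects returns at most 1000 keys per call, so a bucket with more objects than that also needs pagination before the delete loop.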
I am creating a gzip string and uploading it as an object to S3. However, when I download the same file from S3 and decompress it locally with gunzip, I get this error:
gunzip: 111.gz: not in gzip format
When I look at the mime_content_type of the file downloaded from S3, it is reported as application/zlib.
Here is the code I am running to generate the gzip file and push it to S3:
$content = '';
for ($i = 0; $i <= 100; $i++) {
$content .= $i . "\n";
}
$result = $this->s3->putObject(array(
'Bucket' => 'my-bucket-name',
'Key' => '111.gz',
'Body' => gzcompress($content),
'ACL' => 'authenticated-read',
'Metadata' => [
'ContentType' => 'text/plain',
'ContentEncoding' => 'gzip'
]
));
The strange thing is that if I inspect the gzip content locally before I send it to S3, I am able to decompress it and see the original string. So I must be uploading the file incorrectly. Any thoughts?
According to http://docs.aws.amazon.com/aws-sdk-php/v3/api/api-s3-2006-03-01.html#putobject, the ContentType and ContentEncoding parameters belong at the top level, not under Metadata. Note also that the call below uses gzencode() instead of gzcompress(): gzencode() produces the gzip format that gunzip expects, while gzcompress() produces a raw zlib stream (which is why you saw application/zlib). So your call should look like:
$result = $this->s3->putObject(array(
'Bucket' => 'my-bucket-name',
'Key' => '111.gz',
'Body' => gzencode($content),
'ACL' => 'authenticated-read',
'ContentType' => 'text/plain',
'ContentEncoding' => 'gzip'
));
Also, it's possible that by setting ContentType to text/plain your file might be truncated whenever a null byte occurs. I would try 'application/gzip' if you still have problems unzipping the file.
I had a very similar issue, and the only way to make it work for our file was with code like this (slightly changed to match your example):
$this->s3->putObject(array(
'Bucket' => 'my-bucket-name',
'Key' => '111.gz',
'Body' => gzcompress($content, 9, ZLIB_ENCODING_GZIP),
'ACL' => 'public-read',
'ContentType' => 'text/javascript',
'ContentEncoding' => 'gzip'
));
The relevant part is gzcompress($content, 9, ZLIB_ENCODING_GZIP): without the final ZLIB_ENCODING_GZIP argument, AWS S3 wouldn't recognize the file or serve it in the right format.
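If you want to confirm locally that the bytes really are in gzip format before uploading, the gzip magic number is a quick test (a minimal sanity check, assuming the same $content as above):
$gz = gzcompress($content, 9, ZLIB_ENCODING_GZIP);
// every gzip stream starts with the two magic bytes 0x1f 0x8b
var_dump(bin2hex(substr($gz, 0, 2)) === '1f8b');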
I am trying to upload a large file to Amazon S3 using PHP. I have found nice solutions on various forums, but those solutions are for SDK version 1:
http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFilePHP.html
Of course, I have found examples in the Amazon API documentation, but those examples expect a file on the local disk and cannot handle an input stream.
I couldn't find similar examples for the SDK for PHP v2 like the one shown in the first link.
Has anyone solved a similar problem successfully?
I recently prepared a code sample for this. In this example I am using a file, but you can use a stream as well.
use Aws\S3\S3Client;
use Aws\Common\Enum\Size;
// Instantiate the client.
$s3 = S3Client::factory(array(
'key' => '*** your-aws-access-key-id ***',
'secret' => '*** your-aws-secret-key ***'
));
$file = fopen($filename, 'r');
// 1. Create a new multipart upload and get the upload ID.
$result = $s3->createMultipartUpload(array(
'Bucket' => $bucket,
'Key' => $key,
));
$uploadId = $result['UploadId'];
// 2. Upload the data in parts.
$parts = array();
$partNumber = 1;
while (!feof($file)) {
$result = $s3->uploadPart(array(
'Bucket' => $bucket,
'Key' => $key,
'UploadId' => $uploadId,
'PartNumber' => $partNumber,
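// note: S3 requires every part except the last to be at least 5 MB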
'Body' => fread($file, 5 * Size::MB),
));
$parts[] = array(
'PartNumber' => $partNumber++,
'ETag' => $result['ETag'],
);
}
// 3. Complete multipart upload.
$result = $s3->completeMultipartUpload(array(
'Bucket' => $bucket,
'Key' => $key,
'UploadId' => $uploadId,
'Parts' => $parts,
));
$url = $result['Location'];
fclose($file);
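If you would rather not track the parts yourself, SDK v2 also ships a higher-level helper that does the bookkeeping for you. This is from memory, so treat it as a sketch rather than a drop-in:
use Aws\S3\Model\MultipartUpload\UploadBuilder;

// builds a transfer object that manages the parts internally
$uploader = UploadBuilder::newInstance()
    ->setClient($s3)
    ->setSource($filename) // a path; an open stream resource should also work
    ->setBucket($bucket)
    ->setKey($key)
    ->build();

// performs the whole multipart upload
$uploader->upload();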
So the AWS PHP SDK 2.x library has been put out recently, and I've taken a turkey-day plunge into upgrading from 1.5.x. My first task was to upgrade my S3 backup class. I quickly ran into an error:
Fatal error: Class 'EntityBody' not found in /usr/share/php/....my file here
when trying to upload a zipped file to an S3 bucket. I wrote a class to abstract the writing a bit to allow for multi-region backup, so the references to $this in the code below refer to that class.
$response1 = $s3->create_object(
$this->bucket_standard,
$this->filename,
array(
'fileUpload' => $this->filename,
'encryption' => 'AES256',
//'acl' => AmazonS3::ACL_PRIVATE,
'contentType' => 'text/plain',
'storage' => AmazonS3::STORAGE_REDUCED,
'headers' => array( // raw headers
'Cache-Control' => 'max-age',
//'Content-Encoding' => 'gzip',
'Content-Language' => 'en-US'
//'Expires' => 'Thu, 01 Nov 2012 16:00:00 GMT'
),
'meta' => array(
'param1' => $this->backupDateTime->format('Y-m-d H:i:s'), // put some info on the file in meta tags
'param2' => $this->hostOrigin
)
)
);
The above worked fine on 1.5.x.
Now, in 2.x, I'm looking at their docs, and they've changed just about everything (great... maximum sarcasm):
$s3opts=array('key'=> $this->accessKey, 'secret' => $this->secretKey,'region' => 'us-east-1');
$s3 = Aws\S3\S3Client::factory($s3opts);
so now I've got a new S3 object. And here is my 2.x syntax to do the same exact thing. My problem arises where they've (sinisterly) changed the old "fileUpload" to "Body" and made it more abstract in how to actually attach a file! I've tried both, and I'm thinking it has to do with the dependencies (Guzzle or Symfony etc.), but I get the error above (or substitute Stream if you like) whenever I try to execute this.
I've tried using Composer with composer.json, and also the aws.phar, but before I get into that: is there something dumb I'm missing?
$response1 = $s3->putObject(array(
'Bucket' => $this->bucket_standard,
'Key' => $this->filename,
'ServerSideEncryption' => 'AES256',
'StorageClass' => 'REDUCED_REDUNDANCY',
'Body' => EntityBody::factory(fopen($this->filename, 'r')),
//'Body' => new Stream(fopen($fullPath, 'r')),
'Metadata' => array(
'BackupTime' => $this->backupDateTime->format('Y-m-d H:i:s'), // put some info on the file in meta tags
'HostOrigin' => $this->hostOrigin
)
));
Thanks as always,
R
Did you import the EntityBody class into your namespace?
use Guzzle\Http\EntityBody;
Otherwise, you'd have to do
'Body' => \Guzzle\Http\EntityBody::factory(fopen($this->filename, 'r')),
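One side note, in case it helps: as far as I know, the v2 SDK will also wrap a plain stream resource in an EntityBody for you, so passing the resource directly should work as well, without importing anything:
'Body' => fopen($this->filename, 'r'),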