I'm trying to delete a folder created in an Amazon S3 bucket, and it gives the error
An unexpected error has occurred. Please try again.
How can I delete the folder?
First you need to understand that there is no such thing as a folder in an Amazon S3 bucket; what you see is an object whose key ends with / and which behaves like a folder:
one/          // what looks like a folder is really this separate object
one/abc.png
one/two/
one/two/a.zip
To delete the "folder" you need to delete every object whose key starts with one/, and you can do that with the deleteMatchingObjects() function:
$s3 = new Aws\S3\S3Client([
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => [
        'key' => $credentials['key'],
        'secret' => $credentials['secret'],
    ],
]);
/* this is what you need: delete every object whose key starts with the prefix */
$s3->deleteMatchingObjects($bucket, 'one/');
I have used the PHP SDK v3.
I am using the code below from the S3.php class; check it out.
/**
 * Delete an empty bucket
 *
 * @param string $bucket Bucket name
 * @return boolean
 */
public function deleteBucket($bucket = '') {
    $rest = new S3Request('DELETE', $bucket);
    $rest = $rest->getResponse();
    if ($rest->error === false && $rest->code !== 204)
        $rest->error = array('code' => $rest->code, 'message' => 'Unexpected HTTP status');
    if ($rest->error !== false) {
        trigger_error(sprintf("S3::deleteBucket({$bucket}): [%s] %s",
            $rest->error['code'], $rest->error['message']), E_USER_WARNING);
        return false;
    }
    return true;
}
We upload content to a private S3 bucket. After uploading, we access it through a presigned URL. MP4s and images work fine when we access that URL through the browser, but when we try to access SWFs and PDFs, the browser prompts to download the content.
This also doesn't happen when we access the assets from a public bucket.
Is this the default behavior, or is there a solution for it?
I checked this doc.
Code to get the URL:
public function getPresignedUrl($filename, $expires, $bucket = NULL)
{
    if ($bucket == NULL) {
        $bucket = $this->bucket;
    }
    $command = $this->getClient()->getCommand('GetObject', ['Bucket' => $bucket, 'Key' => $filename]);
    $request = $this->getClient()->createPresignedRequest($command, $expires);
    return (string) $request->getUri();
}
=============================Update 1=================================
We are using the 'upload' function of the AWS SDK to upload SWFs, PDFs and also MP4s.
public function upload($filename, $source, $acl = null, array $options = [])
{
    if ($this->getClient()->upload(
        $this->bucket,
        $filename,
        $source,
        !empty($acl) ? $acl : $this->defaultAcl,
        $options
    )) {
        return true;
    } else {
        return false;
    }
}
Thanks
When uploading a file, the S3 client will try to determine the correct content type if one hasn't been set.
If no content type is provided and it cannot be determined from the filename, the default content type, "application/octet-stream", is used, and that is why the browser prompts you to download the file.
Have a look at http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
$s3->create_object($bucket, $file_name, array(
    // other things
    'contentType' => 'application/pdf',
));
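If you are on SDK v3 and the upload() wrapper shown in the question, a minimal sketch of the same idea would be to pass ContentType through the params option of S3Client::upload(); the bucket name, key and file path below are only placeholders:
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-west-2',
]);

// Setting an explicit ContentType means the presigned URL is served inline
// instead of falling back to application/octet-stream.
$s3->upload(
    'my-private-bucket',               // hypothetical bucket
    'docs/manual.pdf',                 // hypothetical key
    fopen('/path/to/manual.pdf', 'r'),
    'private',
    ['params' => ['ContentType' => 'application/pdf']]
);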
As Vamsi said, uploading the content with the MIME type set fixed my issue.
public function uploadPut($filename, $source, $acl = null, $mime = null, $storage = 'REDUCED_REDUNDANCY', array $options = [])
{
    $result = $this->getClient()->putObject(array(
        'Bucket' => $this->bucket,
        'Key' => $filename,
        'SourceFile' => $source,
        'ContentType' => $mime,
        'ACL' => $acl,
        'StorageClass' => $storage,
    ));
    // return the response so callers can inspect it
    return $result;
}
Call the function:
uploadPut($file->name, $file->name, null, $file->mimeType);
I am trying to read the data from a txt file in my Amazon S3 bucket, but the body key in the response array shows up as NULL. My code:
function s3_file_get_contents($path, $private = TRUE, $bucket = '') {
    require_once(CODE_BASE_DIR . '/ds_engine/docSuggest/external/aws-sdk-3/aws-autoloader.php');
    try {
        $s3Client = new Aws\S3\S3Client(array('region' => S3_ENDPOINT_REGION, 'version' => S3_ENDPOINT_VERSION,
            'credentials' => array(
                'key' => S3_SUGGESTADOC_API_KEY,
                'secret' => S3_SUGGESTADOC_API_SECRET,
            ),
        ));
        $result = $s3Client->getObject(array(
            'Bucket' => $private ? S3_BUCKET_DOCSUGGEST : S3_BUCKET_SUGGESTADOC,
            'Key' => $path,
        ));
    } catch (Exception $e) {
        $error = $e->getMessage();
        log_message('ERROR', '['.__FUNCTION__.'] Exception: '.$error);
    }
    die(print_array($result['body']));
    return $error ? $error : $result['body'];
}
The file contains some text, but nothing is displayed in the console. Rest assured, I have set up the connection properly and there are no issues there. I am able to download the file, just not read from it.
P.S. The response metadata has an object URL, and the file can be downloaded using it, so I guess I am hitting the correct path, but still no success.
The data is in $result['Body'], not in $result['body'].
Look at the documentation:
http://docs.aws.amazon.com/aws-sdk-php/v2/guide/service-s3.html#downloading-objects
Use var_dump($result) to better understand the structure of the response.
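For example, a minimal sketch of reading the text (in SDK v3 the 'Body' entry is a stream object, so cast it to a string or call ->getContents()):
$result = $s3Client->getObject(array(
    'Bucket' => $bucket,
    'Key'    => $path,
));

// Note the capital B: the stream lives under 'Body'.
$contents = (string) $result['Body'];
echo $contents;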
I wrote the following code to launch an AWS instance. It works fine and I am able to launch the instance, but I need to wait until the instance's status check is 2/2, i.e. the instance is ready to be invoked.
public function launch_instance() {
    $isInstanceLaunched = array('status' => false);
    try {
        if (!empty($this->ec2Client)) {
            /**
             * 1) EbsOptimized : Indicates whether the instance is optimized for EBS I/O.
             * This optimization provides dedicated throughput to Amazon EBS and an optimized
             * configuration stack to provide optimal EBS I/O performance. This optimization isn't
             * available with all instance types. Additional usage charges apply when using an
             * EBS-optimized instance.
             */
            $result = $this->ec2Client->runInstances([
                'AdditionalInfo' => 'Launched from API',
                'DisableApiTermination' => false,
                'ImageId' => 'ami-8xaxxxe8', // REQUIRED
                'InstanceInitiatedShutdownBehavior' => 'stop',
                'InstanceType' => 't2.micro',
                'MaxCount' => 1, // REQUIRED
                'MinCount' => 1, // REQUIRED
                'EbsOptimized' => false, // SEE COMMENT 1
                'KeyName' => 'TestCloudKey',
                'Monitoring' => [
                    'Enabled' => true // REQUIRED
                ],
                'SecurityGroups' => ['TestGroup']
            ]);
            if ($result) {
                $metadata = $result->get('@metadata');
                $statusCode = $metadata['statusCode'];
                // Verify that the code is 200
                if ($statusCode == 200) {
                    $instanceData = $result->get('Instances');
                    $instanceID = $instanceData[0]['InstanceId'];
                    // Give a name to the launched instance
                    $tagCreated = $this->ec2Client->createTags([
                        'Resources' => [$instanceID],
                        'Tags' => [
                            [
                                'Key' => 'Name',
                                'Value' => 'Created from API'
                            ]
                        ]
                    ]);
                    $instance = $this->ec2Client->describeInstances([
                        'InstanceIds' => [$instanceID]
                    ]);
                    $instanceInfo = $instance->get('Reservations')[0]['Instances'];
                    // Get the Public IP of the instance
                    $instancePublicIP = $instanceInfo[0]['PublicIpAddress'];
                    // Get the Public DNS of the instance
                    $instancePublicDNS = $instanceInfo[0]['PublicDnsName'];
                    // Get the instance state
                    $instanceState = $instanceInfo[0]['State']['Name'];
                    $instanceS = $this->ec2Client->describeInstanceStatus([
                        'InstanceIds' => [$instanceID]
                    ]);
                    try {
                        $waiterName = 'InstanceRunning';
                        $waiterOptions = ['InstanceIds' => [$instanceID]];
                        $waiter = $this->ec2Client->getWaiter($waiterName, $waiterOptions);
                        // Initiate the waiter and retrieve a promise.
                        $promise = $waiter->promise();
                        // Call methods when the promise is resolved.
                        $promise
                            ->then(function () {
                                // Waiter completed
                            })
                            ->otherwise(function (\Exception $e) {
                                echo "Waiter failed: " . $e . "\n";
                            });
                        // Block until the waiter completes or fails. Note that this might throw
                        // a RuntimeException if the waiter fails.
                        $promise->wait();
                    } catch (RuntimeException $runTimeException) {
                        $isInstanceLaunched['side-information'] = $runTimeException->getMessage();
                    }
                    $isInstanceLaunched['aws-response'] = $result;
                    $isInstanceLaunched['instance-public-ip'] = $instancePublicIP;
                    $isInstanceLaunched['status'] = true;
                }
            }
        } else {
            $isInstanceLaunched['message'] = 'EC2 client not initialized. Call init_ec2 to initialize the client';
        }
    } catch (Ec2Exception $ec2Exception) {
        $isInstanceLaunched['message'] = $ec2Exception->getMessage().'-- FILE --'.$ec2Exception->getFile().'-- LINE --'.$ec2Exception->getLine();
    } catch (Exception $exc) {
        $isInstanceLaunched['message'] = $exc->getMessage().'-- FILE -- '.$exc->getFile().'-- LINE -- '.$exc->getLine();
    }
    return $isInstanceLaunched;
}
I have used the waiter named InstanceRunning but that doesn't help. How do I know that the instance is ready to be invoked or the status check is 2/2?
Have you tried replacing
$waiter = $this->ec2Client->getWaiter($waiterName,$waiterOptions);
with
$this->ec2Client->waitUntil($waiterName, $waiterOptions);
$result = $ec2->describeInstances(array(
    'InstanceIds' => array($instanceId),
));
I have not tested this for some time, but it used to work for me.
You need to use two waiters: first "InstanceRunning" and then "InstanceStatusOk".
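A minimal sketch of that with SDK v3's waitUntil(), assuming the $instanceID and EC2 client from the question:
// Wait until the instance reaches the "running" state...
$this->ec2Client->waitUntil('InstanceRunning', [
    'InstanceIds' => [$instanceID],
]);

// ...then wait until the instance status check reports "ok".
$this->ec2Client->waitUntil('InstanceStatusOk', [
    'InstanceIds' => [$instanceID],
]);

// At this point the instance should be ready to be invoked.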
I wrote the following code to check whether the instance is ready to be invoked. I poll until the status check is ok.
/**
 * The following block checks if the instance is ready to be invoked,
 * i.e. the StatusCheck of the instance equals 2/2.
 * The loop will return:
 * - if the status check received is 'ok'
 * - if it exceeds _amazon_client::MAX_ATTEMPTS_TO_EC2_SCHECK
 */
$statusCheckAttempt = 0;
do {
    // Sleep for _amazon_client::POLLING_SECONDS
    sleep(_amazon_client::POLLING_SECONDS);
    // Query the instance status
    $instanceS = $this->ec2Client->describeInstanceStatus([
        'InstanceIds' => [$instanceID]
    ]);
    $statusCheck = $instanceS->get('InstanceStatuses')[0]['InstanceStatus']['Status'];
    $statusCheckAttempt++;
} while ($statusCheck != 'ok' &&
    $statusCheckAttempt < _amazon_client::MAX_ATTEMPTS_TO_EC2_SCHECK);
Note: when the instance has just been launched, $instanceS->get('InstanceStatuses') returns an empty array, so I wait a few seconds before I can actually get the status.
Please suggest if there are better methods to achieve this.
I know how to upload an object to an AWS S3 bucket like this:
try {
    $oClientAws->putObject(array(
        'Bucket' => 'bucket_test',
        'Key' => 'fileName.jpg',
        'Body' => fopen('path/to/file/fileName.jpg', 'r'),
        'ACL' => 'public-read',
    ));
}
catch (Aws\S3\Exception\S3Exception $e) {}
But I don't know how to download an object. I can use $oClientAws->getObject(params...) and change the content type of the header, but that just shows my file in the browser and doesn't download the file.
Thanks!
A file can be downloaded from an S3 bucket using the getObject method of the S3 client API:
/**
 * Gets a file stored in an Amazon S3 bucket.
 *
 * @param string $awsAccesskey access key used to access the Amazon bucket.
 * @param string $awsSecretKey secret access key used to access the Amazon bucket.
 * @param string $bucketName bucket name from which the file is to be accessed.
 * @param string $file_to_fetch name of the file to be fetched; if the file is within a folder it should also include the folder name.
 * @param string $path_to_save path where the received file should be saved.
 * @return boolean true if the file is successfully received and saved, else false.
 */
function get_from_s3_bucket( $awsAccesskey, $awsSecretKey, $bucketName, $file_to_fetch, $path_to_save ) {
    try {
        $bucket = $bucketName;
        require_once('S3.php');
        $s3 = new S3( $awsAccesskey, $awsSecretKey );
        $object = $s3->getObject( $bucket, $file_to_fetch, $path_to_save );
        if ( $object->code == 200 ) {
            return true;
        } else {
            return false;
        }
    } catch ( Exception $e ) {
        return false;
    }
}
Refer to the link below for more guidance:
http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html
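Since the question already has an AWS SDK client ($oClientAws), note that the SDK's getObject also accepts a SaveAs parameter that writes the object straight to a local file; a minimal sketch, with the bucket, key and path as placeholders:
try {
    $oClientAws->getObject(array(
        'Bucket' => 'bucket_test',
        'Key'    => 'fileName.jpg',
        // Stream the object body directly into a local file instead of memory.
        'SaveAs' => '/path/to/save/fileName.jpg',
    ));
} catch (Aws\S3\Exception\S3Exception $e) {
    // handle the error
}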
Using the standalone S3 class (which I have found isn't much different from the AWS SDK), getObject has a saveTo parameter where you pass a filename to save the file to. Check out the method:
/**
 * Get an object
 *
 * @param string $bucket Bucket name
 * @param string $uri Object URI
 * @param mixed $saveTo Filename or resource to write to
 * @return mixed
 */
public static function getObject($bucket, $uri, $saveTo = false)
{
    $rest = new S3Request('GET', $bucket, $uri, self::$endpoint);
    if ($saveTo !== false)
    {
        if (is_resource($saveTo))
            $rest->fp =& $saveTo;
        elseif (($rest->fp = @fopen($saveTo, 'wb')) !== false)
            $rest->file = realpath($saveTo);
        else
            $rest->response->error = array('code' => 0, 'message' => 'Unable to open save file for writing: '.$saveTo);
    }
    if ($rest->response->error === false) $rest->getResponse();
    if ($rest->response->error === false && $rest->response->code !== 200)
        $rest->response->error = array('code' => $rest->response->code, 'message' => 'Unexpected HTTP status');
    if ($rest->response->error !== false)
    {
        self::__triggerError(sprintf("S3::getObject({$bucket}, {$uri}): [%s] %s",
            $rest->response->error['code'], $rest->response->error['message']), __FILE__, __LINE__);
        return false;
    }
    return $rest->response;
}
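A usage sketch with this standalone class (the credentials, bucket, key and local path are placeholders):
require_once 'S3.php';

$s3 = new S3('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY');

// Download the object to /tmp/report.pdf.
if (S3::getObject('my-bucket', 'folder/report.pdf', '/tmp/report.pdf') !== false) {
    echo "Saved to /tmp/report.pdf\n";
}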
Here's a link to obtain the class: https://aws.amazon.com/code/1448
Hope this helps.
I am trying to copy a 1 TB file from one bucket to another. I know this can be done easily if I log into the AWS S3 console, but I would like to do it using PHP.
I am using the following AWS S3 class from GitHub:
public static function copyObject($srcBucket, $srcUri, $bucket, $uri, $acl = self::ACL_PRIVATE, $metaHeaders = array(), $requestHeaders = array(), $storageClass = self::STORAGE_CLASS_STANDARD)
{
    $rest = new S3Request('PUT', $bucket, $uri, self::$endpoint);
    $rest->setHeader('Content-Length', 0);
    foreach ($requestHeaders as $h => $v) $rest->setHeader($h, $v);
    foreach ($metaHeaders as $h => $v) $rest->setAmzHeader('x-amz-meta-'.$h, $v);
    if ($storageClass !== self::STORAGE_CLASS_STANDARD) // Storage class
        $rest->setAmzHeader('x-amz-storage-class', $storageClass);
    $rest->setAmzHeader('x-amz-acl', $acl);
    $rest->setAmzHeader('x-amz-copy-source', sprintf('/%s/%s', $srcBucket, rawurlencode($srcUri)));
    if (sizeof($requestHeaders) > 0 || sizeof($metaHeaders) > 0)
        $rest->setAmzHeader('x-amz-metadata-directive', 'REPLACE');
    $rest = $rest->getResponse();
    if ($rest->error === false && $rest->code !== 200)
        $rest->error = array('code' => $rest->code, 'message' => 'Unexpected HTTP status');
    if ($rest->error !== false)
    {
        self::__triggerError(sprintf("S3::copyObject({$srcBucket}, {$srcUri}, {$bucket}, {$uri}): [%s] %s",
            $rest->error['code'], $rest->error['message']), __FILE__, __LINE__);
        return false;
    }
    return isset($rest->body->LastModified, $rest->body->ETag) ? array(
        'time' => strtotime((string)$rest->body->LastModified),
        'hash' => substr((string)$rest->body->ETag, 1, -1)
    ) : false;
}
I am using it in my PHP code as follows:
$s3 = new S3(AWS_ACCESS_KEY, AWS_SECRET_KEY);
$s3->copyObject($srcBucket, $srcName, $bucketName, $saveName, S3::ACL_PUBLIC_READ_WRITE);
I'm getting nothing in the error log. What am I doing wrong, or what am I missing?
At 1 TB, the object is too large to copy in a single operation. To quote from the S3 REST API documentation:
You can store individual objects of up to 5 TB in Amazon S3. You create a copy of your object up to 5 GB in size in a single atomic operation using this API. However, for copying an object greater than 5 GB, you must use the multipart upload API.
Unfortunately, it doesn't appear that the S3 class you're using supports multipart uploads, so you'll need to use something else. I'd strongly recommend that you use Amazon's AWS SDK for PHP — it's a bit bigger and more complex than the one you're using right now, but it supports the entirety of the S3 API (as well as other AWS services!), so it'll be able to handle this operation.
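As a rough sketch of what that could look like with the official AWS SDK for PHP (v3): the S3Client copy() helper chooses, as far as I know, between a single CopyObject call and a multipart copy based on the size of the source object; the bucket and key names below are placeholders:
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// copy() should transparently fall back to a multipart copy for a 1 TB object.
$s3->copy(
    'source-bucket',       // hypothetical source bucket
    'huge-file.dat',       // hypothetical source key
    'destination-bucket',  // hypothetical destination bucket
    'huge-file.dat',       // destination key
    'public-read'
);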