I'm trying to delete a folder in an S3 bucket. It sits inside a folder called CreativeEngine, so the structure looks like this: CreativeEngine/8943
I want to delete the folder called 8943, but it contains files. Do I need some kind of loop to delete the files first, or can I delete the folder directly? I tried this, but it didn't work:
<?php
$itemId = $_GET['id'];

require('s3/vendor/autoload.php');

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// AWS Info
$bucketName = 'mybucket';
$IAM_KEY    = 'mykey';
$IAM_SECRET = 'mysecret';

// Connect to AWS
$s3 = S3Client::factory(array(
    'credentials' => array(
        'key'    => $IAM_KEY,
        'secret' => $IAM_SECRET
    ),
    'version' => 'latest',
    'region'  => 'us-east-2'
));

$s3Destination = 'CreativeEngine/' . $itemId;
$keyName = $s3Destination;

try {
    $s3->deleteObject(array(
        'Bucket' => $bucketName,
        'Key'    => $keyName
    ));
} catch (S3Exception $e) {
    $data['message'] = '<li>error' . $e->getMessage() . '</li>';
}
?>
This is possible via delete_all_objects($bucket, $pcre) in the original AWS SDK for PHP 1.x (the AmazonS3 class), where $pcre is an optional Perl-Compatible Regular Expression (PCRE) to filter the key names against (the default is PCRE_ALL, which is "/.*/i"), e.g.:

$s3 = new AmazonS3();
$response = $s3->delete_all_objects($bucket, "#myDirectory/.*#");
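delete_all_objects() only exists in that legacy 1.x SDK. The S3Client::factory() call in the question is SDK v2 style; in SDK v3 the client ships an equivalent helper, deleteMatchingObjects(), which lists every key under a prefix and deletes the matches in batches. A minimal sketch, treating the bucket name, region, and credentials from the question as placeholders:

```php
<?php
// Guard the autoloader so the sketch parses even without Composer installed.
if (file_exists(__DIR__ . '/vendor/autoload.php')) {
    require __DIR__ . '/vendor/autoload.php';
}

// Normalise a "folder" name into an S3 key prefix. The trailing slash
// matters: without it the prefix "CreativeEngine/8943" would also match
// keys such as "CreativeEngine/89431/...".
function folderPrefix(string $folder): string
{
    return rtrim($folder, '/') . '/';
}

function deleteFolder(string $bucket, string $folder): void
{
    $s3 = new Aws\S3\S3Client([
        'version'     => 'latest',
        'region'      => 'us-east-2',                                 // placeholder
        'credentials' => ['key' => 'mykey', 'secret' => 'mysecret'],  // placeholders
    ]);
    // Lists all objects under the prefix and deletes them in batches,
    // so no hand-written loop over deleteObject() is required.
    $s3->deleteMatchingObjects($bucket, folderPrefix($folder));
}

// Usage (needs real credentials):
// deleteFolder('mybucket', 'CreativeEngine/8943');
```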
require 'vendor/autoload.php';

use Aws\S3\S3Client;
// use Aws\S3\Exception\S3Exception;

/***************** S3 Digital Ocean connection ***************************************/
// S3 connection
$client = new Aws\S3\S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'endpoint'    => '',
    'credentials' => [
        'key'    => '',
        'secret' => ''
    ],
]);
$date = date("dmy");
$new_bucketName = "s3-backup-new-" . $date;
$prevdate = "070223";
$new_bucketName = "s3-backup-new-" . $prevdate;
$spacesBucket = $client->listBuckets();

// Copying the files and folders
$livefolder = "s3";
$iterator = $client->getIterator('ListObjects', array('Bucket' => $livefolder));
foreach ($iterator as $obj) {
    $response = $client->doesObjectExist($new_bucketName, $livefolder . "/" . $obj['Key']);
    echo $response;
    if ($response != 1) {
        echo "new";
        $client->copyObject([
            'Bucket'     => $new_bucketName,
            'CopySource' => $livefolder . "/" . $obj['Key'],
            'Key'        => $obj['Key'],
        ]);
    }
}
/****************** Delete Old Backup -5days ***********************/
$next_due_date = 's3-backup-new-' . date('dmy', strtotime("-5 days"));
// $next_due_date = 's3-backups';
$spaces = $client->listBuckets();
foreach ($spaces['Buckets'] as $space) {
    if ($space['Name'] == $next_due_date) {
        $objects_delete = $client->getIterator('ListObjects', array('Bucket' => $next_due_date));
        foreach ($objects_delete as $obj_delete) {
            $client->deleteObject([
                'Bucket' => $next_due_date,
                'Key'    => $obj_delete['Key'],
            ]);
        }
        $client->deleteBucket([
            'Bucket' => $next_due_date,
        ]);
    }
}
echo "Copied successfully";
$response = array("Status" => "Success", "Message" => "Export successfully", "Data" => "");
echo json_encode($response);
/****************** End Delete Old Backup -5days ***********************/
This is the code we used to create an S3 bucket, copy content into it, and delete the old backup bucket.
But we are facing another issue: the original bucket contains more than 10 TB of data, and we need to copy all of it to another bucket. It is not working: after about one hour the copy stops, having transferred only 28 GB. How can we copy the whole bucket to another bucket without the process stopping? If anyone knows, please help.
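One likely culprit for a copy that dies after a fixed interval is PHP's execution time limit or the web server's request timeout, not S3 itself. Also note that the existence check above looks up $livefolder . "/" . $obj['Key'] in the new bucket while the copy writes $obj['Key'], so re-runs never skip already-copied objects. A hedged sketch of the same loop written to run from the CLI and to resume cleanly after an interruption, with placeholder bucket names:

```php
<?php
// Guarded so the sketch parses without Composer installed.
if (file_exists(__DIR__ . '/vendor/autoload.php')) {
    require __DIR__ . '/vendor/autoload.php';
}

set_time_limit(0); // lift PHP's time limit; run via `php copy.php`, not a browser

// Build the CopySource string copyObject() expects: "source-bucket/key".
// (Keys with special characters may additionally need URL-encoding.)
function copySource(string $bucket, string $key): string
{
    return $bucket . '/' . $key;
}

function copyBucket(Aws\S3\S3Client $client, string $source, string $dest): int
{
    $copied = 0;
    // The paginator walks every key, 1000 per page, however many there are.
    foreach ($client->getPaginator('ListObjectsV2', ['Bucket' => $source]) as $page) {
        foreach ($page['Contents'] ?? [] as $obj) {
            // Check the same key we are about to write, so an interrupted
            // run can be restarted and will skip what is already copied.
            if ($client->doesObjectExist($dest, $obj['Key'])) {
                continue;
            }
            $client->copyObject([
                'Bucket'     => $dest,
                'CopySource' => copySource($source, $obj['Key']),
                'Key'        => $obj['Key'],
            ]);
            $copied++;
        }
    }
    return $copied;
}
```

Note that a single copyObject call is limited to objects up to 5 GB (larger objects need the SDK's multipart copy helper), and a single-threaded PHP loop over 10 TB will still be slow: `aws s3 sync` from the CLI or S3 Batch Operations parallelise the transfer and are usually a better fit at that scale.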
I have a Laravel project and a version enabled S3 bucket. I can list the versions of all objects within the bucket using the listObjectVersions method.
My attempt to list the versions of a specific object is as follows:
$result = $client->listObjectVersions([
    'Bucket' => \Config::get('filesystems.disks.s3.bucket'),
    'Key'    => $folder . '/' . 'test.png',
]);
This seems to get all objects within the bucket, which is not what I want. Is there a way to get just one file?
I am using the AWS PHP SDK.
Thanks
You can use 'Prefix' instead of 'Key':
$client = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-1'
]);

$folder = $this->getS3Folder($request, $study);

$result = $client->listObjectVersions([
    'Bucket' => \Config::get('filesystems.disks.s3.bucket'),
    'Prefix' => $folder . '/' . 'test.png',
]);
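One caveat worth knowing: Prefix matches every key that starts with the given string, so 'test.png' would also pull in keys like 'test.png.bak'. If you need the versions of exactly one key, filter the result; a small sketch using the shape listObjectVersions() returns:

```php
<?php
// Keep only version entries whose Key is an exact match for $key.
function versionsForKey(array $result, string $key): array
{
    return array_values(array_filter(
        $result['Versions'] ?? [],
        fn (array $v): bool => $v['Key'] === $key
    ));
}

// Example with a hand-built result in the listObjectVersions() shape:
$result = ['Versions' => [
    ['Key' => 'folder/test.png',     'VersionId' => 'v1'],
    ['Key' => 'folder/test.png.bak', 'VersionId' => 'v2'],
]];
$only = versionsForKey($result, 'folder/test.png'); // keeps only the "v1" entry
```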
I have the code below that should upload an image to my S3 bucket, but I don't know how to tell it to upload to a specific folder (named videouploads) in the bucket. Can anyone help?
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Instantiate an Amazon S3 client.
$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'grabage',
    'credentials' => [
        'key'    => 'garbage',
        'secret' => 'grabage'
    ]
]);

$bucketName = 'cgrabage';
$file_Path = __DIR__ . '/my-image.png';
$key = basename($file_Path);

// Upload a publicly accessible file. The file size and type are determined by the SDK.
try {
    $result = $s3->putObject([
        'Bucket' => $bucketName,
        'Key'    => $key,
        'Body'   => fopen($file_Path, 'r'),
        'ACL'    => 'public-read',
    ]);
    echo $result->get('ObjectURL');
} catch (Aws\S3\Exception\S3Exception $e) {
    echo "There was an error uploading the file.\n";
    echo $e->getMessage();
}
You need to put the directory information in the key.
$result = $s3->putObject([
    'Bucket' => $bucketName,
    'Key'    => 'videouploads/' . $key,
    'Body'   => fopen($file_Path, 'r'),
    'ACL'    => 'public-read',
]);
I have used vlaim\fileupload\FileUpload and yii\web\UploadedFile:
$image = UploadedFile::getInstance($model, 'flag');

$model->flag = new FileUpload(FileUpload::S_S3, [
    'version'     => 'latest',
    'region'      => 'us-west-2',
    'credentials' => [
        'key'    => 'KEY',
        'secret' => 'SECRET'
    ],
    'bucket' => 'mybucket/uploads/flags/' . $model->code
]);

$uploader = $model->flag;
$model->flag = $uploader->uploadFromFile($image)->path;
In the DB I'm saving the path. How can I customize the URL?
Now my url looks like https://s3-us-west-2.amazonaws.com/mybucket%2Fuploads%2Fflags%2Fus/uploads%5C9f%5C7e%5Cc093ad5a.png
I need the url like https://mybucket.s3.amazonaws.com/uploads/flags/us.png
S3 does not have the concept of folders; it is an object store with key/value pairs. The key for your file would be uploads/flags/us.png.
With the PHP SDK it's easy to set the key of the object:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$USAGE = "\n" .
    "To run this example, supply the name of an S3 bucket and a file to\n" .
    "upload to it.\n" .
    "\n" .
    "Ex: php PutObject.php <bucketname> <filename>\n";

if (count($argv) <= 2) {
    echo $USAGE;
    exit();
}

$bucket = $argv[1];
$file_Path = $argv[2];
$key = basename($argv[2]);

try {
    // Create an S3Client
    $s3Client = new S3Client([
        'region'  => 'us-west-2',
        'version' => '2006-03-01'
    ]);
    $result = $s3Client->putObject([
        'Bucket'     => $bucket,
        'Key'        => $key,
        'SourceFile' => $file_Path,
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
For yii2 I think you need to call setFsUrl():
http://www.yiiframework.com/extension/yii2-file-upload/#hh8

setFsUrl(string $url)
(Only for local mode.) Sets the URL. For example, if you set it to 'http://static.example.com', a file will have the URL http://static.example.com/path/to/your/file after uploading.
Defaults to /

$uploader->setFsUrl('http://pathtoyoursite.com');
I'm trying to download a private S3 object and store it on the website's server.
Here is what I'm trying:
$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'ap-south-1',
    'credentials' => array(
        'key'    => '*****',
        'secret' => '*******'
    )
]);

$command = $s3->getCommand('GetObject', array(
    'Bucket' => 'bucket_name',
    'Key'    => 'object_name_in_s3',
    'ResponseContentDisposition' => 'attachment; filename="' . $my_file_name . '"'
));

$signedUrl = $command->createPresignedUrl('+15 minutes');
echo $signedUrl;
How can I save these files on my server?
From Get an Object Using the AWS SDK for PHP:
use Aws\S3\S3Client;

$bucket = '*** Your Bucket Name ***';
$keyname = '*** Your Object Key ***';
$filepath = '*** Your File Path ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Save object to a file.
$result = $s3->getObject(array(
    'Bucket' => $bucket,
    'Key'    => $keyname,
    'SaveAs' => $filepath
));
If you just want to download a file from the command line (instead of an app), you can use the AWS Command-Line Interface (CLI) -- it has an aws s3 cp command.
The Pre-signed URL in your code can be used to grant time-limited access to a private object stored in an Amazon S3 bucket. Typically, your application generates the URL and includes it in a web page for users to click and download the object. There is no need to use it on the server-side, because the server would have credentials that are authorized to access content in Amazon S3.
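If you do still want a pre-signed URL for browsers, note that calling createPresignedUrl() on a command is the v2 API; in SDK v3 the equivalent is S3Client::createPresignedRequest(). A sketch, with the bucket/key names as placeholders:

```php
<?php
// Guarded so the sketch parses without Composer installed.
if (file_exists(__DIR__ . '/vendor/autoload.php')) {
    require __DIR__ . '/vendor/autoload.php';
}

// The Content-Disposition header value that makes browsers download the
// object under a friendly filename.
function attachmentDisposition(string $filename): string
{
    return 'attachment; filename="' . $filename . '"';
}

function presignedDownloadUrl(Aws\S3\S3Client $s3, string $bucket, string $key, string $filename): string
{
    $command = $s3->getCommand('GetObject', [
        'Bucket' => $bucket,
        'Key'    => $key,
        'ResponseContentDisposition' => attachmentDisposition($filename),
    ]);
    // SDK v3: sign the command, then read the URL off the PSR-7 request.
    $request = $s3->createPresignedRequest($command, '+15 minutes');
    return (string) $request->getUri();
}

// Server-side saving needs no pre-signed URL at all; getObject with
// SaveAs (as in the answer above) writes straight to disk:
// $s3->getObject(['Bucket' => 'bucket_name', 'Key' => 'object_name_in_s3',
//                 'SaveAs' => '/path/on/server/file.ext']);
```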