How to rename a folder by prefix in S3 using the PHP SDK

I am working on a project and need to rename a key by its prefix using the S3 PHP SDK. I couldn't find a way to do this; any help is appreciated. Thanks.
function moveFile($oldPath, $newPath) {
    $oKey = $this->getKey($oldPath);
    $nKey = $this->getKey($newPath);
    try {
        // Copy the object to its new key, then delete the original.
        $this->o->copyObject(array(
            'Bucket'     => $this->bucket,
            'ACL'        => 'public-read',
            'Key'        => $nKey,
            'CopySource' => "{$this->bucket}/{$oKey}"
        ));
        $this->deleteFile($oldPath);
        return true;
    } catch (S3Exception $e) {
        echo $e->getMessage() . "\n";
        return false;
    }
}

You can rename S3 files with the code below:
$s3sdk = new Sdk($awsConfig);
$s3 = $s3sdk->createS3();
$s3->registerStreamWrapper();
rename($oldName, $newName);
Both names need to contain the full S3 path, e.g.:
"s3://yourBucketName/path/to/file"
Basically, registerStreamWrapper() enables PHP's filesystem functions to work on S3 objects.
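Once the wrapper is registered, other PHP filesystem functions work on s3:// paths as well. A small sketch (bucket and paths are placeholders):
// With the stream wrapper registered, ordinary filesystem calls hit S3.
$exists   = file_exists('s3://yourBucketName/path/to/file');
$contents = file_get_contents('s3://yourBucketName/path/to/file');
copy('s3://yourBucketName/path/to/file', 's3://yourBucketName/path/to/copy');
unlink('s3://yourBucketName/path/to/copy');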

I ended up doing this myself before the answers came in, but LuFFy's answer is also correct.
function renameFolder($oldPath, $newPath) {
    $oKey = $this->getKey($oldPath);
    if (strpos($oKey, '/') === false) { $oKey .= '/'; }
    try {
        // List every object under the old prefix, copy each one to the new
        // prefix, then delete the old "directory".
        $result = $this->o->listObjects([
            'Bucket' => $this->bucket, // REQUIRED
            'Prefix' => $oKey,
        ]);
        foreach ($result['Contents'] as $file) {
            $nKey = str_replace($this->getLastKey($oldPath), $this->getLastKey($newPath), $file['Key']);
            $this->o->copyObject(array(
                'Bucket'     => $this->bucket,
                'ACL'        => 'public-read',
                'Key'        => $nKey,
                'CopySource' => "{$this->bucket}/" . $file['Key']
            ));
        }
        $this->deleteDir($oldPath);
    } catch (S3Exception $e) {
        echo $e->getMessage() . "\n";
        return false;
    }
}
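One caveat: listObjects returns at most 1,000 keys per call, so larger prefixes need pagination. A rough sketch using the SDK's iterator support, assuming the same $this->o client as above:
// getIterator() transparently follows the pagination markers for you.
$objects = $this->o->getIterator('ListObjects', [
    'Bucket' => $this->bucket,
    'Prefix' => $oKey,
]);
foreach ($objects as $file) {
    // copyObject each $file['Key'] to its new prefix, as above
}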

I have managed to rename existing files in a batch using the steps below.
Let's say your config/filesystems.php looks like this:
'disks' => [
    's3_test_bucket' => [
        'driver' => 's3',
        'key' => env('AWS_KEY', 'your_aws_key_here'),
        'secret' => env('AWS_SECRET', 'your_aws_secret_here'),
        'region' => env('AWS_REGION', 'your_aws_region_here'),
        'version' => 'latest',
        'bucket' => 'my-test-bucket',
    ],
];
Let's say you have a bucket named my-test-bucket on AWS S3.
Let's say you have the following files inside the my-test-bucket/test-directory directory,
i.e.
test-files-1.csv
test-files-2.csv
test-files-3.csv
Call the function below to rename the existing files in a selected directory of the S3 bucket.
$directoryPath = 'test-directory';
$storage = new MyStorageRepository();
$storage->renameAnyExistingFilesOnImportDirectory('my-test-bucket', $directoryPath);
Output: the files should be renamed as below in the my-test-bucket/test-directory directory:
test-files-1--1548870936.csv
test-files-2--1548870936.csv
test-files-3--1548870936.csv
Include the library class below (or its methods) in your project and you should be good.
use Illuminate\Support\Facades\App;
use Illuminate\Support\Facades\Storage;

class MyStorageRepository
{
    public function renameAnyExistingFilesOnImportDirectory($bucket, $directoryPath)
    {
        // The bucket itself comes from the 's3_test_bucket' disk config above;
        // the path is prefixed with the current app environment.
        $directoryPath = App::environment() . '/' . $directoryPath;
        $storage = Storage::disk('s3_test_bucket');
        $suffix = '--' . time(); // File suffix to rename.
        if ($storage->exists($directoryPath)) {
            $this->renameStorageDirectoryFiles($directoryPath, $storage, $suffix);
        }
    }

    private function getNewFilename($filename, $suffix = null)
    {
        $file = (object) pathinfo($filename);
        if (!$suffix) {
            $suffix = '--' . time();
        }
        return $file->dirname . '/' . $file->filename . $suffix . '.' . $file->extension;
    }

    private function renameStorageDirectoryFiles($directoryPath, $storage = null, $suffix = null, $filesystemDriver = null)
    {
        if (!$storage) {
            $storage = Storage::disk($filesystemDriver);
        }
        // List all the existing files in the directory.
        $files = $storage->files($directoryPath);
        if (count($files) < 1) return false;
        foreach ($files as $file) {
            // Build the new filename and move (rename) the file.
            $newFilename = $this->getNewFilename($file, $suffix);
            $storage->move($file, $newFilename);
        }
    }
}
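For clarity, here is a standalone illustration of the renaming rule that getNewFilename() applies; the output matches the timestamped listing above:
// pathinfo() splits the path; the suffix is inserted before the extension.
$file = (object) pathinfo('test-directory/test-files-1.csv');
$suffix = '--1548870936';
echo $file->dirname . '/' . $file->filename . $suffix . '.' . $file->extension;
// prints: test-directory/test-files-1--1548870936.csv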

Related

How to customize the S3 uploaded file URL in Yii2?

I have used vlaim\fileupload\FileUpload and yii\web\UploadedFile:
$image = UploadedFile::getInstance($model, 'flag');
$model->flag = new FileUpload(FileUpload::S_S3, [
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => [
        'key' => 'KEY',
        'secret' => 'SECRET'
    ],
    'bucket' => 'mybucket/uploads/flags/'.$model->code
]);
$uploader = $model->flag;
$model->flag = $uploader->uploadFromFile($image)->path;
In the db I'm saving the path. How can I customize the URL?
Right now my URL looks like https://s3-us-west-2.amazonaws.com/mybucket%2Fuploads%2Fflags%2Fus/uploads%5C9f%5C7e%5Cc093ad5a.png
I need the URL to look like https://mybucket.s3.amazonaws.com/uploads/flags/us.png
S3 does not have the concept of folders; it is an object store with key/value pairs. The key for your file would be uploads/flags/us.png.
With the PHP SDK it's easy to set the key of the object:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$USAGE = "\n" .
    "To run this example, supply the name of an S3 bucket and a file to\n" .
    "upload to it.\n" .
    "\n" .
    "Ex: php PutObject.php <bucketname> <filename>\n";

if (count($argv) <= 2) {
    echo $USAGE;
    exit();
}

$bucket = $argv[1];
$file_Path = $argv[2];
$key = basename($argv[2]);

try {
    // Create an S3Client
    $s3Client = new S3Client([
        'region' => 'us-west-2',
        'version' => '2006-03-01'
    ]);
    $result = $s3Client->putObject([
        'Bucket' => $bucket,
        'Key' => $key,
        'SourceFile' => $file_Path,
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
For Yii2, I think you need to use setFsUrl():
http://www.yiiframework.com/extension/yii2-file-upload/#hh8
setFsUrl(string $url)
(Only for Local mode)
Sets the URL. For example, if you set the URL to 'http://static.example.com', a file after uploading will have the URL http://static.example.com/path/to/your/file
Defaults to /
$uploader->setFsPath('http://pathtoyoursite.com');

Unable to upload a file to a sub-folder of the main bucket

I am trying to upload an error file to AWS S3, but it shows an error like "The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint: "test9011960909.s3.amazonaws.com"."
I also specified 'region' => 'us-east-1', but the same error still occurs.
It works when I specify
'Bucket' => $this->bucket,
but I want to upload the file to a sub-folder of the main bucket:
'Bucket' => $this->bucket . "/" . $PrefixFolderPath,
I already applied the approved answer from AWS S3: The bucket you are attempting to access must be addressed using the specified endpoint
but I'm still getting the same error. I am using PHP.
Code:
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

class AWSS3Factory {
    private $bucket;
    private $keyname;

    public function __construct() {
        $this->bucket = AWSS3_BucketName;
        $this->keyname = AWSS3_AccessKey;
        // Instantiate the client.
    }

    public function UploadFile($FullFilePath, $PrefixFolderPath = "") {
        try {
            $s3 = S3Client::factory(array(
                'credentials' => array(
                    'key' => MYKEY,
                    'secret' => MYSECKEY,
                    'region' => 'eu-west-1',
                )
            ));
            // Upload data.
            $result = $s3->putObject(array(
                'Bucket' => $this->bucket . "/" . $PrefixFolderPath,
                'Key' => $this->keyname,
                'SourceFile' => $FullFilePath,
                'StorageClass' => 'REDUCED_REDUNDANCY'
            ));
            return true;
            // Print the URL to the object.
            //echo $result['ObjectURL'] . "\n";
        } catch (S3Exception $e) {
            echo $e->getMessage() . "\n";
        }
    }
}
You must create the S3 instance in a different way, like this:
$s3 = S3Client::factory([
    'region' => '',
    'credentials' => ['key' => '***', 'secret' => '***'],
    'version' => 'latest',
]);
You must add $PrefixFolderPath not to 'Bucket' but to 'Key':
$result = $s3->putObject(array(
    'Bucket' => $this->bucket,
    'Key' => $PrefixFolderPath . "/" . $this->keyname,
    'SourceFile' => $FullFilePath,
    'StorageClass' => 'REDUCED_REDUNDANCY'
));
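Putting both points together, a sketch of how the corrected UploadFile() method could look (the region and the MYKEY/MYSECKEY constants are placeholders taken from the question):
public function UploadFile($FullFilePath, $PrefixFolderPath = "") {
    try {
        $s3 = S3Client::factory([
            'region' => 'eu-west-1',
            'credentials' => ['key' => MYKEY, 'secret' => MYSECKEY],
            'version' => 'latest',
        ]);
        // Keep the bucket name clean and put the "sub-folder" into the key.
        $result = $s3->putObject(array(
            'Bucket' => $this->bucket,
            'Key' => $PrefixFolderPath . "/" . $this->keyname,
            'SourceFile' => $FullFilePath,
            'StorageClass' => 'REDUCED_REDUNDANCY'
        ));
        return true;
    } catch (S3Exception $e) {
        echo $e->getMessage() . "\n";
        return false;
    }
}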

How to work with aws-sdk-php in a ZF2 project?

I know that there is an aws-sdk-php module for ZF2 named aws-sdk-php-zf2, but part of my project uses the plain SDK, and I would like to work with it inside my ZF2 controllers without having two SDKs: one for plain PHP and another for ZF2 scripts. Is there any way to make this work?
Here is how I use the aws-sdk in a simple PHP script:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// Instantiate an S3 client
$client = S3Client::factory(array(
    'credentials' => array(
        'key' => 'key',
        'secret' => 'secret_key',
    )
));

$bucket = 'bucket_name';
$keyname = 'project_name/file.ext';

$result = $client->deleteObject(array(
    'Bucket' => $bucket,
    'Key' => $keyname
));

print_r($result);
How could I achieve this?
Once the SDK is installed via Composer:
1) Add it to the public/init_autoloader.php file to make the library available throughout the app; this is mine:
// Composer autoloading
if (file_exists('vendor/autoload.php')) {
    $loader = include 'vendor/autoload.php';
}

$zf2Path = false;
if (is_dir('vendor/ZF2/library')) {
    $zf2Path = 'vendor/ZF2/library';
} elseif (getenv('ZF2_PATH')) { // Support for ZF2_PATH environment variable or git submodule
    $zf2Path = getenv('ZF2_PATH');
} elseif (get_cfg_var('zf2_path')) { // Support for zf2_path directive value
    $zf2Path = get_cfg_var('zf2_path');
}

if ($zf2Path) {
    if (isset($loader)) {
        $loader->add('Zend', $zf2Path);
    } else {
        include $zf2Path . '/Zend/Loader/AutoloaderFactory.php';
        Zend\Loader\AutoloaderFactory::factory(array(
            'Zend\Loader\StandardAutoloader' => array(
                'autoregister_zf' => true
            )
        ));
    }
}
2) Use it in the controller as you wish. In my case, the following is a private function inside a controller:
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception as S3Exception;
...
private function s3UploadFile($id, $invalidation = false, $file = null, $content = null) {
    $response = '';
    // Check if the file already exists in S3; if not, then create it.
    try {
        $s3Client = S3Client::factory(array(
            'key' => $this->config['aws']['key'],
            'secret' => $this->config['aws']['secret'],
            'region' => $this->config['aws']['region']
        ));
        if (!$s3Client->doesObjectExist('clients', '/' . $id . '/' . $file)) {
            $s3Client->putObject(array(
                'Bucket' => 'clients',
                'Key' => '/' . $id . '/' . $file,
                'Body' => $content,
                'ACL' => 'public-read'
            ));
        }
    } catch (S3Exception $e) {
        $response = 'error';
    }
    return $response;
}
...
I hope this helps you out.

Upload a file to Amazon S3 with the PHP SDK

I'm trying to upload a picture to my Amazon S3 bucket via their PHP SDK, so I made a little script to do so. However, my script doesn't work and my exception doesn't give me back any error message.
I'm new to AWS; thank you for your help.
Here is the code:
Config.php
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'PUBLICKEY',
                'secret' => 'PRIVATEKEY',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Index.php
<?php
// Installing AWS SDK via phar
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infact';
$keyname = 'myImage';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory('config.php');

// Upload a file.
try {
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filePath,
        'ContentType' => 'text/plain',
        'ACL' => 'public-read',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
EDIT: I'm now using this code, but it's still not working. I don't even get an error or exception message.
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infactr';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'key',
    'secret' => 'privatekey',
    'region' => 'eu-west-1'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filePath,
        'ACL' => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
Try something like this (from the AWS docs):
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '<your bucket name>';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = '/path/to/image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'your AWS access key',
    'secret' => 'your AWS secret access key'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ACL' => 'public-read'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
It works fine for me as long as you have the right credentials. Keep in mind that the key name is the name of the file in S3, so if you want your key to have the same name as your file you have to do something like $keyname = 'image.jpg';. Also, a JPG is generally not a text/plain file type; you can omit the ContentType field entirely or specify image/jpeg instead.
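For example, the relevant lines could look like this (a sketch only; the bucket and path are placeholders):
$keyname = 'image.jpg'; // key matches the original file name
$result = $s3->putObject(array(
    'Bucket' => $bucket,
    'Key' => $keyname,
    'SourceFile' => '/absolute/path/to/image.jpg',
    'ContentType' => 'image/jpeg', // or omit ContentType altogether
    'ACL' => 'public-read'
));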
$s3 = S3Client::factory('config.php');
should be
$s3 = S3Client::factory(include 'config.php');
For those looking for an up-to-date working version, this is what I am using:
// Instantiate the client.
$s3 = S3Client::factory(array(
    'credentials' => [
        'key' => $s3Key,
        'secret' => $s3Secret,
    ],
    'region' => 'us-west-2',
    'version' => "2006-03-01"
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $s3Bucket,
        'Key' => $fileId,
        'SourceFile' => $filepath . "/" . $fileName
    ));
    return $result['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}
An alternative way to explain this is by showing the curl command and how to build it in PHP: the pragmatic approach.
Please don't stone me for the ugly code; I just thought this example is easy to follow for uploading to Azure from PHP or another language.
$azure1 = 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/';
$azure3 = '?sr=c&si=bucketnameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D';

// Determine the file size from the `ls -la` output.
$shellCmd = 'ls -la ' . $outFileName;
$lsOutput = shell_exec($shellCmd);
$exploded = explode(' ', $lsOutput);
$fileLength = $exploded[7];

// Build and run the curl PUT request.
$curlAzure1 = "curl -v -X PUT -T '" . $outFileName . "' -H 'Content-Length: " . $fileLength . "' ";
$buildedCurlForUploading = $curlAzure1 . "'" . $azure1 . $outFileName . $azure3 . "'";
var_dump($buildedCurlForUploading);
shell_exec($buildedCurlForUploading);
This is the actual curl command:
shell_exec("curl -v -X PUT -T 'fileName' -H 'Content-Length: fileSize' 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/fileName?sr=c&si=bucketNameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D'")
Below is code for uploading an image/file to an Amazon S3 bucket (note that this appears to use the standalone S3.php class rather than the official AWS SDK):
function upload_agreement_data($target_path, $source_path, $file_name, $content_type)
{
    $fileup_flag = false;

    /*------------- call global settings helper function starts ----------------*/
    $bucketName = "pilateslogic";
    //$global_setting_option = '__cloud_front_bucket__';
    //$bucketName = get_global_settings($global_setting_option);
    /*------------- call global settings helper function ends ----------------*/

    if (!$bucketName) {
        die("ERROR: Template bucket name not found!");
    }

    // Target path of the template JS upload on Amazon
    $target_profile_template_js_url = "/" . $bucketName . "/" . $target_path;

    // File name
    $template_js_file = $file_name;

    $this->s3->setEndpoint("s3-ap-southeast-2.amazonaws.com");

    if ($this->s3->putObjectFile($source_path, $target_profile_template_js_url, $template_js_file, S3::ACL_PUBLIC_READ, array(), array("Content-Type" => $content_type))) {
        $fileup_flag = true;
    }

    return $fileup_flag;
}

phpqrcode + save cached file to Amazon instead of folder on server

What I do now is create a QR code image (PNG) and save a copy of it in a folder in the root of my server. Now I would like to save this image not on my server but in my Amazon bucket.
I know how to save a file on Amazon, but I can't figure out how to make these work together.
This is my original code for saving it in a root folder on my server:
$fileName = $quiz_url . '.png';
$pngAbsoluteFilePath = APPLICATION_PATH . '/../public/qrcodecaches/' . $fileName;
$urlRelativeFilePath = '/qrcodecaches/' . $fileName;

// generating
if (!file_exists($pngAbsoluteFilePath)) {
    QRcode::png('http://mysitelink.com/s/'.$quiz_url, $pngAbsoluteFilePath, 'L', 4, 2);
}
And this is how I save a file on Amazon:
$bucket = 'mybucket';
$map = 'qrcodecaches';

$client = S3Client::factory(array(
    'key' => 'mykey',
    'secret' => 'mysecret',
));

$fileName = $quiz_url . '.png';
$keyname = $map . '/' . $fileName;

try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'Body' => fopen('/path/to/file', 'r'),
        'ACL' => 'public-read',
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
But what do I need to put in the 'Body' field? Can somebody help me with this?
You can create the image file by passing a temporary file name, like this:
$pngAbsoluteFilePath = tempnam(sys_get_temp_dir(), 'qr_code_');
// tempnam() already creates an empty file, so just write the QR code into it.
QRcode::png('http://mysitelink.com/s/'.$quiz_url, $pngAbsoluteFilePath, 'L', 4, 2);
Then all you need to do is get that file's contents and pass them in as the Body of the object in the bucket:
try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'Body' => file_get_contents($pngAbsoluteFilePath), // like this
        'ACL' => 'public-read'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
After you're done, in case of a successful upload, you can delete the temporary file. This is not strictly necessary, but it is recommended, because otherwise /tmp keeps filling up with QR images until your server is restarted:
if (empty($e)) {
    unlink($pngAbsoluteFilePath);
}
This worked well for me.
The Amazon S3Client documentation suggests that you can poll to check that the upload was successful:
$client->waitUntilObjectExists(array(
    'Bucket' => $bucket,
    'Key' => $keyname
));
Alternatively, if you're already writing the file to disk on your server, you can supply S3Client with the path to that file and let it handle the file stream itself:
try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => '/path/to/file',
        'ACL' => 'public-read'
    ));
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}
Source: http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.S3Client.html#_putObject
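Putting the two answers together, a rough sketch (using the same $client, $bucket, and $keyname as above) that writes the QR PNG to a temporary file, lets the SDK stream it, and cleans up afterwards:
$pngAbsoluteFilePath = tempnam(sys_get_temp_dir(), 'qr_code_');
QRcode::png('http://mysitelink.com/s/' . $quiz_url, $pngAbsoluteFilePath, 'L', 4, 2);

try {
    $client->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $pngAbsoluteFilePath, // the SDK opens and streams the file itself
        'ACL' => 'public-read',
    ));
    // Optionally confirm the object is visible before moving on.
    $client->waitUntilObjectExists(array('Bucket' => $bucket, 'Key' => $keyname));
    unlink($pngAbsoluteFilePath); // remove the temporary file
} catch (S3Exception $e) {
    echo "There was an error uploading the file.\n";
}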
