Unable to upload file to sub-folder of main bucket - PHP

I am trying to upload a file to AWS S3, but it fails with the error: "The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint: "test9011960909.s3.amazonaws.com"."
I also specified 'region' => 'us-east-1', but the same error still occurs.
It works when I specify
'Bucket' => $this->bucket,
but I want to upload the file into a sub-folder of the main bucket:
'Bucket' => $this->bucket . "/" . $PrefixFolderPath,
I already applied the accepted answer from AWS S3: The bucket you are attempting to access must be addressed using the specified endpoint, but I am still getting the same error. I am using PHP.
Code:
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

class AWSS3Factory {
    private $bucket;
    private $keyname;

    public function __construct() {
        $this->bucket = AWSS3_BucketName;
        $this->keyname = AWSS3_AccessKey;
        // Instantiate the client.
    }

    public function UploadFile($FullFilePath, $PrefixFolderPath = "") {
        try {
            $s3 = S3Client::factory(array(
                'credentials' => array(
                    'key' => MYKEY,
                    'secret' => MYSECKEY,
                    'region' => 'eu-west-1',
                ),
            ));
            // Upload data.
            $result = $s3->putObject(array(
                'Bucket' => $this->bucket . "/" . $PrefixFolderPath,
                'Key' => $this->keyname,
                'SourceFile' => $FullFilePath,
                'StorageClass' => 'REDUCED_REDUNDANCY',
            ));
            return true;
            // Print the URL to the object.
            //echo $result['ObjectURL'] . "\n";
        } catch (S3Exception $e) {
            echo $e->getMessage() . "\n";
        }
    }
}

You must create the S3 client differently. Note that in your code 'region' is nested inside the 'credentials' array, where the SDK ignores it; it must be a top-level option:
$s3 = S3Client::factory([
    'region' => '',
    'credentials' => ['key' => '***', 'secret' => '***'],
    'version' => 'latest',
]);

You must add $PrefixFolderPath to the 'Key', not to the 'Bucket'; S3 "folders" are just key prefixes:
$result = $s3->putObject(array(
    'Bucket' => $this->bucket,
    'Key' => $PrefixFolderPath . "/" . $this->keyname,
    'SourceFile' => $FullFilePath,
    'StorageClass' => 'REDUCED_REDUNDANCY',
));
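Since the bucket name has to stand alone, the "folder" must live in the key. A small helper can normalize the prefix and file name into one key (a sketch; trimming slashes avoids double separators like "folder//file.txt"):

```php
<?php
// Build an S3 object key from an optional "folder" prefix and a file name.
// S3 has no real folders: the key "uploads/errors/log.txt" simply contains slashes.
function buildS3Key(string $prefix, string $name): string {
    $prefix = trim($prefix, "/");
    $name = ltrim($name, "/");
    return $prefix === "" ? $name : $prefix . "/" . $name;
}

// Usage in the putObject call above: 'Key' => buildS3Key($PrefixFolderPath, $this->keyname)
echo buildS3Key("uploads/errors/", "log.txt"), "\n"; // uploads/errors/log.txt
```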

Related

Unable to move images in S3 bucket from our main server

I want to upload my images to an Amazon S3 bucket. The images are currently stored on another web server, and I am unable to move them to the S3 bucket. I have tried some solutions from Stack Overflow.
Here is the code I have tried:
use \usr\share\php\AWSSDKforPHP\S3;
use \usr\share\php\AWSSDKforPHP\S3\S3Client;
use \usr\share\php\AWSSDKforPHP\S3\Exception\S3Exception;

$bucket = 'cdn.example.com';
$keyname = $newfilename;

$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1',
    'credentials' => [
        'key' => "**********************",
        'secret' => "**********************************"
    ]
]);

try {
    // Upload data.
    $result = $s3->putObject([
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $oldfilepath,
        'ACL' => 'public-read'
    ]);
    // Print the URL to the object.
    echo $result['ObjectURL'] . PHP_EOL;
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
The output shows this error:
Fatal error: Class 'usr\share\php\AWSSDKforPHP\S3\S3Client' not found in /var/www/html/cron-jobs/shift-img.php on line 35
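The error points at the use statements themselves: use only aliases a class name at compile time, it never loads a file, and filesystem paths like \usr\share\php\AWSSDKforPHP\S3 are not PHP namespaces. A sketch of the fix (the autoloader path is an assumption; adjust it to your install):

```php
<?php
// `use` does not include anything, so aliasing a filesystem-like path
// leaves the class undefined, hence "Class ... not found":
assert(!class_exists('usr\share\php\AWSSDKforPHP\S3\S3Client'));

// Load the SDK first (Composer autoloader or aws.phar), then import the
// real SDK namespaces. The path below is illustrative only.
$autoload = __DIR__ . '/vendor/autoload.php'; // or: 'aws.phar'
if (file_exists($autoload)) {
    require $autoload;
}

use Aws\S3\S3Client;              // real SDK namespace
use Aws\S3\Exception\S3Exception; // real SDK namespace
```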

How to customize s3 uploaded files url in yii2?

I have used vlaim\fileupload\FileUpload and yii\web\UploadedFile:
$image = UploadedFile::getInstance($model, 'flag');
$model->flag = new FileUpload(FileUpload::S_S3, [
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => [
        'key' => 'KEY',
        'secret' => 'SECRET'
    ],
    'bucket' => 'mybucket/uploads/flags/' . $model->code
]);
$uploader = $model->flag;
$model->flag = $uploader->uploadFromFile($image)->path;
I am saving the path in the database. How can I customize the URL?
Right now my URL looks like https://s3-us-west-2.amazonaws.com/mybucket%2Fuploads%2Fflags%2Fus/uploads%5C9f%5C7e%5Cc093ad5a.png
I need a URL like https://mybucket.s3.amazonaws.com/uploads/flags/us.png
S3 does not have the concept of folders; it is an object store with key/value pairs. The key for your file would be uploads/flags/us.png.
With the PHP SDK it is easy to set the key of the object:
<?php
// Load the SDK and import the classes the sample uses
// (autoloader path assumed; adjust to your install).
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$USAGE = "\n"
    . "To run this example, supply the name of an S3 bucket and a file to\n"
    . "upload to it.\n"
    . "\n"
    . "Ex: php PutObject.php <bucketname> <filename>\n";

if (count($argv) <= 2) {
    echo $USAGE;
    exit();
}

$bucket = $argv[1];
$file_Path = $argv[2];
$key = basename($argv[2]);

try {
    // Create an S3Client
    $s3Client = new S3Client([
        'region' => 'us-west-2',
        'version' => '2006-03-01'
    ]);
    $result = $s3Client->putObject([
        'Bucket' => $bucket,
        'Key' => $key,
        'SourceFile' => $file_Path,
    ]);
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
For Yii2, I think you need setFsUrl():
http://www.yiiframework.com/extension/yii2-file-upload/#hh8
setFsUrl(string $url)
(Only for Local mode)
Sets the URL. For example, if you set the path to 'http://static.example.com', then after uploading a file will have the URL http://static.example.com/path/to/your/file
Defaults to /
$uploader->setFsPath('http://pathtoyoursite.com');

Amazon s3 SDK for php can not get Bucket object list

I was getting the list of objects from a bucket, but I get an endpoint error.
define('AWS_KEY', 'xxxxxx');
define('AWS_SECRET_KEY', 'x+x/xxxxxxxx/');
define('AWS_CANONICAL_ID', 'xx');
define('AWS_CANONICAL_NAME', 'xxxxx');
$HOST = 's3.amazonaws.com';

require_once 'php_plugins/aws/v1/sdk.class.php';

$Connection = new AmazonS3(array(
    'key' => AWS_KEY,
    'secret' => AWS_SECRET_KEY,
    'canonical_id' => AWS_CANONICAL_ID,
    'canonical_name' => AWS_CANONICAL_NAME,
));

$ListResponse = $Connection->list_buckets();
$Buckets = $ListResponse->body->Buckets->Bucket;
foreach ($Buckets as $Bucket) {
    echo $Bucket->Name . "\t" . $Bucket->CreationDate . "\n";
    $response = $Connection->list_objects($Bucket->Name);
}
This is the response I get:
[body] => CFSimpleXML Object
(
[Code] => PermanentRedirect
[Message] => The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
[Bucket] => pics.online.com
[Endpoint] => pics.online.com.s3.amazonaws.com
[RequestId] => 5F102571A54DA3BA
[HostId] => tBBxwxfUbdlV+m1R/Z9BnjLViyjROdzXrhPfc28WHaZYo/1zAwof2C0G5CVpZvkP8oZERTL0CD8=
)
[status] => 301
I think the error is in the URL. The code calls "host name/bucket name" (I have changed my bucket name here):
https://s3.amazonaws.com/pics.online.com/
but it should call
https://pics.online.com.s3.amazonaws.com/
Can anyone tell me how to change this path for Amazon S3 in PHP?
Try this:
<?php
require 'aws-autoloader.php';

$credentials = new Aws\Credentials\Credentials('XXXXXXXXXXXXX', 'XXXXXXXXXXXXXX');
$bucket = "";

$s3 = \Aws\S3\S3Client::factory([
    'signature' => 'v4',
    'version' => 'latest',
    'region' => 'ap-southeast-1',
    'credentials' => $credentials,
    'http' => [
        'verify' => '/home/ubuntu/cacert.pem'
    ],
    // 'debug' => [
    //     'logfn' => function ($msg) {
    //         echo $msg . "\n";
    //     },
    //     'stream_size' => 0,
    //     'scrub_auth' => true,
    //     'http' => true,
    // ]
]);

try {
    $objects = $s3->getIterator('ListObjects', array(
        'Bucket' => $bucket // bucket name
    ));
    echo "Keys retrieved!\n";
    foreach ($objects as $object) {
        echo $object['Key'] . "\n";
    }
} catch (\Aws\S3\Exception\S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
Download the SDK from AWS PHP SDK.
UPDATE
First, check your system for AWS compatibility by using the following files:
https://github.com/amazonwebservices/aws-sdk-for-php/tree/master/_compatibility_test
You can use cacert.pem to fetch data from AWS S3 over SSL. You can download cacert.pem from here, then modify your php.ini by adding the line below:
curl.cainfo = D:\xampp\php\cacert.pem
Here is the code I use on my local machine for fetching data without HTTPS. I have PHP 5.6.
require 'aws-autoloader.php';

$s3 = \Aws\S3\S3Client::factory(array(
    'credentials' => array(
        'key' => '************',
        'secret' => '*************',
    ),
    'signature' => 'v4',
    'version' => 'latest',
    'region' => 'ap-southeast-1',
));

print_r($result = $s3->listBuckets());
If you face errors, you can use the array format above for initializing the S3 object.
You can specify the hostname using the set_hostname method:
$Connection->set_hostname($HOST); // Set the hostname
$Connection->allow_hostname_override(false); // Stop the hostname being changed later.
Your code should then read as follows:
define('AWS_KEY', 'xxxxxx');
define('AWS_SECRET_KEY', 'x+x/xxxxxxxx/');
define('AWS_CANONICAL_ID', 'xx');
define('AWS_CANONICAL_NAME', 'xxxxx');
$HOST = 's3.amazonaws.com';

require_once 'php_plugins/aws/v1/sdk.class.php';

$Connection = new AmazonS3(array(
    'key' => AWS_KEY,
    'secret' => AWS_SECRET_KEY,
    'canonical_id' => AWS_CANONICAL_ID,
    'canonical_name' => AWS_CANONICAL_NAME,
));
$Connection->set_hostname('pics.online.com.s3.amazonaws.com');
$Connection->allow_hostname_override(false);

$ListResponse = $Connection->list_buckets();
$Buckets = $ListResponse->body->Buckets->Bucket;
foreach ($Buckets as $Bucket) {
    echo $Bucket->Name . "\t" . $Bucket->CreationDate . "\n";
    $response = $Connection->list_objects($Bucket->Name);
}
I'm not sure which region your buckets are in; I assume you have multiple buckets in multiple regions. You cannot call object-related operations with an S3Client pointed at a different region; each client works only for a specific region.
<?php
define('AWS_KEY', 'AKIAXXXXXXXXXXXXXXXX');
define('AWS_SECRET_KEY', 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX');
define('REGION', 'ap-southeast-1');

require_once 'aws-sdk-php-3.4.0/aws-autoloader.php';

$client = new Aws\S3\S3Client(array(
    'credentials' => array(
        'key' => AWS_KEY,
        'secret' => AWS_SECRET_KEY,
    ),
    'region' => REGION,
    'version' => 'latest',
));

$listBuckets = $client->listBuckets();
foreach ($listBuckets['Buckets'] as $bucket) {
    // Get bucket location
    $location = $client->getBucketLocation(array(
        'Bucket' => $bucket['Name']
    ));
    // Print bucket information
    echo $bucket['Name'] . "\t" . $bucket['CreationDate'] . "\t" . $location['LocationConstraint'] . "\n";
    // Check if the bucket location is in the client region
    if ($location['LocationConstraint'] == $client->getRegion()) {
        $listObjects = $client->listObjects(array(
            'Bucket' => $bucket['Name']
        ));
        foreach ($listObjects['Contents'] as $object) {
            echo $object['Key'] . "\t" . $object['Size'] . "\n";
        }
    } else {
        echo "--> The bucket is not in " . $client->getRegion() . ". Skipped.\n";
    }
}
If you are working with multiple regions, you can create a separate S3Client for each region.
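A minimal sketch of that idea: cache one client per region and hand out the right one on demand. The $makeClient callable stands in for however you construct your real client, e.g. new \Aws\S3\S3Client(['region' => $region, 'version' => 'latest', ...]); the stub below only exists so the sketch runs without credentials.

```php
<?php
// Return a per-region client, creating it on first use.
function clientForRegion(string $region, callable $makeClient, array &$cache) {
    if (!isset($cache[$region])) {
        $cache[$region] = $makeClient($region);
    }
    return $cache[$region];
}

// Demo with a stub factory (no AWS calls involved).
$cache = [];
$stub = function ($region) { return "client-for-$region"; };
echo clientForRegion('us-east-1', $stub, $cache), "\n"; // client-for-us-east-1
echo clientForRegion('us-east-1', $stub, $cache), "\n"; // cached, not rebuilt
```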

Upload file on amazon S3 with PHP SDK

I'm trying to upload a picture to my Amazon S3 via their PHP SDK, so I made a little script to do so. However, my script doesn't work, and my exception doesn't give me back any error message.
I'm new to AWS; thank you for your help.
Here is the code:
Config.php
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'PUBLICKEY',
                'secret' => 'PRIVATEKEY',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Index.php
<?php
// Installing AWS SDK via phar
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infact';
$keyname = 'myImage';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory('config.php');

// Upload a file.
try {
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filePath,
        'ContentType' => 'text/plain',
        'ACL' => 'public-read',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
EDIT: I'm now using this code, but it's still not working. I don't even get an error or exception message.
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'infactr';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = 'image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'key',
    'secret' => 'privatekey',
    'region' => 'eu-west-1'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filePath,
        'ACL' => 'public-read',
        'ContentType' => 'image/jpeg'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
Try something like this (from the AWS docs):
<?php
require 'aws.phar';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = '<your bucket name>';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = '/path/to/image.jpg';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'your AWS access key',
    'secret' => 'your AWS secret access key'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ACL' => 'public-read'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
It works fine for me as long as you have the right credentials. Keep in mind that the key name is the name of your file in S3, so if you want your key to have the same name as your file, you have to do something like $keyname = 'image.jpg';. Also, a JPG is generally not a text/plain file type; you can omit the ContentType field, or simply specify image/jpeg.
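That advice can be sketched as a small helper that derives the key from the file name and guesses a ContentType from the extension (the extension map here is an assumption; extend it as needed):

```php
<?php
// Derive the S3 'Key' and 'ContentType' parameters for a local file path.
function s3ParamsForFile(string $path): array {
    $types = [ // minimal extension map; extend as needed
        'jpg' => 'image/jpeg', 'jpeg' => 'image/jpeg',
        'png' => 'image/png', 'txt' => 'text/plain',
    ];
    $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
    return [
        'Key' => basename($path), // key = file name, e.g. "image.jpg"
        'ContentType' => $types[$ext] ?? 'application/octet-stream',
    ];
}

print_r(s3ParamsForFile('/path/to/image.jpg'));
// prints Key => image.jpg, ContentType => image/jpeg
```

These values can then be merged into the putObject options array alongside 'Bucket' and 'SourceFile'.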
$s3 = S3Client::factory('config.php');
should be
$s3 = S3Client::factory(include 'config.php');
For those looking for an up-to-date working version, this is what I am using:
// Instantiate the client.
$s3 = S3Client::factory(array(
    'credentials' => [
        'key' => $s3Key,
        'secret' => $s3Secret,
    ],
    'region' => 'us-west-2',
    'version' => "2006-03-01"
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $s3Bucket,
        'Key' => $fileId,
        'SourceFile' => $filepath . "/" . $fileName
    ));
    return $result['ObjectURL'];
} catch (S3Exception $e) {
    return false;
}
An alternative way to explain it is by showing the curl command and how to build it in PHP; the pragmatic approach.
Please don't stone me for the ugly code; I just thought this example is easy to follow for uploading to Azure from PHP or another language.
$azure1 = 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/';
$azure3 = '?sr=c&si=bucketnameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D';

$shellCmd = 'ls -la ' . $outFileName;
$lsOutput = shell_exec($shellCmd);
#print_r($lsOutput);
$exploded = explode(' ', $lsOutput);
#print_r($exploded);
$fileLength = $exploded[7];

$curlAzure1 = "curl -v -X PUT -T '" . $outFileName . "' -H 'Content-Length: " . $fileLength . "' ";
$buildedCurlForUploading = $curlAzure1 . "'" . $azure1 . $outFileName . $azure3 . "'";
var_dump($buildedCurlForUploading);
shell_exec($buildedCurlForUploading);
This is the actual curl command:
shell_exec("curl -v -X PUT -T 'fileName' -H 'Content-Length: fileSize' 'https://viperprodstorage1.blob.core.windows.net/bucketnameAtAzure/fileName?sr=c&si=bucketNameAtAzure-policy&sig=GJ_verySecretHashFromAzure_aw%3D'")
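One note on the sketch above: the file length can come from PHP's built-in filesize() instead of parsing ls output, which is fragile across locales and ls versions. A minimal, self-contained illustration (the temp file exists only for the demo):

```php
<?php
// filesize() returns the length in bytes directly; no shelling out to `ls`.
$outFileName = tempnam(sys_get_temp_dir(), 'blob'); // demo file
file_put_contents($outFileName, str_repeat('x', 1024));

$fileLength = filesize($outFileName);
echo $fileLength, "\n"; // 1024

unlink($outFileName); // clean up the demo file
```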
Below is the code to upload an image/file to an Amazon S3 bucket.
function upload_agreement_data($target_path, $source_path, $file_name, $content_type)
{
    $fileup_flag = false;

    /*------------- call global settings helper function starts ----------------*/
    $bucketName = "pilateslogic";
    //$global_setting_option = '__cloud_front_bucket__';
    //$bucketName = get_global_settings($global_setting_option);
    /*------------- call global settings helper function ends ----------------*/

    if (!$bucketName) {
        die("ERROR: Template bucket name not found!");
    }

    // Amazon profile_template template js upload URL
    $target_profile_template_js_url = "/" . $bucketName . "/" . $target_path;
    // Caching profile_template template js upload URL
    //$source_profile_template_js_url = dirname(dirname(dirname(__FILE__))).$source_path."/".$file_name;
    // file name
    $template_js_file = $file_name;

    $this->s3->setEndpoint("s3-ap-southeast-2.amazonaws.com");
    if ($this->s3->putObjectFile($source_path, $target_profile_template_js_url, $template_js_file, S3::ACL_PUBLIC_READ, array(), array("Content-Type" => $content_type))) {
        $fileup_flag = true;
    }
    return $fileup_flag;
}

How to update object properties in an AWS S3 Bucket

I've uploaded nearly 25k files (large media files) to an S3 bucket. I used the AWS SDK2 for PHP (S3Client::putObject) to perform the uploads. Now I need to update the metadata for these files, i.e. change the ContentDisposition to attachment and assign a filename.
Is there a way to do this without re-uploading the files? Please help.
Yes, you can use the copyObject method, where you set the CopySource parameter equal to the Bucket and Key parameters.
Example:
// setup your $s3 connection, and define the bucket and key for your resource.
$s3->copyObject(array(
    'Bucket' => $bucket,
    'CopySource' => "$bucket/$key",
    'Key' => $key,
    'Metadata' => array(
        'ExtraHeader' => 'HEADER VALUE'
    ),
    'MetadataDirective' => 'REPLACE'
));
Update Cache Control Metadata on S3 Objects
<?php
define('S3_BUCKET', 'bucket-name');
define('S3_ACCESS_KEY', 'your-access-key');
define('S3_SECRET_KEY', 'secret-key');
define('S3_REGION', 'ap-south-1'); // Mumbai

require 'vendors/aws/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

try {
    $s3 = S3Client::factory(array(
        'version' => 'latest',
        'region' => S3_REGION,
        'credentials' => array(
            'secret' => S3_SECRET_KEY,
            'key' => S3_ACCESS_KEY,
        )
    ));
    $objects = $s3->getIterator('ListObjects', array('Bucket' => S3_BUCKET));
    echo "Keys retrieved!\n";
    foreach ($objects as $object) {
        echo $object['Key'] . "\n";
        $s3->copyObject(array(
            'Bucket' => S3_BUCKET,
            'CopySource' => S3_BUCKET . '/' . $object['Key'],
            'Key' => $object['Key'], // copy the object onto itself, replacing metadata
            'ContentType' => 'image/jpeg',
            'ACL' => 'public-read',
            'StorageClass' => 'REDUCED_REDUNDANCY',
            'CacheControl' => 'max-age=172800',
            'MetadataDirective' => 'REPLACE'
        ));
    }
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
Try this.
To delete an existing object:
$keyname = 'product-file/my-object1.dll';
try {
    $delete = $this->s3->deleteObject([
        'Bucket' => 'belisc',
        'Key' => $keyname
    ]);
    if ($delete['DeleteMarker']) {
        return true;
    } else {
        return false;
    }
} catch (S3Exception $e) {
    return $e->getAwsErrorMessage();
}
To check an object (returns true if the object still exists):
$keyname = 'product-file/my-object1.dll';
try {
    $this->s3->getObject([
        'Bucket' => 'belisc',
        'Key' => $keyname
    ]);
    return true;
} catch (S3Exception $e) {
    return $e->getAwsErrorMessage();
}
Then you can upload the new one:
try {
    return $this->s3->putObject([
        'Bucket' => 'belisc',
        'Key' => 'product-file/MiFlashSetup_eng.rar',
        'SourceFile' => 'c:\MiFlashSetup_eng.rar'
    ]);
} catch (S3Exception $e) {
    die("There was an error uploading the file. " . $e->getAwsErrorMessage());
}
