AWS S3 access denied when getting image by url - php

I am working on an AWS EC2 Ubuntu machine and trying to fetch an image from AWS S3, but the following error is shown every time:
<Error>
  <Code>InvalidArgument</Code>
  <Message>
    Requests specifying Server Side Encryption with AWS KMS managed keys require AWS Signature Version 4.
  </Message>
  <ArgumentName>Authorization</ArgumentName>
  <ArgumentValue>null</ArgumentValue>
  <RequestId>7C8B4BF1CE2FDC9E</RequestId>
  <HostId>
    /L5kjuOET4XFgGter2eFHX+aRSvVm/7VVmIBqQE/oMLeQZ1ditSMZuHPOlsMaKi8hYRnGilTqZY=
  </HostId>
</Error>
Here is my bucket policy:
{
    "Version": "2012-10-17",
    "Id": "Policy1441213815928",
    "Statement": [
        {
            "Sid": "Stmt1441213813464",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::mytest.sample/*"
        }
    ]
}
Here is the code:
require 'aws-autoloader.php';

$credentials = new Aws\Credentials\Credentials('key', 'key');
$bucketName = "mytest.sample";

$s3 = new Aws\S3\S3Client([
    'signature' => 'v4',
    'version' => 'latest',
    'region' => 'ap-southeast-1',
    'credentials' => $credentials,
    'http' => [
        'verify' => '/home/ubuntu/cacert.pem'
    ],
    'Statement' => [
        'Action ' => "*",
    ],
]);

$result = $s3->getObject(array(
    'Bucket' => $bucketName,
    'Key' => 'about_us.jpg',
));
HTML:
<img src="<?php echo $result['#metadata']['effectiveUri']; ?>" />
Edit for Michael - sqlbot: here I am using the default KMS key.
try {
    $result = $this->Amazon->S3->putObject(array(
        'Bucket' => 'mytest.sample',
        'ACL' => 'authenticated-read',
        'Key' => $newfilename,
        'ServerSideEncryption' => 'aws:kms',
        'SourceFile' => $filepath,
        'ContentType' => mime_content_type($filepath),
        'debug' => [
            'logfn' => function ($msg) {
                echo $msg . "\n";
            },
            'stream_size' => 0,
            'scrub_auth' => true,
            'http' => true,
        ],
    ));
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
Let me know if you need more.

For the PHP SDK v2, the Credentials class lives in Aws\Common\Credentials, and to create an S3Client you need the factory method.
Try something like this:
use Aws\S3\S3Client;
use Aws\Common\Credentials\Credentials;

$credentials = new Credentials('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY');

// Instantiate the S3 client with your AWS credentials
$s3Client = S3Client::factory(array(
    'signature' => 'v4',
    'region' => 'ap-southeast-1',
    'credentials' => $credentials,
    // .....
));
If that does not work, you might try declaring an explicit SignatureV4 object:
use Aws\S3\S3Client;
use Aws\Common\Credentials\Credentials;
use Aws\Common\Signature\SignatureV4;

$credentials = new Credentials('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY');

// Instantiate the S3 client with your AWS credentials
$s3Client = S3Client::factory(array(
    'signature' => new SignatureV4(),
    'region' => 'ap-southeast-1',
    'credentials' => $credentials,
    // .....
));
In case you upgrade to SDK v3: you need signature_version (instead of signature) as the parameter when you declare your S3 client. 'Statement' does not appear to be a valid parameter (http://docs.aws.amazon.com/aws-sdk-php/v3/guide/guide/configuration.html#signature-version).
If the issue persists, you can turn on the debug parameter to get more output.
It would look like this:
$s3 = new Aws\S3\S3Client([
    'signature_version' => 'v4',
    'version' => 'latest',
    'region' => 'ap-southeast-1',
    'credentials' => $credentials,
    'http' => [
        'verify' => '/home/ubuntu/cacert.pem'
    ],
    'debug' => true
]);
See here for the full list of available parameters.
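As a usage sketch (not part of the original answer): once the v3 client signs with Signature Version 4, an SSE-KMS object still can't be embedded via a plain unsigned URL, so a pre-signed URL is the usual way to feed it to an <img> tag. Assuming the question's bucket and key:

```php
// Hedged sketch, SDK v3: generate a pre-signed GET URL for the
// SSE-KMS object so the browser can fetch it directly.
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'mytest.sample',
    'Key'    => 'about_us.jpg',
]);
$request = $s3->createPresignedRequest($cmd, '+10 minutes');
$signedUrl = (string) $request->getUri();
// Then in the page: <img src="<?php echo $signedUrl; ?>" />
```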

I have also faced this issue with the aws:kms encryption key. If you want to use a KMS key, you have to create it in the IAM section of the AWS Console. I would recommend AES256 server-side encryption instead: S3 automatically encrypts your data when putting the object and decrypts it when getting it. Please go through the link below:
S3 Server Side encryption with AES256
My solution is to change the line 'ServerSideEncryption' => 'aws:kms' to 'ServerSideEncryption' => 'AES256':
try {
    $result = $this->Amazon->S3->putObject(array(
        'Bucket' => 'mytest.sample',
        'ACL' => 'authenticated-read',
        'Key' => $newfilename,
        'ServerSideEncryption' => 'AES256',
        'SourceFile' => $filepath,
        'ContentType' => mime_content_type($filepath),
        'debug' => [
            'logfn' => function ($msg) {
                echo $msg . "\n";
            },
            'stream_size' => 0,
            'scrub_auth' => true,
            'http' => true,
        ],
    ));
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
Please also update your bucket policy with the JSON below; it will prevent uploading objects without AES256 encryption:
{
    "Sid": "DenyUnEncryptedObjectUploads",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::yourbucketname/*",
    "Condition": {
        "StringNotEquals": {
            "s3:x-amz-server-side-encryption": "AES256"
        }
    }
}

Amazon transcribe "Operation not found: StartTranscriptionJob"

Windows 7 Pro
PHP 7.0.2
AWS transcribe API 2017-10-26
Hi,
I'm trying to use Amazon Web Services to transcribe recordings from an IVR ("Please leave your name after the tone", etc.) using PHP. I can upload the recordings to my AWS bucket (so something is right), but when I try to start the transcription job I get the following error:
"Operation not found: StartTranscriptionJob"
I can get transcription to work using the AWS CLI, so my system seems to be set up OK. There is not a lot online about this issue; I've done all the usual Googling and the info isn't very helpful - such as:
https://docs.aws.amazon.com/transcribe/latest/dg/API_StartTranscriptionJob.html
https://docs.aws.amazon.com/sdk-for-go/api/service/transcribeservice/#TranscribeService.StartTranscriptionJob
Here's my code; the StartTranscriptionJob call is at the end:
<?php
require 'aws\aws-autoloader.php';

chdir('asr');
$logFile = 'asr.log';
$log = "\n\n".date("d/m/Y H:i:s");

###################
# get the recording
###################
$service = htmlspecialchars($_GET["service"]);
$recording = htmlspecialchars($_GET["recording"]);

$ftp_server = "ftp.****";
$ftp_username = "****";
$ftp_userpass = "****";

$ftp_conn = ftp_connect($ftp_server, 7721) or die("Could not connect to $ftp_server");
ftp_login($ftp_conn, $ftp_username, $ftp_userpass);
ftp_chdir($ftp_conn, 'Recordings');
ftp_pasv($ftp_conn, true);
ftp_chdir($ftp_conn, $service);
ftp_get($ftp_conn, $recording, $recording, FTP_BINARY);
ftp_close($ftp_conn);

######################
# Upload to AWS bucket
######################
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$credentials = new Aws\Credentials\Credentials('****', '****');
$bucket = 'asr-bucket-test';
$keyname = $recording;

$s3 = new S3Client([
    'profile' => 'default',
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => $credentials
]);

try {
    // Upload data.
    $result = $s3->putObject([
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => '\xampp\htdocs\****.co.uk\asr\\'.$recording,
        'ACL' => 'public-read'
    ]);
    // Print the URL to the object.
    $log .= "\n".$result['ObjectURL'] . PHP_EOL;
} catch (S3Exception $e) {
    $log .= "\n".$e->getMessage() . PHP_EOL;
}

############
# Transcribe
############
try {
    $result = $s3->StartTranscriptionJob([
        'LanguageCode' => 'en-US', // REQUIRED
        'Media' => [
            'MediaFileUri' => 'https://s3-us-west-2.amazonaws.com/asr-bucket-test/'.$recording,
        ],
        'MediaFormat' => 'wav', // REQUIRED
        'OutputBucketName' => $bucket,
        'Settings' => [
            # 'ChannelIdentification' => true || false,
            'MaxSpeakerLabels' => 5,
            # 'ShowSpeakerLabels' => true || false,
            # 'VocabularyName' => $recording
        ],
        'TranscriptionJobName' => 'test_job', // REQUIRED
    ]);
    $log .= "\n".$result;
} catch (Exception $e) {
    $log .= "\nError: Cannot start transcription job " . $e->getMessage();
}

file_put_contents($logFile, $log, FILE_APPEND);
exit();
?>
You are trying to use S3Client to start the transcription:
$result = $s3->StartTranscriptionJob
Per the error message, S3 doesn't support the StartTranscriptionJob operation. You need to set up and use the TranscribeServiceClient.
You linked to the AWS SDK docs for Go, but your code is PHP. Relevant docs for PHP:
https://docs.aws.amazon.com/aws-sdk-php/v3/api/class-Aws.TranscribeService.TranscribeServiceClient.html
You have to create a new TranscribeServiceClient and then connect through that. Here's a snippet from my working code:
$transcribe = new Aws\TranscribeService\TranscribeServiceClient([
    'region' => $region,
    'version' => 'latest',
    'credentials' => [
        'key' => $accessKey,
        'secret' => $secretKey,
    ],
]);
//print_r($transcribe);

$result = $transcribe->startTranscriptionJob([
    'LanguageCode' => $languageCode, // REQUIRED
    'Media' => [ // REQUIRED
        'MediaFileUri' => $mediaURI,
    ],
    'MediaFormat' => $fileExt, // REQUIRED
    'OutputBucketName' => $bucketName,
    'Settings' => [
        'ChannelIdentification' => false,
        'MaxSpeakerLabels' => 5,
        'ShowSpeakerLabels' => true,
        'VocabularyName' => '',
    ],
    'TranscriptionJobName' => $fileName . "-" . date('Ymd-his') // REQUIRED
]);
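A hedged follow-up sketch (not from the original answer): startTranscriptionJob is asynchronous, so you would typically poll getTranscriptionJob until the status settles. This assumes the same $transcribe client, and that you stored the job name in a variable rather than recomputing the timestamp:

```php
// Poll until the asynchronous job finishes (sketch; tune the sleep/backoff).
// $jobName must be the exact TranscriptionJobName passed above.
do {
    sleep(10);
    $status = $transcribe->getTranscriptionJob([
        'TranscriptionJobName' => $jobName,
    ]);
    $state = $status['TranscriptionJob']['TranscriptionJobStatus'];
} while ($state === 'IN_PROGRESS' || $state === 'QUEUED');

if ($state === 'COMPLETED') {
    // The transcript JSON lands in OutputBucketName; TranscriptFileUri points at it.
    echo $status['TranscriptionJob']['Transcript']['TranscriptFileUri'] . "\n";
}
```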

AWS SDK for PHP - Fatal Error Issue

I am trying to connect to my S3 bucket to upload a file via my server, but whenever I try to run the PHP I encounter the error below. I included the version and region, yet the issue still stands.
Error:
Fatal error: Uncaught exception 'InvalidArgumentException' with message 'Missing required client configuration options: region: (string) A "region" configuration value is required for the "s3" service (e.g., "us-west-2"). A list of available public regions and endpoints can be found at http://docs.aws.amazon.com/general/latest/gr/rande.html. version: (string) A "version" configuration value is required. Specifying a version constraint ensures that your code will not be affected by a breaking change made to the service. For example, when using Amazon S3, you can lock your API version to "2006-03-01". Your build of the SDK has the following version(s) of "s3": * "2006-03-01" You may provide "latest" to the "version" configuration value to utilize the most recent available API version that your client's API provider can find. Note: Using 'latest' in a production application is not recommended. A list of available API versions can be found on each client's API documentation page: http:/ in /srv/http/auploader/include/Aws/ClientResolver.php on line 364
My Code:
<?php
require '/srv/http/test/include/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$bucket = 'testbucket';
$keyname = 'sample';
// $filepath should be absolute path to a file on disk
$filepath = '/srv/http/testfile/setup.html';

// Instantiate the client.
$s3 = S3Client::factory(array(
    'key' => 'blank',
    'secret' => 'blank'
));

try {
    // Upload data.
    $result = $s3->putObject(array(
        'Bucket' => $bucket,
        'Key' => $keyname,
        'SourceFile' => $filepath,
        'ACL' => 'public-read',
        'Region' => 'eu-west-1',
        'Version' => '2006-03-01'
    ));
    // Print the URL to the object.
    echo $result['ObjectURL'] . "\n";
} catch (S3Exception $e) {
    echo $e->getMessage() . "\n";
}
?>
You have to create the S3 client object properly; the keys you have put are misplaced. Please do it as follows:
$s3 = S3Client::factory([
    'version' => 'latest',
    'region' => 'eu-west-1',
    'credentials' => [
        'key' => "your s3 bucket key",
        'secret' => "your s3 bucket secret key",
    ]
]);
Using this s3 client you can call the putObject method something like this:
$result = $s3->putObject(array(
    'Bucket' => "your bucket name",
    'Key' => $keyName,
    'SourceFile' => $filepath,
    'ACL' => 'public-read', // for making the public url
));
Hope it helps!
For SES with AWS SDK v3, use:
/*
 * 1. version as `2010-12-01`
 * 2. region as e.g. `us-east-1`
 */
ini_set("display_errors", 1);

Aws\Ses\SesClient::factory(array(
    'credentials' => array(
        'key' => "someKey",
        'secret' => "someSecret",
    ),
    "region" => "us-east-1",
    "version" => "2010-12-01"
));
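As a usage sketch (with hypothetical addresses, not from the original answer), sending a message through that SES client looks roughly like this:

```php
// Hedged sketch: minimal sendEmail call against the 2010-12-01 SES API.
// Sender must be a verified identity; addresses here are placeholders.
$ses->sendEmail([
    'Source' => 'sender@example.com',
    'Destination' => [
        'ToAddresses' => ['recipient@example.com'],
    ],
    'Message' => [
        'Subject' => ['Data' => 'Test message'],
        'Body' => [
            'Text' => ['Data' => 'Hello from SES'],
        ],
    ],
]);
```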

unable to upload file to sub-folder of main bucket

I am trying to upload an error file to AWS S3, but it shows an error like: "The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint: "test9011960909.s3.amazonaws.com"."
I also specified 'region' => 'us-east-1', but the same error still occurs.
It works when I specify
'Bucket' => $this->bucket,
but I want to upload the file to a sub-folder of the main bucket:
'Bucket' => $this->bucket . "/" . $PrefixFolderPath,
I already applied the approved answer from AWS S3: The bucket you are attempting to access must be addressed using the specified endpoint, but I am still getting the same error. I am using PHP.
Code:
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

class AWSS3Factory {
    private $bucket;
    private $keyname;

    public function __construct() {
        $this->bucket = AWSS3_BucketName;
        $this->keyname = AWSS3_AccessKey;
        // Instantiate the client.
    }

    public function UploadFile($FullFilePath, $PrefixFolderPath="") {
        try {
            $s3 = S3Client::factory(array(
                'credentials' => array(
                    'key' => MYKEY,
                    'secret' => MYSECKEY,
                    'region' => 'eu-west-1',
                )
            ));
            // Upload data.
            $result = $s3->putObject(array(
                'Bucket' => $this->bucket . "/" . $PrefixFolderPath,
                'Key' => $this->keyname,
                'SourceFile' => $FullFilePath,
                'StorageClass' => 'REDUCED_REDUNDANCY'
            ));
            return true;
            // Print the URL to the object.
            //echo $result['ObjectURL'] . "\n";
        } catch (S3Exception $e) {
            echo $e->getMessage() . "\n";
        }
    }
}
You must create the s3 instance in another way, like this:
$s3 = S3Client::factory([
    'region' => '',
    'credentials' => ['key' => '***', 'secret' => '***'],
    'version' => 'latest',
]);
You must add $PrefixFolderPath not to 'Bucket' but to 'Key':
$result = $s3->putObject(array(
    'Bucket' => $this->bucket,
    'Key' => $PrefixFolderPath . "/" . $this->keyname,
    'SourceFile' => $FullFilePath,
    'StorageClass' => 'REDUCED_REDUNDANCY'
));

PHP Amazon SDK, S3 Bucket Access Denied

I'm trying for the first time to use the PHP AWS SDK ("aws/aws-sdk-php": "^3.19") to use S3.
I created a bucket : 'myfirstbucket-jeremyc'
I created a policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::myfirstbucket-jeremyc/*"
            ]
        }
    ]
}
I applied the policy to a group and then created a user 's3-myfirstbucket-jeremyc' in this group.
My PHP code is:
<?php
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

error_reporting(E_ALL);
require(__DIR__ . '/vendor/autoload.php');

$s3Client = S3Client::factory([
    'credentials' => [
        'key' => $_SERVER['AWS_S3_CLIENT_KEY'],
        'secret' => $_SERVER['AWS_S3_CLIENT_SECRET']
    ],
    'region' => 'eu-west-1',
    'version' => 'latest',
    'scheme' => 'http'
]);

$result = $s3Client->putObject(array(
    'Bucket' => 'myfirstbucket-jeremyc',
    'Key' => 'text.txt',
    'Body' => 'Hello, world!',
    'ACL' => 'public-read'
));
But I get this error:
Error executing "PutObject" on
"http://s3-eu-west-1.amazonaws.com/myfirstbucket-jeremyc/text.txt";
AWS HTTP error: Client error: PUT
http://s3-eu-west-1.amazonaws.com/myfirstbucket-jeremyc/text.txt
resulted in a 403 Forbidden response
Do you know where I'm wrong?
Thanks in advance!
You're setting the ACL for the new object but you haven't allowed s3:PutObjectAcl.
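A hedged sketch of the fix: add s3:PutObjectAcl to the policy's action list; the rest of the policy stays as in the question.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::myfirstbucket-jeremyc/*"
            ]
        }
    ]
}
```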

ACL not applying during AWS s3 folder upload (uploadDirectory)

For some reason public-read is not being applied when I upload a folder to an S3 bucket (i.e., the public cannot access the files).
The files upload fine, but they are all set to private. I've tried everything I can think of; it feels like I'm missing something basic.
I was using this guide:
https://blogs.aws.amazon.com/php/post/Tx2W9JAA7RXVOXA/Syncing-Data-with-Amazon-S3
Here is my code:
require '../vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory(array(
    'version' => '2006-03-01',
    'region' => 'ap-southeast-2',
    'credentials' => array(
        'key' => 'MYKEY',
        'secret' => 'MYSECRET',
    )
));

$dir = 'assets';
$bucket = 'gittestbucket';
$keyPrefix = 'assets';

$options = array(
    'params' => array('ACL' => 'public-read'),
    'concurrency' => 20,
    'debug' => true
);

$UploadAWS = $client->uploadDirectory($dir, $bucket, $keyPrefix, $options);
var_dump($UploadAWS);
My IAM user policy (the user is also in a group that can list all buckets):
{
    "Statement": [
        {
            "Action": "s3:*",
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::gittestbucket",
                "arn:aws:s3:::gittestbucket/*"
            ]
        }
    ]
}
Any help much appreciated. Cheers
I struggled with this a while back.
Try changing your upload statement to the one below:
$UploadAWS = $client->uploadDirectory($dir, $bucket, $keyPrefix, array(
    'concurrency' => 20,
    'debug' => true,
    'before' => function (\Aws\Command $command) {
        $command['ACL'] = strpos($command['Key'], 'CONFIDENTIAL') === false
            ? 'public-read'
            : 'private';
    }
));
AWS can be shocking sometimes with its documentation, as it changes so much.