PHP AWS AssumeRole with InstanceProfile throws Exception - php

Trying to get this code to work. We have it set up so we shouldn't need to read creds from a file. But it's still looking for them.
$provider = \Aws\Credentials\CredentialProvider::instanceProfile();
call_user_func( $provider )->wait();
$config = [
    'profile' => 'default',
    'region' => 'us-east-1',
    'version' => '2011-06-15',
    'credentials' => $provider,
    'http' => [
        'connect_timeout' => 30, // By default these wait indefinitely
        'timeout' => 60,
    ]
];
try {
    $stsClient = new StsClient($config);
    $stsResult = $stsClient->assumeRole([
        'RoleArn' => 'arn:aws:iam::1234:role/my-role',
        'RoleSessionName' => 'MySession'
    ]);
} catch (\Exception $e) {
    echo 'Caught exception: ', $e->getMessage(), "\n";
}
But instead of picking it up from the instance, it's throwing an exception:
Cannot read credentials from /home/user/.aws/credentials

Remove
'profile' => 'default',
from the config. Setting the profile option tells the SDK to read credentials from the shared credentials file (~/.aws/credentials), so with it in place the client will always try the filesystem instead of the instance-profile provider you passed.
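For reference, here is a minimal sketch of the same client with the profile key dropped, so the instance-profile provider is actually used (the autoloader path, role ARN and region are just the placeholders from the question):
require 'vendor/autoload.php'; // adjust to wherever your autoloader lives
use Aws\Sts\StsClient;
use Aws\Credentials\CredentialProvider;
// Resolve credentials from the EC2 instance metadata service only.
$provider = CredentialProvider::instanceProfile();
$stsClient = new StsClient([
    'region'      => 'us-east-1',
    'version'     => '2011-06-15',
    'credentials' => $provider, // no 'profile' key, so no ~/.aws/credentials lookup
]);
$stsResult = $stsClient->assumeRole([
    'RoleArn'         => 'arn:aws:iam::1234:role/my-role',
    'RoleSessionName' => 'MySession',
]);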


AWS S3Client resolves the wrong url when using getIterator with an IP on PHP

I am unable to use the getIterator function of S3Client due to it somehow reversing the url.
Instead of it looking for http://192.168.120.70/bucket it returns this:
Could not resolve host: bucket.192.168.120.70
I'm sure there is something simple I'm overlooking.
<?php
require '/Applications/MAMP/htdocs/lab/aws/aws-autoloader.php';
use Aws\S3\S3Client;
use Aws\Exception\AwsException;
$bucketName = 'bucket';
$IAM_KEY = 'MY-KEY';
$IAM_SECRET = 'MY-SECRET';
// Connect to AWS
try {
$s3 = S3Client::factory(
array(
'credentials' => array(
'key' => $IAM_KEY,
'secret' => $IAM_SECRET
),
'version' => 'latest',
'region' => 'eu-west-1',
'endpoint' => 'http://192.168.120.70/',
'profile' => 'MY-PROFILE'
)
);
} catch (Exception $e) {
die("Error: " . $e->getMessage());
}
$buckets = $s3->listBuckets();
foreach ($buckets['Buckets'] as $bucket) {
echo $bucket['Name'] . "\n";
}
// returns -> bucket
$obj = $s3->getIterator('ListObjects', array('Bucket' => 'bucket'));
foreach ($obj as $object) {
var_dump($object);
}
// Error -> Could not resolve host: bucket.192.168.120.70
?>
The full error :
Fatal error: Uncaught exception 'Aws\S3\Exception\S3Exception' with message 'Error executing
"ListObjects" on "http://bucket.192.168.120.70/?encoding-type=url";
AWS HTTP error: cURL error 6: Could not resolve host: bucket.192.168.120.70 (see
https://curl.haxx.se/libcurl/c/libcurl-errors.html)' GuzzleHttp\Exception\ConnectException: cURL
error 6: Could not resolve host: bucket.192.168.120.70
(see https://curl.haxx.se/libcurl/c/libcurl-errors.html) in
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php:200 Stack trace: #0
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php(155):
GuzzleHttp\Handler\CurlFactory::createRejection(Object(GuzzleHttp\Handler\EasyHandle), Array) #1
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php(105):
GuzzleHttp\Handler\CurlFactory::finishError(Object(GuzzleHttp\Handler\CurlMultiHandler),
Object(GuzzleHttp\Handler\EasyHandle), Object(GuzzleHttp\Handler\CurlFactory)) #2
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Han in
/Applications/MAMP/htdocs/lab/aws/Aws/WrappedHttpHandler.php on line 195
This is all you would need to accomplish your goal of listing buckets. The endpoint option is used for specific services and is not needed in this scenario:
$client = S3Client::factory(
array(
'credentials' => array(
'key' => $IAM_KEY,
'secret' => $IAM_SECRET,
),
'region' => 'eu-west-1',
'version' => 'latest')
);
It seems it was an issue in my S3Client creation; after this change it worked.
Strangely, with my previous code it was only the first bucket that wasn't working.
// Connect to AWS
try {
// You may need to change the region. It will say in the URL when the bucket is open
// and on creation.
$s3 = S3Client::factory(
array(
'version' => 'latest',
'region' => 'Standard',
'endpoint' => 'http://192.167.120.60/', // Needed in my case
'use_path_style_endpoint' => true, // Needed in my case
'credentials' => array(
'key' => $IAM_KEY,
'secret' => $IAM_SECRET
)
)
);
} catch (Exception $e) {
// We use a die, so if this fails. It stops here. Typically this is a REST call so this would
// return a json object.
die("Error: " . $e->getMessage());
}
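The key option is use_path_style_endpoint. By default the SDK builds virtual-hosted-style URLs and prefixes the bucket to the hostname (hence bucket.192.168.120.70, which DNS cannot resolve for a bare IP); path-style keeps the bucket in the path instead. A minimal sketch using the placeholder endpoint and credentials from the question:
use Aws\S3\S3Client;
$s3 = new S3Client([
    'version'                 => 'latest',
    'region'                  => 'eu-west-1',
    'endpoint'                => 'http://192.168.120.70/',
    'use_path_style_endpoint' => true, // requests become http://192.168.120.70/bucket/key
    'credentials'             => ['key' => $IAM_KEY, 'secret' => $IAM_SECRET],
]);
// getIterator pages through ListObjects transparently.
foreach ($s3->getIterator('ListObjects', ['Bucket' => $bucketName]) as $object) {
    echo $object['Key'] . "\n";
}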

Amazon transcribe "Operation not found: StartTranscriptionJob"

Windows 7 Pro
PHP 7.0.2
AWS transcribe API 2017-10-26
Hi,
I'm trying to use Amazon Web Services to transcribe recordings from an IVR (Please leave your name after the tone. etc) using PHP. I can upload the recordings to my AWS bucket (so something is right) but when I try to start the transcription job I get the following error:
"Operation not found: StartTranscriptionJob"
I can get transcription to work using the AWS CLI, so my system seems to be set up OK. There is not a lot online about this issue; I've done all the usual Googling and the info isn't very helpful, such as:
https://docs.aws.amazon.com/transcribe/latest/dg/API_StartTranscriptionJob.html
https://docs.aws.amazon.com/sdk-for-go/api/service/transcribeservice/#TranscribeService.StartTranscriptionJob
Here's my code, the StartTranscriptionJob is at the end:
<?php
require 'aws\aws-autoloader.php';
chdir('asr');
$logFile = 'asr.log';
$log = "\n\n".date("d/m/Y H:i:s");
###################
# get the recording
###################
$service = htmlspecialchars($_GET["service"]);
$recording = htmlspecialchars($_GET["recording"]);
$ftp_server = "ftp.****";
$ftp_username = "****";
$ftp_userpass = "****";
$ftp_conn = ftp_connect($ftp_server,7721) or die("Could not connect to $ftp_server");
ftp_login($ftp_conn, $ftp_username, $ftp_userpass);
ftp_chdir($ftp_conn,'Recordings');
ftp_pasv($ftp_conn, true);
ftp_chdir($ftp_conn,$service);
ftp_get ($ftp_conn , $recording , $recording, FTP_BINARY);
ftp_close($ftp_conn);
######################
# Upload to AWS bucket
######################
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$credentials = new Aws\Credentials\Credentials('****', '****');
$bucket = 'asr-bucket-test';
$keyname = $recording;
$s3 = new S3Client([
'profile' => 'default',
'version' => 'latest',
'region' => 'us-west-2',
'credentials' => $credentials
]);
try {
// Upload data.
$result = $s3->putObject([
'Bucket' => $bucket,
'Key' => $keyname,
'SourceFile' => '\xampp\htdocs\****.co.uk\asr\\'.$recording,
'ACL' => 'public-read'
]);
// Print the URL to the object.
$log.= "\n".$result['ObjectURL'] . PHP_EOL;
} catch (S3Exception $e) {
$log.= "\n".$e->getMessage() . PHP_EOL;
}
############
# Transcribe
############
try {
$result = $s3->StartTranscriptionJob([
'LanguageCode' => 'en-US', // REQUIRED
'Media' => [
'MediaFileUri' => 'https://s3-us-west-2.amazonaws.com/asr-bucket-test/'.$recording,
],
'MediaFormat' => 'wav', // REQUIRED
'OutputBucketName' => $bucket,
'Settings' => [
# 'ChannelIdentification' => true || false,
'MaxSpeakerLabels' => 5,
# 'ShowSpeakerLabels' => true || false,
# 'VocabularyName' => $recording
],
'TranscriptionJobName' => 'test_job', // REQUIRED
]);
$log.="\n".$result;
} catch (Exception $e) {
$log.="\nError: Cannot start transcripion job " . $e->getMessage();
}
file_put_contents($logFile, $log,FILE_APPEND);
exit();
?>
You are trying to use S3Client to start the transcription:
$result = $s3->StartTranscriptionJob
Per the error message, S3 doesn't support the StartTranscriptionJob operation. You need to set up and use the TranscribeServiceClient.
You linked to the AWS SDK docs for Go, but your code is PHP. Relevant docs for PHP:
https://docs.aws.amazon.com/aws-sdk-php/v3/api/class-Aws.TranscribeService.TranscribeServiceClient.html
You have to create a new TranscribeServiceClient and then connect through that. Here's a snippet from my working code:
$transcribe = new Aws\TranscribeService\TranscribeServiceClient([
'region' => $region,
'version' => 'latest',
'credentials' => [
'key' => $accessKey,
'secret' => $secretKey,
],
]);
//print_r($transcribe);
$result = $transcribe->startTranscriptionJob([
'LanguageCode' => $languageCode, // REQUIRED
'Media' => [ // REQUIRED
'MediaFileUri' => $mediaURI,
],
'MediaFormat' => $fileExt, // REQUIRED
'OutputBucketName' => $bucketName,
'Settings' => [
'ChannelIdentification' => false,
'MaxSpeakerLabels' => 5,
'ShowSpeakerLabels' => true,
'VocabularyName' => '',
],
'TranscriptionJobName' => $fileName . "-" . date('Ymd-his') // REQUIRED
]);
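startTranscriptionJob only kicks the job off; to get the transcript you poll getTranscriptionJob until the job finishes. A rough sketch, assuming $transcribe is the client above and $jobName holds the TranscriptionJobName you passed (the sleep interval is arbitrary):
do {
    sleep(10); // crude polling; use notifications or a queue in production
    $status = $transcribe->getTranscriptionJob([
        'TranscriptionJobName' => $jobName,
    ]);
    $state = $status['TranscriptionJob']['TranscriptionJobStatus'];
} while ($state === 'IN_PROGRESS' || $state === 'QUEUED');
if ($state === 'COMPLETED') {
    // URI of the transcript JSON (in your OutputBucketName when one was set)
    echo $status['TranscriptionJob']['Transcript']['TranscriptFileUri'] . "\n";
}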

PHP SoapClient Cannot process the message because the content type 'text/xml;

I cannot connect to the web service and send/receive data.
Error
HTTP,Cannot process the message because the content type 'text/xml;
charset=utf-8' was not the expected type 'application/soap+xml;
charset=utf-8'.
Code
$parameters = [
'UserName' => 12324,
'Password' => 432123,
'Bill_Id' => 153585611140,
'Payment_Id' => 8560103,
];
$url="https://bill.samanepay.com/CheckBill/BillStateService.svc?wsdl";
$method = "VerifyBillPaymentWithAddData";
$client = new SoapClient($url);
try{
$info = $client->__call($method, array($parameters));
}catch (SoapFault $fault){
die($fault->faultcode.','.$fault->faultstring);
}
Note: SOAP version 1.1 does not work either, and the other fixes for this error that I found on Stack Overflow did not resolve it.
You could try
$url = "https://bill.samanepay.com/CheckBill/BillStateService.svc?wsdl";
try {
$client = new SoapClient($url, [
"soap_version" => SOAP_1_2, // SOAP_1_1
'cache_wsdl' => WSDL_CACHE_NONE, // WSDL_CACHE_MEMORY
'trace' => 1,
'exceptions' => true,
'keep_alive' => false,
'connection_timeout' => 500000
]);
print_r($client->__getFunctions());
} catch (SOAPFault $f) {
error_log('ERROR => '.$f);
}
to verify that your method name is correct.
There you can see the method
VerifyBillPaymentWithAddDataResponse VerifyBillPaymentWithAddData(VerifyBillPaymentWithAddData $parameters)
Next, check the type VerifyBillPaymentWithAddData and whether the parameter can be passed as an array.
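If you want to inspect that type as well, SoapClient can dump the WSDL's complex types alongside its functions; a quick sketch using the same $client:
// Lists the structs defined in the WSDL, e.g. the fields VerifyBillPaymentWithAddData expects
print_r($client->__getTypes());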
Also you could test to call the method via
$client->VerifyBillPaymentWithAddData([
'UserName' => 12324,
'Password' => 432123,
'Bill_Id' => 153585611140,
'Payment_Id' => 8560103,
]);
or keep your __call but without wrapping the parameters in an additional array:
$info = $client->__call($method, $parameters);
EDIT:
According to https://stackoverflow.com/a/5409465/1152471, the error could be on the server side, because the server sends back a header that is not compatible with the SOAP 1.2 standard.
Maybe you have to use a third-party library or even raw sockets to get it working.
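With trace enabled as in the snippet above, you can also dump the raw HTTP exchange to confirm which Content-Type each side is actually using; a small sketch, assuming the same $client and $parameters:
try {
    $client->VerifyBillPaymentWithAddData($parameters);
} catch (SoapFault $fault) {
    // Compare the Content-Type sent by the client with what the server answers.
    echo $client->__getLastRequestHeaders();
    echo $client->__getLastResponseHeaders();
}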
Just use the following function. Have fun!
function WebServices($function, $parameters){
$username = '***';
$password = '***';
$url = "http://*.*.*.*/*/*/*WebService.svc?wsdl";
$service_url = 'http://*.*.*.*/*/*/*WebService.svc';
$client = new SoapClient($url, [
"soap_version" => SOAP_1_2,
"UserName"=>$username,
"Password"=>$password,
"SOAPAction"=>"http://tempuri.org/I*WebService/$function",
'cache_wsdl' => WSDL_CACHE_NONE, // WSDL_CACHE_MEMORY
'trace' => 1,
'exceptions' => true,
'keep_alive' => false,
'connection_timeout' => 500000
]);
$action = new \SoapHeader('http://www.w3.org/2005/08/addressing', 'Action', "http://tempuri.org/I*WebService/$function");
$to = new \SoapHeader('http://www.w3.org/2005/08/addressing', 'To', $service_url);
$client->__setSoapHeaders([$action, $to]);
try{
return $client->__call($function, $parameters);
} catch(SoapFault $e){
return $e->getMessage();
}
}

AWS-CloudWatch: InvalidSequenceTokenException

I have a PHP worker where I log events to AWS CloudWatch.
Unfortunately I get the following error when I try to submit them.
InvalidSequenceTokenException Error executing "PutLogEvents" on
"https://logs.eu-west-1.amazonaws.com"; AWS HTTP error: Client error:
POST https://logs.eu-west-1.amazonaws.com resulted in a 400 Bad
Request response:
{"__type":"InvalidSequenceTokenException","expectedSequenceToken":"999999999999990356407851919528174
(truncated...) InvalidSequenceTokenException (client): The given
sequenceToken is invalid. The next expected sequenceToken is:
495599999999988500356407851919528174642 -
{"__type":"InvalidSequenceTokenException","expectedSequenceToken":"495573099999999900356407851919528174642","message":"The
given sequenceToken is invalid. The next expected sequenceToken is:
495579999999900356407851919528174642"}
and this is my code
$date = new DateTime();
$instance = new CloudWatchLogsClient([
'region' => 'eu-west-1',
'version' => 'latest',
'credentials' => [
'key' => 'XXX',
'secret' => 'XXXX'
]
]);
$instance->putLogEvents([
'logGroupName' => "WorkerLog",
'logStreamName' => "log",
'logEvents' => [
[
'timestamp' => $date->getTimestamp(),
'message' => "test log"
]
]
]);
http://docs.aws.amazon.com/AmazonCloudWatchLogs/latest/APIReference/API_PutLogEvents.html
You must include a sequence token with your request. If you don't have one, you must use describeLogStreams (http://docs.aws.amazon.com/AmazonCloudWatchLogs/latest/APIReference/API_DescribeLogStreams.html) to get the stream's current sequence token.
When you make a call to putLogEvents you will get the nextToken in the response. You also must be ready for the case in which someone else pushes to the stream and invalidates your nextToken (in that case you need to describe the stream again to get an updated token).
DescribeLogStreams does not support the same call volume as PutLogEvents, so you may get throttled if you call it frequently.
The recommended way is to call PutLogEvents directly and catch the InvalidSequenceTokenException, then retry PutLogEvents with the sequence token from the exception message.
The correct sequence token can be found in the expectedSequenceToken field of the InvalidSequenceTokenException:
use Aws\CloudWatchLogs\Exception\CloudWatchLogsException;
try {
    $result = $client->describeLogStreams([
        'logGroupName' => $logGroupName,
        'logStreamNamePrefix' => $logStreamName,
    ]);
    // Read the token off the matching stream.
    $logStreams = $result->get('logStreams');
    $uploadSequenceToken = $logStreams[0]['uploadSequenceToken'];
    $client->putLogEvents([
        'logGroupName' => $logGroupName,
        'logStreamName' => $logStreamName,
        'logEvents' => [
            [
                'timestamp' => $timestamp,
                'message' => $message
            ],
        ],
        'sequenceToken' => $uploadSequenceToken,
    ]);
} catch (CloudWatchLogsException $e) {
    // The SDK surfaces InvalidSequenceTokenException as a CloudWatchLogsException;
    // the expected token is in the error response body.
    if ($e->getAwsErrorCode() === 'InvalidSequenceTokenException') {
        $body = json_decode((string) $e->getResponse()->getBody(), true);
        $client->putLogEvents([
            'logGroupName' => $logGroupName,
            'logStreamName' => $logStreamName,
            'logEvents' => [
                [
                    'timestamp' => $timestamp,
                    'message' => $message
                ],
            ],
            'sequenceToken' => $body['expectedSequenceToken'],
        ]);
    }
}
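As mentioned above, the PutLogEvents response itself carries the token for the next call, so a single writer can skip DescribeLogStreams after the first call; a rough sketch under that assumption:
// First call on a fresh stream: no sequenceToken needed.
$result = $client->putLogEvents([
    'logGroupName'  => $logGroupName,
    'logStreamName' => $logStreamName,
    'logEvents'     => [['timestamp' => round(microtime(true) * 1000), 'message' => 'first']],
]);
// Reuse the returned token for every subsequent call from this writer.
$nextToken = $result['nextSequenceToken'];
$client->putLogEvents([
    'logGroupName'  => $logGroupName,
    'logStreamName' => $logStreamName,
    'logEvents'     => [['timestamp' => round(microtime(true) * 1000), 'message' => 'second']],
    'sequenceToken' => $nextToken,
]);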
This is my working solution: before sending a new putLogEvents request, you must fetch the latest uploadSequenceToken.
try {
$client = \Aws\CloudWatchLogs\CloudWatchLogsClient::factory($configCloudWatch);
$logStreamName = 'testLogStream';
$logGroupName = 'testGroupName';
$result = $client->describeLogStreams([
'logGroupName' => $logGroupName,
'logStreamNamePrefix' => $logStreamName,
]);
$logStreams=$result->get('logStreams');
if (!$logStreams)
throw new \Exception('No log stream found');
if (count($logStreams)!=1)
throw new \Exception('Multiple log stream found');
$uploadSequenceToken = $logStreams[0]['uploadSequenceToken'];
$client->putLogEvents([
'logGroupName' => $logGroupName,
'logStreamName' => $logStreamName,
'logEvents' => [
[
'timestamp' => round(microtime(true) * 1000),
// message is required
'message' => json_encode([ ... ]
),
],
],
'sequenceToken' => $uploadSequenceToken,
]);
} catch (\Exception $e) {
\Log::error(__METHOD__, ['exception' => $e]);
}

AWS S3 access denied when getting image by url

I am working on an AWS EC2 Ubuntu machine and trying to fetch an image from AWS S3, but the following error is shown every time.
<Error>
<Code>InvalidArgument</Code>
<Message>
Requests specifying Server Side Encryption with AWS KMS managed keys require AWS Signature Version 4.
</Message>
<ArgumentName>Authorization</ArgumentName>
<ArgumentValue>null</ArgumentValue>
<RequestId>7C8B4BF1CE2FDC9E</RequestId>
<HostId>
/L5kjuOET4XFgGter2eFHX+aRSvVm/7VVmIBqQE/oMLeQZ1ditSMZuHPOlsMaKi8hYRnGilTqZY=
</HostId>
</Error>
Here is my bucket policy
{
"Version": "2012-10-17",
"Id": "Policy1441213815928",
"Statement": [
{
"Sid": "Stmt1441213813464",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::mytest.sample/*"
}
]
}
Here is the code
require 'aws-autoloader.php';
$credentials = new Aws\Credentials\Credentials('key', 'key');
$bucketName = "mytest.sample";
$s3 = new Aws\S3\S3Client([
'signature' => 'v4',
'version' => 'latest',
'region' => 'ap-southeast-1',
'credentials' => $credentials,
'http' => [
'verify' => '/home/ubuntu/cacert.pem'
],
'Statement' => [
'Action ' => "*",
],
]);
$result = $s3->getObject(array(
'Bucket' => $bucketName,
'Key' => 'about_us.jpg',
));
Html
<img src="<?php echo $result['@metadata']['effectiveUri']; ?>" />
Edit for Michael - sqlbot : here I am using default KMS.
try {
$result = $this->Amazon->S3->putObject(array(
'Bucket' => 'mytest.sample',
'ACL' => 'authenticated-read',
'Key' => $newfilename,
'ServerSideEncryption' => 'aws:kms',
'SourceFile' => $filepath,
'ContentType' => mime_content_type($filepath),
'debug' => [
'logfn' => function ($msg) {
echo $msg . "\n";
},
'stream_size' => 0,
'scrub_auth' => true,
'http' => true,
],
));
} catch (S3Exception $e) {
echo $e->getMessage() . "\n";
}
let me know if you need more.
For PHP SDK v2, the Credentials class is in the Aws\Common\Credentials namespace, and you create an S3Client through its factory method.
Try something like this
use Aws\S3\S3Client;
use Aws\Common\Credentials\Credentials;
$credentials = new Credentials('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY');
// Instantiate the S3 client with your AWS credentials
$s3Client = S3Client::factory(array(
    'signature' => 'v4',
    'region' => 'ap-southeast-1',
    'credentials' => $credentials,
    // ...
));
If that does not work, you might try declaring a SignatureV4 object explicitly:
use Aws\S3\S3Client;
use Aws\Common\Credentials\Credentials;
use Aws\Common\Signature\SignatureV4;
$credentials = new Credentials('YOUR_ACCESS_KEY', 'YOUR_SECRET_KEY');
// Instantiate the S3 client with your AWS credentials
$s3Client = S3Client::factory(array(
    'signature' => new SignatureV4(),
    'region' => 'ap-southeast-1',
    'credentials' => $credentials,
    // ...
));
In case you upgrade to sdk v3
You need to pass signature_version (instead of signature) as a parameter when you declare your S3 client.
Statement does not appear to be a valid parameter (http://docs.aws.amazon.com/aws-sdk-php/v3/guide/guide/configuration.html#signature-version).
If you still have issues, you can turn on the debug parameter to get more output.
This would look like this
$s3 = new Aws\S3\S3Client([
'signature_version' => 'v4',
'version' => 'latest',
'region' => 'ap-southeast-1',
'credentials' => $credentials,
'http' => [
'verify' => '/home/ubuntu/cacert.pem'
],
'debug' => true
]);
See here for the full list of available parameters.
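One more note on the original goal of loading the image by URL: the effectiveUri from getObject is just the unsigned request URI, and an SSE-KMS object cannot be fetched anonymously. A common approach is to generate a pre-signed URL (signed with SigV4) and embed that instead; a minimal SDK v3 sketch reusing the bucket and key from the question (the expiry is arbitrary):
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => $bucketName,
    'Key'    => 'about_us.jpg',
]);
// Pre-signed GET request, valid for 10 minutes.
$request = $s3->createPresignedRequest($cmd, '+10 minutes');
$signedUrl = (string) $request->getUri();
echo '<img src="' . $signedUrl . '" />';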
I have also faced this issue with the aws:kms encryption key. If you want to use a KMS key, you have to create it in the IAM section of the AWS Console. I would recommend AES256 server-side encryption instead; with it, S3 automatically encrypts your data on put and decrypts it on get. Please go through the link below:
S3 Server Side encryption with AES256
My solution is to change the line 'ServerSideEncryption' => 'aws:kms' to 'ServerSideEncryption' => 'AES256':
try {
$result = $this->Amazon->S3->putObject(array(
'Bucket' => 'mytest.sample',
'ACL' => 'authenticated-read',
'Key' => $newfilename,
'ServerSideEncryption' => 'AES256',
'SourceFile' => $filepath,
'ContentType' => mime_content_type($filepath),
'debug' => [
'logfn' => function ($msg) {
echo $msg . "\n";
},
'stream_size' => 0,
'scrub_auth' => true,
'http' => true,
],
));
} catch (S3Exception $e) {
echo $e->getMessage() . "\n";
}
Please also update your bucket policy with the JSON below; it will prevent objects from being uploaded without AES256 encryption:
{
"Sid": "DenyUnEncryptedObjectUploads",
"Effect": "Deny",
"Principal": "*",
"Action": "s3:PutObject",
"Resource": "arn:aws:s3:::yourbucketname/*",
"Condition": {
"StringNotEquals": {
"s3:x-amz-server-side-encryption": "AES256"
}
}
}
