Paginating List Cognito Identities in AWS PHP SDK v3

How do I fetch all records when the result is paginated, using the AWS PHP SDK v3? I have the following code:
require_once 'vendor/autoload.php';

$cognitoIdentityClient = new Aws\CognitoIdentity\CognitoIdentityClient([
    'region' => 'eu-west-1',
    'version' => '2014-06-30',
    'credentials' => [
        'key' => '**************',
        'secret' => '***************',
    ],
]);

$identities = $cognitoIdentityClient->getPaginator('ListIdentities', [
    'IdentityPoolId' => 'eu-west-1:****************************',
]);
which looks like it should work, but produces the error:
Fatal error: Uncaught UnexpectedValueException: There is no ListIdentities paginator defined for the cognito-identity service. in /path/to/vendor/aws/aws-sdk-php/src/Api/Service.php:363
Stack trace:
#0 /path/to/vendor/aws/aws-sdk-php/src/AwsClientTrait.php(23): Aws\Api\Service->getPaginatorConfig('ListIdentities')
#1 /path/to/report.php(24): Aws\AwsClient->getIterator('ListIdentities', Array)
#2 {main}
thrown in /path/to/vendor/aws/aws-sdk-php/src/Api/Service.php on line 363
The getPaginator method exists, but the file data/cognito-identity/2014-06-30/paginators-1.json.php is blank, so no paginators are defined for this service. I can see NextToken in the response, but I don't understand the pattern for seamlessly loading more results (a do { ... } while (...) loop?).

I solved it like this:
$identities = [];

$i = $cognitoIdentityClient->listIdentities([
    'IdentityPoolId' => IDENTITYPOOLID,
    'MaxResults' => 60,
]);
$identities = array_merge($identities, $i->get('Identities'));

while ($nextToken = $i->get('NextToken')) {
    $i = $cognitoIdentityClient->listIdentities([
        'IdentityPoolId' => IDENTITYPOOLID,
        'MaxResults' => 60,
        'NextToken' => $nextToken,
    ]);
    $identities = array_merge($identities, $i->get('Identities'));
}
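If you want the same loop to read more like the SDK's built-in iterators, you can wrap the manual NextToken handling in a generator. This is only a sketch: fetchAllIdentities is a helper name introduced here, not an SDK method, and it assumes the same client, pool ID and page size as above.

// Generator that hides the NextToken handling (hypothetical helper, not part of the SDK).
function fetchAllIdentities(Aws\CognitoIdentity\CognitoIdentityClient $client, $poolId)
{
    $nextToken = null;

    do {
        $params = [
            'IdentityPoolId' => $poolId,
            'MaxResults' => 60,
        ];
        if ($nextToken !== null) {
            $params['NextToken'] = $nextToken;
        }

        $result = $client->listIdentities($params);
        foreach ($result->get('Identities') as $identity) {
            yield $identity;
        }

        $nextToken = $result->get('NextToken');
    } while ($nextToken);
}

foreach (fetchAllIdentities($cognitoIdentityClient, 'eu-west-1:****************************') as $identity) {
    // Process one identity at a time without holding the full list in memory.
}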

Related

I am not able to get a signed URL in CloudFront and am getting a fatal error; the code I am trying is below.

<?php
require '../aws-autoloader.php';

use Aws\CloudFront\CloudFrontClient;
use Aws\Exception\AwsException;

// Create a CloudFront client
$client = new Aws\CloudFront\CloudFrontClient([
    'profile' => 'default',
    'version' => 'latest',
    'region' => 'us-east-1',
]);

// Set up parameter values for the resource
$resourceKey = 'https://example.cloudfront.net/b20cbfe5-a8df-47a5-94c4-aeadea20759f/dash/videoplayback.mpd';
$expires = time() + 300;

// Create a signed URL for the resource using the canned policy
$signedUrlCannedPolicy = $client->getSignedUrl([
    'url' => $resourceKey,
    'expires' => $expires,
    'private_key' => 'pk.pem',
    'key_pair_id' => 'keyid',
]);
I am getting this error:
Fatal error: Uncaught InvalidArgumentException: error:0906D06C:PEM routines:PEM_read_bio:no start line in C:\xampp\htdocs\aws\Aws\CloudFront\Signer.php:40 Stack trace: #0 C:\xampp\htdocs\aws\Aws\CloudFront\UrlSigner.php(24): Aws\CloudFront\Signer->__construct('APKAJYH2L6BGHLW...', 'pk-APKAJYH2L6BG...') #1 C:\xampp\htdocs\aws\Aws\CloudFront\CloudFrontClient.php(138): Aws\CloudFront\UrlSigner->__construct('APKAJYH2L6BGHLW...', 'pk-APKAJYH2L6BG...') #2 C:\xampp\htdocs\aws\app\stream.php(26): Aws\CloudFront\CloudFrontClient->getSignedUrl(Array) #3 {main} thrown in C:\xampp\htdocs\aws\Aws\CloudFront\Signer.php on line 40
I have resolved this issue. The problem was that you need to give an absolute path to the private key, like this:
'private_key' => $_SERVER['DOCUMENT_ROOT'] . '/' . 'pk.pem',
Let me know whether it works for you.
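As a variation on the same fix, you can build the absolute path from the script's own directory instead of the document root. This is just a sketch assuming pk.pem sits next to the script:

// Same getSignedUrl call as above, but with an absolute path built from this script's directory.
$signedUrlCannedPolicy = $client->getSignedUrl([
    'url' => $resourceKey,
    'expires' => time() + 300,
    'private_key' => __DIR__ . '/pk.pem', // assumes the key file lives next to this script
    'key_pair_id' => 'keyid',
]);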

AWS S3Client resolves the wrong url when using getIterator with an IP on PHP

I am unable to use the getIterator function of S3Client because of how it rewrites the URL.
Instead of requesting http://192.168.120.70/bucket it returns this:
Could not resolve host: bucket.192.168.120.70
I'm sure there is something simple I'm overlooking.
<?php
require '/Applications/MAMP/htdocs/lab/aws/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$bucketName = 'bucket';
$IAM_KEY = 'MY-KEY';
$IAM_SECRET = 'MY-SECRET';

// Connect to AWS
try {
    $s3 = S3Client::factory(
        array(
            'credentials' => array(
                'key' => $IAM_KEY,
                'secret' => $IAM_SECRET
            ),
            'version' => 'latest',
            'region' => 'eu-west-1',
            'endpoint' => 'http://192.168.120.70/',
            'profile' => 'MY-PROFILE'
        )
    );
} catch (Exception $e) {
    die("Error: " . $e->getMessage());
}

$buckets = $s3->listBuckets();
foreach ($buckets['Buckets'] as $bucket) {
    echo $bucket['Name'] . "\n";
}
// returns -> bucket

$obj = $s3->getIterator('ListObjects', array('Bucket' => 'bucket'));
foreach ($obj as $object) {
    var_dump($object);
}
// Error -> Could not resolve host: bucket.192.168.120.70
?>
The full error :
Fatal error: Uncaught exception 'Aws\S3\Exception\S3Exception' with message 'Error executing
"ListObjects" on "http://bucket.192.168.120.70/?encoding-type=url";
AWS HTTP error: cURL error 6: Could not resolve host: bucket.192.168.120.70 (see
https://curl.haxx.se/libcurl/c/libcurl-errors.html)' GuzzleHttp\Exception\ConnectException: cURL
error 6: Could not resolve host: bucket.192.168.120.70
(see https://curl.haxx.se/libcurl/c/libcurl-errors.html) in
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php:200 Stack trace: #0
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php(155):
GuzzleHttp\Handler\CurlFactory::createRejection(Object(GuzzleHttp\Handler\EasyHandle), Array) #1
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Handler/CurlFactory.php(105):
GuzzleHttp\Handler\CurlFactory::finishError(Object(GuzzleHttp\Handler\CurlMultiHandler),
Object(GuzzleHttp\Handler\EasyHandle), Object(GuzzleHttp\Handler\CurlFactory)) #2
/Applications/MAMP/htdocs/lab/aws/GuzzleHttp/Han in
/Applications/MAMP/htdocs/lab/aws/Aws/WrappedHttpHandler.php on line 195
This is all you would need to accomplish your goal of listing buckets. The endpoint option is used for specific services and is not needed in this scenario:
$client = S3Client::factory(
    array(
        'credentials' => array(
            'key' => $IAM_KEY,
            'secret' => $IAM_SECRET,
        ),
        'region' => 'eu-west-1',
        'version' => 'latest'
    )
);
It seems it was an issue in my S3Client creation; after changing it as below, it worked. The client defaults to virtual-hosted-style URLs, which prepend the bucket name to the host (hence bucket.192.168.120.70), so a plain IP endpoint needs 'use_path_style_endpoint' => true to keep the bucket in the path.
Strangely, with my previous code it was only the first bucket that wasn't working.
// Connect to AWS
try {
    // You may need to change the region. It will say in the URL when the bucket is open
    // and on creation.
    $s3 = S3Client::factory(
        array(
            'version' => 'latest',
            'region' => 'Standard',
            'endpoint' => 'http://192.167.120.60/', // Needed in my case
            'use_path_style_endpoint' => true,      // Needed in my case
            'credentials' => array(
                'key' => $IAM_KEY,
                'secret' => $IAM_SECRET
            )
        )
    );
} catch (Exception $e) {
    // We use a die, so if this fails it stops here. Typically this is a REST call, so this would
    // return a JSON object.
    die("Error: " . $e->getMessage());
}
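To make the fix concrete, here is a minimal sketch of the same configuration with the v3-style constructor, including the getIterator call that failed originally. The IP endpoint and credentials are placeholders taken from the question:

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'eu-west-1',
    'endpoint' => 'http://192.168.120.70/', // custom endpoint (placeholder IP)
    'use_path_style_endpoint' => true,      // requests http://192.168.120.70/bucket/... instead of bucket.192.168.120.70
    'credentials' => [
        'key' => $IAM_KEY,
        'secret' => $IAM_SECRET,
    ],
]);

// Iterate over every object in the bucket, page by page.
foreach ($s3->getIterator('ListObjects', ['Bucket' => 'bucket']) as $object) {
    echo $object['Key'] . "\n";
}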

Fatal error: Uncaught Elasticsearch\Common\Exceptions\BadRequest400Exception

So I am using Elasticsearch.
I have this code:
<?php
error_reporting(E_ALL);
ini_set('display_errors', 1);

require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$hosts = [
    'http://localhost:80', // SSL to localhost
];

$clientBuilder = ClientBuilder::create(); // Instantiate a new ClientBuilder
$clientBuilder->setHosts($hosts);         // Set the hosts
$client = $clientBuilder->build();

$params = [
    'index' => 'my_index',
    'type' => 'my_type',
    'id' => 'my_id',
    'body' => ['testField' => 'abc']
];

$response = $client->index($params);
print_r($response);
I get this error:
Fatal error: Uncaught Elasticsearch\Common\Exceptions\BadRequest400Exception: 405 Method Not Allowed Method Not Allowed The requested method PUT is not allowed for the URL /my_index/my_type/my_id. in C:\Bitnami\wampstack-7.0.0RC7-\apache2\htdocs\vendor\elasticsearch\elasticsearch\src\Elasticsearch\Connections\Connection.php:615 Stack trace: #0 C:\Bitnami\wampstack-7.0.0RC7-\apache2\htdocs\vendor\elasticsearch\elasticsearch\src\Elasticsearch\Connections\Connection.php(279): Elasticsearch\Connections\Connection->process4xxError(Array, Array, Array) #1 C:\Bitnami\wampstack-7.0.0RC7-\apache2\htdocs\vendor\react\promise\src\FulfilledPromise.php(25): Elasticsearch\Connections\Connection->Elasticsearch\Connections{closure}(Array) #2 C:\Bitnami\wampstack-7.0.0RC7-\apache2\htdocs\vendor\guzzlehttp\ringphp\src\Future\CompletedFutureValue.php(55): React\Promise\FulfilledPromise->then(Object(Closure), NULL, NU in C:\Bitnami\wampstack-7.0.0RC7-\apache2\htdocs\vendor\elasticsearch\elasticsearch\src\Elasticsearch\Connections\Connection.php on line 615
I GOT IT!
I just had to change
$hosts = [
    'http://localhost:80', // SSL to localhost
];
To
$hosts = [
    'http://localhost:80' // SSL to localhost
];
(remove the comma)
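For what it's worth, a trailing comma inside a PHP array is legal syntax, so if the error comes back it is worth double-checking the host: a 405 "Method Not Allowed" for PUT usually means the request reached a regular web server rather than Elasticsearch, which listens on port 9200 by default. A minimal sketch assuming a default local install:

use Elasticsearch\ClientBuilder;

// Point the client at Elasticsearch's default HTTP port instead of the web server on port 80.
$client = ClientBuilder::create()
    ->setHosts(['http://localhost:9200'])
    ->build();

$response = $client->index([
    'index' => 'my_index',
    'type' => 'my_type',
    'id' => 'my_id',
    'body' => ['testField' => 'abc'],
]);
print_r($response);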

S3Client PHP SDK: Object of class <class> could not be converted to string

I get the following errors when I try to use the AWS PHP SDK:
PHP Warning: Illegal string offset 'client.backoff' in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 172
PHP Catchable fatal error: Object of class Guzzle\Plugin\Backoff\BackoffPlugin could not be converted to string in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 172
PHP Warning: Illegal string offset 'signature' in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 175
PHP Catchable fatal error: Object of class Aws\S3\S3Signature could not be converted to string in C:\xampp\htdocs\aws_test_local\vendor\aws\aws-sdk-php\src\Aws\S3\S3Client.php on line 175
They originate from the following code inside the S3Client.php file, which is part of the AWS SDK.
public static function factory($config = array())
{
    $exceptionParser = new S3ExceptionParser();

    // Configure the custom exponential backoff plugin for retrying S3 specific errors
    if (!isset($config[Options::BACKOFF])) {
        $config[Options::BACKOFF] = static::createBackoffPlugin($exceptionParser);
    }

    $config[Options::SIGNATURE] = $signature = static::createSignature($config);
    ...
The Options class is Aws\Common\Enum\ClientOptions. If you look at it, it defines a lot of constants like this:
const SIGNATURE = 'signature';
const BACKOFF = 'client.backoff';
I call the factory function in the following way:
$s3 = S3Client::factory(_PS_ROOT_DIR_.'/override/aws/aws-config.php');
My aws-config.php file looks like this:
<?php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'XXXXXXXXXXX',
                'secret' => 'XXXXXXXXXXX',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Any ideas? I installed the PHP SDK with Composer, so I'd expect any dependencies to be installed.
The argument to S3Client::factory() is supposed to be an array. You're giving it a filename that contains PHP code to return the array, but S3Client doesn't run the file. Try changing the file to:
<?php
$s3options = array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'XXXXXXXXXXX',
                'secret' => 'XXXXXXXXXXX',
                'region' => 'eu-west-1'
            )
        )
    )
);
?>
Then your main program can do:
require(_PS_ROOT_DIR_.'/override/aws/aws-config.php');
$s3 = S3Client::factory($s3options);
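Alternatively, since a require statement evaluates to whatever the included file returns, you could leave aws-config.php exactly as it was (ending with the return) and capture that value directly; a small variation on the same idea:

// require returns the array that aws-config.php returns, so no extra variable is needed in the file.
$s3options = require(_PS_ROOT_DIR_ . '/override/aws/aws-config.php');
$s3 = S3Client::factory($s3options);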

AWS dynamodb PHP 'Command was not found matching Update_table'

I am working on a web application, and since a lot of read and write activity is happening in the Amazon DynamoDB table, I am facing a ProvisionedThroughput error. Thinking of using UpdateTable, I used the code below but am facing an error. This error is not mentioned in the error handling chart of DynamoDB.
<?php
use Aws\DynamoDb\DynamoDbClient;
use Aws\Common\Enum\Region; // needed for the Region::US_WEST_1 constant below

$dynamoDB = DynamoDbClient::factory(array(
    'key' => '',
    'secret' => '',
    'region' => Region::US_WEST_1
));
####################################################################
# Updating the table

// $dynamodb = new AmazonDynamoDB();

echo PHP_EOL . PHP_EOL;
echo "# Updating the \"${dynamo_db_table}\" table..." . PHP_EOL;

$up = $dynamoDB->Update_table(array(
    'TableName' => $dynamo_db_table,
    'ProvisionedThroughput' => array(
        'ReadCapacityUnits' => 39,
        'WriteCapacityUnits' => 37
    )
));

$table_status = $dynamoDB->describe_table(array(
    'TableName' => $dynamo_db_table
));

// Check for success...
if ($table_status->isOK())
{
    print_r($table_status->body->Table->ProvisionedThroughput->to_array()->getArrayCopy());
}
else
{
    print_r($table_status);
}

$count = 0;
do {
    sleep(5);
    $count += 5;

    $response = $dynamoDB->describe_table(array(
        'TableName' => $table_name
    ));
}
while ((string) $response->body->Table->TableStatus !== 'ACTIVE');

echo "The table \"${table_name}\" has been updated (slept ${count} seconds)." . PHP_EOL;
?>
I am facing the error below:
# Updating the "tablename" table...
Fatal error: Uncaught exception 'Guzzle\Common\Exception\InvalidArgumentException' with message 'Command was not found matching Update_table' in /home/phretscl/public_html/RETS/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php:117 Stack trace: #0 /home/phretscl/public_html/RETS/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php(94): Guzzle\Service\Client->getCommand('Update_table', Array) #1 /home/phretscl/public_html/RETS/vendor/aws/aws-sdk-php/src/Aws/Common/Client/AbstractClient.php(103): Guzzle\Service\Client->__call('Update_table', Array) #2 /home/phretscl/public_html/RETS/file.php(97): Aws\Common\Client\AbstractClient->__call('Update_table', Array) #3 /home/phretscl/public_html/RETS/file.php(97): Aws\DynamoDb\DynamoDbClient->Update_table(Array) #4 {main} thrown in /home/phretscl/public_html/RETS/vendor/guzzle/guzzle/src/Guzzle/Service/Client.php on line 117
The code you are using is not correct. From the error message, I can tell you are using version 2.x of the AWS SDK for PHP, but the code you are using from the ############ line down looks like it is meant for version 1.x of the AWS SDK for PHP. The error is being thrown because you are not calling the DynamoDbClient::updateTable() method correctly.
You should check out the AWS SDK for PHP User Guide, particularly the page about DynamoDB, which has a code sample for UpdateTable.
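For reference, here is a minimal sketch of what the intended SDK v2 call looks like, reusing the question's $dynamo_db_table variable; TableExists is the standard v2 DynamoDB waiter, which polls DescribeTable until the table reports ACTIVE:

// camelCase method names, as used by SDK v2.
$dynamoDB->updateTable(array(
    'TableName' => $dynamo_db_table,
    'ProvisionedThroughput' => array(
        'ReadCapacityUnits' => 39,
        'WriteCapacityUnits' => 37
    )
));

// Poll until the table has finished updating and is ACTIVE again.
$dynamoDB->waitUntil('TableExists', array('TableName' => $dynamo_db_table));

$tableStatus = $dynamoDB->describeTable(array('TableName' => $dynamo_db_table));
print_r($tableStatus['Table']['ProvisionedThroughput']);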
EDIT with regards to the comments: If this is a long-running process, you should use the Aws\Common\Aws way of instantiating the client, since it retains and reuses the client objects it creates. Replace your DynamoDbClient::factory(...) code with this:
use Aws\Common\Aws;

$aws = Aws::factory(array(
    'key' => '...',
    'secret' => '...',
    'region' => Region::US_WEST_1
));

$dynamoDB = $aws->get('dynamodb');
You should provide the credentials like this. Also, the region should be given as a plain string, e.g. 'region' => 'us-east-1':
$DDBClient = DynamoDbClient::factory([
    'region' => 'us-east-1',
    'version' => 'latest',
    'credentials' => [
        'key' => 'XXXXXXXXXX',
        'secret' => 'XXXXXXXXXXXXX'
    ],
    'scheme' => 'http' // Use this if you don't have HTTPS
    //, 'debug' => true
]);
