I'm using the Amazon SDK v2 with the AWS factory for DynamoDB. I have a simple putItem operation, but I have no idea how to make sure putItem was successful or not, because putItem returns a Model which doesn't contain any information about the status of the operation. Any idea?
Here is my code:
class DynamoLogger
{
    protected $client;
    protected $tableName;

    public function __construct(ServiceBuilder $builder, $tableName)
    {
        $this->client = $builder->get('dynamodb');
        $this->tableName = $tableName;
    }

    public function log(Request $request)
    {
        $model = $this->client->putItem(array(
            'TableName' => $this->tableName,
            'Item' => array(
                'cc_id' => array(
                    'S' => $request->get('cc_id')
                ),
                'date' => array(
                    'S' => date('Y-m-d H:i:s') . substr((string)microtime(), 1, 8)
                ),
                'tt_id' => array(
                    'N' => $request->get('tt_id')
                ),
                'action_name' => array(
                    'S' => $request->get('name')
                ),
                'action_value' => array(
                    'S' => $request->get('value')
                ),
                'gg_nn' => array(
                    'S' => $request->get('gg_nn')
                ),
                'ffr_id' => array(
                    'N' => $request->get('ffr_id')
                )
            ),
            'ReturnValues' => 'ALL_OLD'
        ));
        return $model;
    }
}
With the AWS SDK for PHP 2.x, you should assume that any operation that returns without throwing an exception was successful. In the case of DynamoDB, an Aws\DynamoDb\Exception\DynamoDbException (or subclass) will be thrown if there was an error. Also, in the case of DynamoDB, the service will not respond until your item has been written to at least 2 locations, ensuring the integrity of your data.
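For example, a minimal sketch of adding that check to the log() method from the question (same classes as above; error_log is just a stand-in for whatever logging you actually use):

public function log(Request $request)
{
    try {
        $model = $this->client->putItem(array(/* ...same params as above... */));
        // If we get here, the item was written successfully
        return $model;
    } catch (\Aws\DynamoDb\Exception\DynamoDbException $e) {
        // The write failed; log or rethrow as appropriate
        error_log($e->getMessage());
        return null;
    }
}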
Additionally, with the AWS SDK for PHP 2.x, you can use the long-form command syntax in order to have access to the Guzzle Request and Response objects, if you are interested in introspecting them. Here is an example:
$command = $client->getCommand('PutItem', array(/*...params...*/));
$model = $command->getResult(); // Actually executes the request
$request = $command->getRequest();
$response = $command->getResponse();
var_dump($response->isSuccessful());
Please also see the Commands and Response Models sections of the AWS SDK for PHP User Guide.
If I put $author = 'Steven King'; it works without issue; however, it does not work with a POST variable.
To be clear, if I hard-code the author in the JSON, it will in fact post the message to the SQS queue, which is the expected result. However, if I pass the string from the POST variable, e.g. $author = $_POST['author'];, the message is never delivered.
$message = array(
    // Associative array of custom 'String' key names
    'Author' => array(
        'StringValue' => $author,
        'DataType' => 'String'
    ),
);
I would be grateful for any thoughts or help on this.
<?php
$author = $_POST["author"];

require 'vendor/autoload.php';

use Aws\Common\Aws;
use Aws\Sqs\SqsClient;
use Aws\Exception\AwsException;

// Get the client from the builder by namespace
$client = SqsClient::factory(array(
    'profile' => 'default',
    'region' => 'us-west-2',
    'version' => '2012-11-05'
));

$queueUrl = 'https://sqs.us-west-2.amazonaws.com/blahblahblah';

$message = array(
    // Associative array of custom 'String' key names
    'Author' => array(
        'StringValue' => $author,
        'DataType' => 'String'
    ),
);
var_dump($message);

$result = $client->sendMessage(array(
    'QueueUrl' => $queueUrl,
    'MessageBody' => 'An awesome message!',
    'MessageAttributes' => $message,
));
So the issue was caused by the credential provider, which is why it worked in the CLI (e.g. php posts.php) but not in the browser: the CLI has the correct environment and permissions.
Note: the AWS SDK example does not include a credential provider, only a reference to 'profile' => 'default', which you would expect to pick up your default credentials; however, that is not the case here.
Thank you, Apache logs!
The fix was to set the right permissions on the ~/.aws/credentials file and to ensure that your $HOME path is set correctly.
On to the next piece. Thanks, community.
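For anyone who hits the same thing: rather than relying on $HOME being set for the web server user, you can point the SDK at the credentials file explicitly. A minimal sketch, assuming the v3 SDK's Aws\Credentials\CredentialProvider (the file path here is just an example):

use Aws\Credentials\CredentialProvider;
use Aws\Sqs\SqsClient;

// Read the 'default' profile from an explicit path instead of depending on $HOME
$provider = CredentialProvider::ini('default', '/var/www/.aws/credentials');

$client = new SqsClient(array(
    'region' => 'us-west-2',
    'version' => '2012-11-05',
    'credentials' => $provider,
));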
We began seeing these DocuSign exceptions on 09/24/2019:
DocuSign \ eSign \ ApiException (401)
[401] Error connecting to the API (https://NA3.docusign.net/restapi/v2/login_information)
None of the code surrounding our DocuSign logic has been touched for almost six months. So I'm at a loss as to why this exception is being thrown.
We're using the following packages (relating to this):
laravel/framework v5.8.35
docusign/esign-client 3.0.1
tucker-eric/docusign-rest-client 1.0.0
tucker-eric/laravel-docusign 0.1.1
I've tried updating the packages with Composer, thinking they might have released fixes, but it didn't change anything other than throwing USER_AUTHENTICATION_FAILED instead of the exception message above.
As I said, no code has been touched, I have very little experience with the DocuSign API, and, making matters worse, this is a former developer's code...
I am able to hit the endpoint and authenticate with our credentials using Postman, and it seems to work fine. So again, I'm not sure why this just started happening.
The code from our controller:
$parcel = request('parcel_id');
$subdivision = $user->subdivision_id;
$subEmail = Subdivision::where('id', $user->subdivision_id)->pluck('email')->first();

$move = Move::create([
    'full_name' => request('full_name'),
    'email' => request('email'),
    'phone_number' => request('phone_number'),
    'parcel_id' => $parcel,
    'direction' => request('direction'),
    'action_date' => request('action_date'),
    'user_id' => auth()->id(),
    'subdivision_id' => $subdivision
]);

$residentTabs = array(
    array(
        'tabLabel' => env('MOVE_IN_ADDRESS_FIELD'),
        'value' => $move->parcel->MailingAddress
    ),
    array(
        'tabLabel' => env('MOVE_IN_DATE_RESIDENT_FIELD'),
        'value' => $move->action_date->format('m/d/Y')
    ),
    array(
        'tabLabel' => env('MOVE_IN_EMAIL_FIELD'),
        'value' => $move->email
    ),
    array(
        'tabLabel' => env('MOVE_IN_PRIMARY_PHONE_FIELD'),
        'value' => $move->phone_number
    ),
    array(
        'tabLabel' => env('MOVE_IN_FULL_NAME_FIELD'),
        'value' => $move->full_name
    )
);

$pmTabs = array(
    array(
        'tabLabel' => env('MOVE_IN_PM_ADDRESS_FIELD'),
        'value' => $move->parcel->MailingAddress
    ),
    array(
        'tabLabel' => env('MOVE_IN_PM_DATE_FIELD'),
        'value' => $move->action_date->format('m/d/Y')
    ),
);

$templateRoles = array(
    array(
        'email' => $move->email,
        'name' => $move->full_name,
        'roleName' => 'Resident',
        'tabs' => array(
            'textTabs' => $residentTabs
        )
    ),
    array(
        'email' => $subEmail,
        'name' => $user->name,
        'roleName' => 'Property Manager',
        'tabs' => array(
            'textTabs' => $pmTabs
        )
    )
);

$envelopeDefinition = array(
    'status' => 'sent',
    'templateId' => env("DOCUSIGN_TEMPLATE_ID"),
    'templateRoles' => $templateRoles
);

$contract = DocuSign::get('envelopes')->createEnvelope($envelopeDefinition);
The last line is where the exception is thrown, and the function throwing the exceptions is:
vendor/docusign/esign-client/src/ApiClient.php::callApi
We expect it to work as it has, throwing no exceptions and creating the envelope successfully.
However, we have been seeing USER_AUTHENTICATION_FAILED and general 401 exceptions.
Any help is appreciated!
Your token may have expired. I'm not sure how it was created or what authentication mechanism you are using, but you need to check where the token comes from and how the header in the REST API calls uses it. It may be that the token was hardcoded, or that a refresh token was being used to keep obtaining new tokens and that process broke.
If you're getting an Authentication failure while trying to hit the login_information endpoint, it's likely that your application is using Legacy Header authentication with an invalid password.
I'd recommend the following:
1. Try to log in to the web console at www.docusign.net, and perform a password reset if necessary.
2. Once you are able to log in, update the stored credentials in the application.
Note that 2FA or forced Single Sign-On will both block Legacy Header auth. If either is in place, it will need to be disabled, or you will need to switch to one of the Account Server auth workflows.
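If it does turn out to be Legacy Header authentication, the credentials travel in an X-DocuSign-Authentication header, so that is where the stored values need updating. A minimal sketch with docusign/esign-client (the env keys here are hypothetical; use whatever config your app actually reads):

use DocuSign\eSign\ApiClient;
use DocuSign\eSign\Configuration;

$config = new Configuration();
$config->setHost('https://na3.docusign.net/restapi');
// Legacy Header auth: username/password/integrator key ride in this JSON header
$config->addDefaultHeader('X-DocuSign-Authentication', json_encode(array(
    'Username' => env('DOCUSIGN_USERNAME'),           // hypothetical env keys
    'Password' => env('DOCUSIGN_PASSWORD'),
    'IntegratorKey' => env('DOCUSIGN_INTEGRATOR_KEY'),
)));
$apiClient = new ApiClient($config);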
I am working on a project where we will be creating both subdomains as well as domains in Route53. We are hoping that there is a way to do this programmatically. The SDK for PHP documentation seems a little light, but it appears that createHostedZone can be used to create a domain or subdomain record and that changeResourceRecordSets can be used to create the DNS records necessary. Does anyone have examples of how to actually accomplish this?
Yes, this is possible using the changeResourceRecordSets call, as you already indicated. But it is a bit clumsy since you have to structure it like a batch even if you're changing/creating only one record, and even creations are changes. Here is a full example, without a credentials method:
<?php
// Include the SDK using the Composer autoloader
require 'vendor/autoload.php';

use Aws\Route53\Route53Client;
use Aws\Common\Credentials\Credentials;

// $credentials is assumed to be a Credentials instance you have built elsewhere
$client = Route53Client::factory(array(
    'credentials' => $credentials
));
$result = $client->changeResourceRecordSets(array(
    // HostedZoneId is required
    'HostedZoneId' => 'Z2ABCD1234EFGH',
    // ChangeBatch is required
    'ChangeBatch' => array(
        'Comment' => 'string',
        // Changes is required
        'Changes' => array(
            array(
                // Action is required
                'Action' => 'CREATE',
                // ResourceRecordSet is required
                'ResourceRecordSet' => array(
                    // Name is required
                    'Name' => 'myserver.mydomain.com.',
                    // Type is required
                    'Type' => 'A',
                    'TTL' => 600,
                    'ResourceRecords' => array(
                        array(
                            // Value is required
                            'Value' => '12.34.56.78',
                        ),
                    ),
                ),
            ),
        ),
    ),
));
The documentation of this method can be found here. You'll want to take very careful note of the required fields as well as the possible values for others. For instance, the name field must be a FQDN ending with a dot (.).
Also worth noting: by default the only thing you get back from this call is a ChangeInfo structure containing a change Id and a Status of PENDING or INSYNC; there is no per-record confirmation. (Though it definitely gives errors back if something is wrong.) So if you want your code to be bulletproof, you can poll the returned change Id with getChange, and/or wait a few seconds and then run a check that the new/changed record indeed exists.
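For example, a minimal polling sketch against the returned change Id (same SDK v2 client as above; adjust the sleep interval to taste):

// Poll until Route 53 reports the change as propagated
$changeId = $result->getPath('ChangeInfo/Id');
do {
    sleep(5);
    $change = $client->getChange(array('Id' => $changeId));
} while ($change->getPath('ChangeInfo/Status') === 'PENDING');
// Status is now 'INSYNC'; the record set change has propagated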
Hope this helps!
Yes, I did this using the changeResourceRecordSets method.
<?php
require 'vendor/autoload.php';

use Aws\Route53\Route53Client;
use Aws\Exception\CredentialsException;
use Aws\Route53\Exception\Route53Exception;

// Build the connection
try {
    $client = Route53Client::factory(array(
        'region' => 'string', // eg. us-east-1
        'version' => 'date',  // eg. latest or 2013-04-01
        'credentials' => [
            'key' => 'XXXXXXXXXXXXXXXXXXX', // eg. VSDFAJH6KXE7TXXXXXXXXXX
            'secret' => 'XXXXXXXXXXXXXXXXXXXXXXX', // eg. XYZrnl/ejPEKyiME4dff45Pds54dfgr5XXXXXX
        ]
    ));
} catch (Exception $e) {
    echo $e->getMessage();
}

/* Create sub domain */
try {
    $dns = 'yourdomainname.com';
    $HostedZoneId = 'XXXXXXXXXXXX'; // eg. A4Z9SD7DRE84I (about 13 characters)
    $name = 'test.yourdomainname.com.'; // the subdomain name you want to create
    $ip = 'XX.XXXX.XX.XXX'; // target value; note a CNAME should point to a hostname, use an A record for an IP
    $ttl = 300;
    $recordType = 'CNAME';
    $ResourceRecordsValue = array('Value' => $ip);
    $client->changeResourceRecordSets([
        'ChangeBatch' => [
            'Changes' => [
                [
                    'Action' => 'CREATE',
                    'ResourceRecordSet' => [
                        'Name' => $name,
                        'Type' => $recordType,
                        'TTL' => $ttl,
                        'ResourceRecords' => [
                            $ResourceRecordsValue
                        ]
                    ]
                ]
            ]
        ],
        'HostedZoneId' => $HostedZoneId
    ]);
} catch (Route53Exception $e) {
    // The original snippet was missing this catch block; a bare try is a PHP syntax error
    echo $e->getMessage();
}
If you get any error, check the server's error.log file. If the error comes from the SDK library, it may be that your PHP version is not supported.
If you run this code from your local machine you might get a "SignatureDoesNotMatch" error; in that case, make sure to run the code in the same (AWS) server environment.
All,
I am attempting to migrate roughly 6GB of Mongo data, comprising hundreds of collections, to DynamoDB. I have written some scripts using the AWS PHP SDK and am able to port over very small collections, but when I try ones with more than 20k documents (still a very small collection, all things considered) it either takes an outrageous amount of time or quietly fails.
Does anyone have some tips/tricks for taking data from Mongo (or any other NoSQL DB) and migrating it to Dynamo, or any other NoSQL DB. I feel like this should be relatively easy because the documents are extremely flat/simple.
Any thoughts/suggestions would be much appreciated!
Thanks!
header.php
<?php
require './aws-autoloader.php';
require './MongoGet.php';

set_time_limit(0);

use \Aws\DynamoDb\DynamoDbClient;

$client = DynamoDbClient::factory(array(
    'key' => 'MY_KEY',
    'secret' => 'MY_SECRET',
    'region' => 'MY_REGION',
    'base_url' => 'http://localhost:8000'
));

$collection = "AccumulatorGasPressure4093_raw";

function nEcho($str) {
    echo "{$str}<br>\n";
}

echo "<pre>";
test-store.php
<?php
include('test-header.php');

nEcho("Creating table(s)...");

// create test table
$client->createTable(array(
    'TableName' => $collection,
    'AttributeDefinitions' => array(
        array(
            'AttributeName' => 'id',
            'AttributeType' => 'N'
        ),
        array(
            'AttributeName' => 'count',
            'AttributeType' => 'N'
        )
    ),
    'KeySchema' => array(
        array(
            'AttributeName' => 'id',
            'KeyType' => 'HASH'
        ),
        array(
            'AttributeName' => 'count',
            'KeyType' => 'RANGE' // note: the valid value is 'RANGE', not 'RANGED'
        )
    ),
    'ProvisionedThroughput' => array(
        'ReadCapacityUnits' => 10,
        'WriteCapacityUnits' => 20
    )
));

// Block until the table is ACTIVE before writing to it
$client->waitUntil('TableExists', array('TableName' => $collection));

$result = $client->describeTable(array(
    'TableName' => $collection
));

nEcho("Done creating table...");
nEcho("Getting data from Mongo...");

// instantiate class and get data
$mGet = new MongoGet();
$results = $mGet->getData($collection);

nEcho("Done retrieving Mongo data...");
nEcho("Inserting data...");

$i = 0;
foreach ($results as $result) {
    $insertResult = $client->putItem(array(
        'TableName' => $collection,
        'Item' => $client->formatAttributes(array(
            'id' => $i,
            'date' => $result['date'],
            'value' => $result['value'],
            'count' => $i
        )),
        'ReturnConsumedCapacity' => 'TOTAL'
    ));
    $i++;
}

nEcho("Done Inserting, script ending...");
I suspect that you are being throttled by DynamoDB, especially if your tables' throughputs are low. The SDK retries the requests, up to 11 times per request, but eventually, the requests fail, which should throw an exception.
You should take a look at the WriteRequestBatch object. This object is basically a queue of items that get sent in batches, but any items that fail to transfer are re-queued automatically. It should provide a more robust solution for what you are doing.
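For reference, a minimal sketch of that pattern with the v2 SDK, adapted to the field names in the question (based on the SDK's documented WriteRequestBatch API; treat it as a starting point, not a drop-in):

use Aws\DynamoDb\Model\BatchRequest\WriteRequestBatch;
use Aws\DynamoDb\Model\BatchRequest\PutRequest;
use Aws\DynamoDb\Model\Item;

// Queue up puts; the batch flushes itself in chunks and re-queues failed items
$batch = WriteRequestBatch::factory($client);
$i = 0;
foreach ($results as $result) {
    $item = Item::fromArray(array(
        'id'    => $i,
        'date'  => $result['date'],
        'value' => $result['value'],
        'count' => $i,
    ));
    $batch->add(new PutRequest($item, $collection));
    $i++;
}
$batch->flush(); // send anything still queued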
Using the latest CakePHP build 1.3.6.
I'm writing a custom datasource for an external REST API. I've got all the read functionality working beautifully, but I'm struggling with Model::save and Model::create.
According to the documentation, the methods below must be implemented (notice that calculate is not among them). These are all implemented. However, I was getting a "Fatal error: Call to undefined method ApiSource::calculate()", so I implemented the ApiSource::calculate() method.
describe($model)
listSources()
At least one of:
create($model, $fields = array(), $values = array())
read($model, $queryData = array())
update($model, $fields = array(), $values = array())
delete($model, $id = null)
public function calculate(&$model, $func, $params = array())
{
    pr($model->data); // POST data
    pr($func);        // count
    pr($params);      // empty
    return '__' . $func; // returns '__count'
}
If I make a call from my model:
$this->save($this->data);
it calls calculate, but none of the other implemented methods. I would expect it to call either ApiSource::create() or ApiSource::update().
Any thoughts or suggestions?
Leo, you tipped me in the right direction. The answer was in the model that was using the custom datasource: that model MUST define its _schema.
class User extends AppModel
{
    public $name = 'User';
    public $useDbConfig = 'cvs';
    public $useTable = false;
    public $_schema = array(
        'firstName' => array(
            'type' => 'string',
            'length' => 30
        ),
        'lastName' => array(
            'type' => 'string',
            'length' => 30
        ),
        'email' => array(
            'type' => 'string',
            'length' => 50
        ),
        'password' => array(
            'type' => 'string',
            'length' => 20
        )
    );
    ...
}
I'm guessing that implementing a describe() method in the custom datasource would solve the problem too. In this case the schema needed to be predefined to authorize the saves and/or creation.
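For reference, a minimal sketch of that idea, assuming the $_schema convention from the model above (untested):

// In the ApiSource datasource
public function describe(&$model)
{
    // Hand the model's own schema back so Model::save() knows which fields exist
    return $model->_schema;
}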
From the API: http://api13.cakephp.org/class/dbo-source#method-DboSourcecalculate
"Returns an SQL calculation, i.e. COUNT() or MAX()"
A quick search in ~/cake finds 20 matches in 8 files. One of those is the definition in dbo_source.php
The other seven are:
dbo_source.test.php
code_coverage_manager.test.php
code_coverage_manager.php
dbo_db2.php
model.php
tree.php
containable.php
Without delving too deeply into this, I suspect your problem lies in Model::save.
You'll probably have to define a calculate method to suit the structure of your custom datasource, because Cake won't know how to do that for you.
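In a DboSource, the string that calculate() returns (e.g. 'COUNT(*)') is later consumed by read(), so a custom datasource has to close that loop itself. A minimal, hypothetical sketch following the '__count' convention from the question:

public function read(&$model, $queryData = array())
{
    // Model::save() issues a count query first; answer it without hitting the API
    if (isset($queryData['fields']) && $queryData['fields'] === '__count') {
        return array(array(array('count' => 1)));
    }
    // ...normal REST read logic goes here...
}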