DocuSign rate limit when receiving documents in one envelope - PHP

I am trying to switch to the DocuSign production API. When I submit the required 20 envelopes for review, they do not get approved. I received a log file that lists multiple GET requests, which violates the API rules: according to the documentation, only one GET request per envelope per 15 minutes is allowed. (https://developers.docusign.com/esign-rest-api/guides/resource-limits)
When I list an envelope's documents and then loop over them, I make multiple GET requests against the same envelope, and I suspect that is why I am hitting the rate limit.
In the example below, you can see that when I retrieve the envelope, I immediately loop over the documents inside it and fetch each one with the getDocument method, as described in the documentation. (https://developers.docusign.com/esign-rest-api/code-examples/get-an-envelope-document-list)
public function getEnvelopeDocument($envelopeId)
{
    $docs = [];
    try {
        // List the documents in the envelope, then fetch each one by its ID.
        $documents = $this->envelopeApi->listDocuments(config('docusign.id'), $envelopeId);
        foreach ($documents->getEnvelopeDocuments() as $document) {
            $docs[] = $this->envelopeApi->getDocument(config('docusign.id'), $document->getDocumentId(), $envelopeId);
        }
    } catch (ApiException $e) {
        dd("Error connecting Docusign : " . $e->getResponseBody()->errorCode . " " . $e->getResponseBody()->message);
    }
    return $docs;
}
Am I violating the API rate limits? If so, what would be the allowed way to retrieve the documents inside an envelope?

My interpretation of the code is that you are performing the following calls in sequence:
GET /envelopes/{envelopeId}/documents - ListDocuments
GET /envelopes/{envelopeId}/documents/1 - get document 1
GET /envelopes/{envelopeId}/documents/2 - get document 2
and so on.
If this is the case, you are not in violation of the API limits. If you were to make two calls to ListDocuments, or two calls for the same individual document, within 15 minutes, that would be a polling violation.
To confirm everything is acceptable, you might capture API logs to confirm you're hitting each unique endpoint only once. Info on API logs is available here: https://support.docusign.com/guides/ndse-user-guide-api-request-logging
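If you do need the same documents again within the 15-minute window (on a page refresh, for example), one way to stay clear of the polling limit is to cache what you already fetched instead of calling the API again. A minimal sketch, assuming the Laravel cache helpers implied by the config() call above (the cache key and TTL here are illustrative, not from the original post):
use Illuminate\Support\Facades\Cache;

public function getEnvelopeDocumentCached($envelopeId)
{
    // Re-use results for 15 minutes so repeated calls don't re-poll DocuSign.
    // The cache key format is an illustrative choice.
    return Cache::remember("docusign.envelope.{$envelopeId}.documents", now()->addMinutes(15), function () use ($envelopeId) {
        return $this->getEnvelopeDocument($envelopeId);
    });
}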

Related

DocuSign API: sending a signature request via e-mail with the Envelope ID

I am trying to do the following with the DocuSign API, but I am not getting very much wiser from their documentation.
The admin creates the envelope in DocuSign just as they normally would, but without sending it
They take the Envelope ID and enter it in our software
We show a button to the end user that sends out the Envelope signature request when clicked (based on the Envelope ID)
The closest I came to finding something like this was https://developers.docusign.com/docs/esign-rest-api/how-to/request-signature-email-remote/ but that doesn't seem to allow me to use an existing envelope.
The API reference doesn't seem to offer any help either (https://developers.docusign.com/docs/esign-rest-api/reference/envelopes/envelopes/)
What I have got so far is the following:
OAuth + generating the JWT + access token (works fine)
generating a Recipient View (which is not what I need and must be replaced with the right call)
$view_request = new \DocuSign\eSign\Model\RecipientViewRequest(['return_url' => $args['ds_return_url']]);
if ($args['starting_view'] == "envelope" && $args['envelope_id']) {
    $view_request->setEnvelopeId($args['envelope_id']);
}
# Call the API method
$config = new \DocuSign\eSign\Configuration();
$config->setHost($args['base_uri']);
$config->addDefaultHeader('Authorization', 'Bearer ' . $args['ds_access_token']);
$api_client = new \DocuSign\eSign\Client\ApiClient($config);
$envelope_api = new \DocuSign\eSign\Api\EnvelopesApi($api_client);
$results = $envelope_api->createRecipientView($args['account_id'], $view_request);
$view_url = $results['url'];
return $view_url;
Thanks!
Based on your description, it looks like the first step is to create the envelope as a draft: you create the envelope, fill out all the information, and then don't send it.
This will give you an envelope ID which you can store in your application.
Then, when the button you describe is clicked, you can update the status of the envelope to "sent" using this endpoint, which will send out the envelope.
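As a rough sketch (not part of the original answer), the SDK call could look like the following; the exact signature of the update method may vary between versions of the DocuSign PHP SDK, so treat this as illustrative:
# Flip the draft's status to "sent" to dispatch it to the recipients.
# Re-uses $envelope_api and $args from the snippet above.
$envelope_update = new \DocuSign\eSign\Model\Envelope(['status' => 'sent']);
$envelope_api->update($args['account_id'], $args['envelope_id'], $envelope_update);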
If you're looking for something more detailed, you can always reach out to us at DocuSign Developer Support and we can discuss it further.

GMail API - Getting certain message headers or fields

I have successfully connected GMail API via G Suite account and service account. I can get a message list and I can retrieve messages by IDs. I'm working with PHP.
What I'm having problems with is getting, for example, the FROM or TO headers, the SUBJECT, or the snippet field.
$optParam = array('format' => 'metadata', 'metadataHeaders'=>['subject','from'], 'fields'=>['snippet','labelIds']);
$fullMessage = $service->users_messages->get($user, $id, $optParam);
This will return the snippet, but not the subject or from or the labelIds.
If I use the GMail "Try this API" and use the id of the message and use "snippet" in the "fields" entry, I just get the snippet back as:
{
"snippet": "Short snippet of the message"
}
If I use:
$optParam = array('format' => 'metadata', 'metadataHeaders'=>['subject','from','to']);
I do get the 3 headers, but I also get a lot more information, including the labels and snippet - about 3K for each message.
I just can't seem to be able to specify a small subset of the data. All I need is to show messages as a list with the subject, date/time, from/to.
I don't care so much about the amount of data, but it takes on average about 3.5 seconds to retrieve the data for just 14 messages!
Is there a way to restrict this so I don't get all the "extra" data or speed the retrieval up somehow?
Sending the request involves specifying both the metadata headers and the names of the fields you want returned. To get the 'to', 'from', 'subject' and 'snippet', you can issue an HTTP GET request against https://www.googleapis.com/gmail/v1/users/me/messages/<MESSAGE_ID>?format=metadata&metadataHeaders=to&metadataHeaders=from&metadataHeaders=subject&fields=snippet%2C+payload%2Fheaders, which will also limit the headers you obtain.
In PHP you can use this:
$optParam = array('format' => 'metadata', 'metadataHeaders'=>['subject', 'from', 'to'], 'fields'=>'payload/headers,snippet');
Note that the fields parameter needs to be sent as a string and not an array.
Also be aware there is a known issue with the Gmail API where using the https://www.googleapis.com/auth/gmail.metadata scope will not return the snippet.
You’ll need to use https://www.googleapis.com/auth/gmail.readonly instead.
You can also make a batch of requests in one network call, which will help speed up overall execution time, as documented here.
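For illustration, a batch with the Google API PHP client might look like the following sketch; it assumes $service is an authorized Google_Service_Gmail instance and $messageIds holds IDs from a prior list call:
$client = $service->getClient();
$client->setUseBatch(true); // queue calls instead of executing them immediately
$batch = $service->createBatch();
$optParam = array(
    'format' => 'metadata',
    'metadataHeaders' => ['subject', 'from', 'to'],
    'fields' => 'payload/headers,snippet'
);
foreach ($messageIds as $i => $id) {
    // With batching on, get() returns a request object that we queue up.
    $batch->add($service->users_messages->get('me', $id, $optParam), 'msg' . $i);
}
$responses = $batch->execute(); // one HTTP round trip for all queued requests
$client->setUseBatch(false);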

Rate-limiting Guzzle Requests in Symfony

This follows on from a previous question of mine that, unfortunately, did not receive any answers, so I'm not holding my breath for a response, but I understand this can be a tricky issue to solve.
I am currently trying to implement rate limiting on outgoing requests to an external API to match the limit on their end. I have tried to integrate a token bucket library (https://github.com/bandwidth-throttle/token-bucket) into the class we are using to manage Guzzle requests for this particular API.
Initially, this seemed to be working as intended but we have now started seeing 429 responses from the API as it no longer seems to be correctly rate limiting the requests.
I have a feeling what is happening is that the number of tokens in the bucket is now being reset every time the API is called due to how Symfony handles services.
I am currently setting up the bucket location, rate, and starting amount in the service's constructor:
public function __construct()
{
    $storage = new FileStorage(__DIR__ . "/api.bucket");
    $rate = new Rate(50, Rate::MINUTE);
    $bucket = new TokenBucket(50, $rate, $storage);
    $this->consumer = new BlockingConsumer($bucket);
    $bucket->bootstrap(50);
}
I'm then attempting to consume a token before each request:
public function fetch(): array
{
    try {
        $this->consumer->consume(1);
        $response = $this->client->request(
            'GET', $this->buildQuery(), [
                'query' => array_merge($this->params, ['api_key' => $this->apiKey]),
                'headers' => ['Content-type' => 'application/json']
            ]
        );
    } catch (ServerException $e) {
        // Process Server Exception
    } catch (ClientException $e) {
        // Process Client Exception
    }
    return $this->checkResponse($response);
}
I can't see anything obvious in that code that would allow it to request more than 50 times per minute, unless the number of available tokens is being reset on each request.
This is being supplied to a set of repository services that handle converting the data from each endpoint into objects used within the system. Consumers use the appropriate repository to request the data needed to complete their process.
If the token count is being reset because the bootstrap call sits in the service's constructor, where within the Symfony framework should it be moved so that it still works for the consumers?
I would expect it to work, but try moving the ->bootstrap(50) call out of the path of every request. I'm not sure, but that could be the reason.
In any case, it's better to do that only once, as part of your deployment (every time you deploy a new version). This doesn't really have anything to do with Symfony, because the framework doesn't impose any restrictions on the deployment procedure; it depends on how you do your deployment.
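For example, a one-off script run at deploy time might look like this sketch, using the same bandwidth-throttle/token-bucket classes as the question (the file path is illustrative):
use bandwidthThrottle\tokenBucket\Rate;
use bandwidthThrottle\tokenBucket\TokenBucket;
use bandwidthThrottle\tokenBucket\storage\FileStorage;

// Run once per deployment: create and fill the shared bucket file.
// Application code then only consumes tokens and never bootstraps.
$storage = new FileStorage(__DIR__ . "/api.bucket");
$rate    = new Rate(50, Rate::MINUTE);
$bucket  = new TokenBucket(50, $rate, $storage);
$bucket->bootstrap(50);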
P.S. Have you considered just handling 429 errors from the server? IMO you can simply wait (that's what BlockingConsumer does internally) when you receive a 429 error. It's simpler and doesn't require an additional layer in your system.
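That approach could look something like the following sketch (illustrative only; it assumes fetch() is changed to rethrow the ClientException on a 429 instead of swallowing it):
use GuzzleHttp\Exception\ClientException;

public function fetchWithRetry(int $maxRetries = 3): array
{
    for ($attempt = 0; ; $attempt++) {
        try {
            return $this->fetch();
        } catch (ClientException $e) {
            $response = $e->getResponse();
            // Only retry on 429, and give up after $maxRetries attempts.
            if ($response === null || $response->getStatusCode() !== 429 || $attempt >= $maxRetries) {
                throw $e;
            }
            // Honour the server's Retry-After hint if present; otherwise wait 60s.
            sleep((int) ($response->getHeaderLine('Retry-After') ?: 60));
        }
    }
}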
BTW, have you considered nginx's ngx_http_limit_req_module as an alternative solution? It usually ships with nginx by default, so there is nothing extra to install; only a small configuration is required.
You can place an nginx proxy between your code and the target web service and enable limits on it. Then your code handles 429s as usual, but the requests are throttled by your local nginx proxy rather than by the external web service, so the final destination only ever receives a limited number of requests.
I found a trick using the Guzzle bundle for Symfony.
I had to improve a sequential program that sends GET requests to a Google API; in the code example it is a PageSpeed URL.
To respect the rate limit, there is an option to delay the requests before they are sent asynchronously.
The PageSpeed rate limit is 200 requests per minute.
A quick calculation gives 60/200 = 0.3 s per request.
Here is the code I tested on 300 URLs, with the fantastic result of no errors at all, except when the URL passed as a parameter in the GET request returns a 400 HTTP error (Bad Request).
I put in a delay of 0.4 s, and the average result time was less than 0.2 s, whereas it took more than a minute with the sequential program.
use GuzzleHttp;
use GuzzleHttp\Client;
use GuzzleHttp\Promise\EachPromise;
use GuzzleHttp\Exception\ClientException;

// ... Now inside class code ... //
$client = new GuzzleHttp\Client();
$promises = [];
foreach ($requetes as $i => $google_request) {
    // The delay option (in ms) is the trick that keeps us under the rate limit.
    $promises[] = $client->requestAsync('GET', $google_request, ['delay' => 0.4 * $i * 1000]);
}
GuzzleHttp\Promise\each_limit(
    $promises,
    function () { // function returning the number of concurrent requests
        return 100; // 1 or 100 concurrent request(s) don't really change execution time
    },
    // Fulfilled function
    function ($response, $index) use ($urls, $fp) { // $urls holds the url passed as a parameter in the GET request; $fp is a CSV file pointer
        $feed = json_decode($response->getBody(), true); // get the array of results
        $this->write_to_csv($feed, $fp, $urls[$index]); // write to CSV
    },
    // Rejected function
    function ($reason, $index) {
        if ($reason instanceof GuzzleHttp\Exception\ClientException) {
            $message = $reason->getMessage();
            var_dump(array("error" => "error", "id" => $index, "message" => $message)); // you could write the errors to a file or database too
        }
    }
)->wait();

Geocode API query limit exceeded

The Geocoding API allows 2,500 requests per day, but my site is not sending that many requests in a single day. Is my code sending automated requests?
function getmap($addrss, $location, $city){
    $address = $addrss . ' ' . $city . ' ' . $location;
    $prepAddr = urlencode($address);
    $geocode = file_get_contents('http://maps.google.com/maps/api/geocode/json?address=' . $prepAddr . '&sensor=false');
    $output = json_decode($geocode);
    $lat = $output->results[0]->geometry->location->lat;
    $long = $output->results[0]->geometry->location->lng;
    $retval = $lat . '*' . $long;
    return $retval;
}
The response was:
[error_message] => You have exceeded your daily request quota for this API.....[status] => OVER_QUERY_LIMIT )
Please provide any suggestions.
From the API example I've looked up, it specifically requires appending the API key to the request URL; might that be it? Otherwise my guess would be accidental looped requests.
It would be best if the API key were set outside this function, in case other functions in this class/object want to access the API as well.
$geocode=file_get_contents('http://maps.google.com/maps/api/geocode/json?address='.$prepAddr.'&sensor=false&key='.$API_KEY);
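A minimal sketch of that refactor (the status check is an illustrative addition, not from the original post):
function getmap($addrss, $location, $city, $API_KEY) {
    $prepAddr = urlencode($addrss . ' ' . $city . ' ' . $location);
    $geocode = file_get_contents('https://maps.googleapis.com/maps/api/geocode/json?address=' . $prepAddr . '&key=' . $API_KEY);
    $output = json_decode($geocode);
    // Surface quota problems (e.g. OVER_QUERY_LIMIT) instead of failing on a missing result.
    if ($output->status !== 'OK') {
        return null;
    }
    return $output->results[0]->geometry->location->lat . '*' . $output->results[0]->geometry->location->lng;
}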
Try using a Google browser key as below in order to get a higher number of requests per day:
http://maps.google.com/maps/api/geocode/json?key=YOUR_GOOGLE_MAP_API_KEY&address=los+Angels
Follow these steps to get a Google API key:
Go to the Google Developers Console.
Create or select a project.
Click Continue to enable the API and any related services.
On the Credentials page, get a Browser key (and set the API Credentials).
(Optional) Enable billing.
Note: If you have an existing Browser key, you may use that key.
To prevent quota theft, secure your API key following these best practices.

PHP Azure SDK Service Bus returns Malformed Response

I'm working on a trace logger of sorts that pushes log message requests onto a queue on a Service Bus, to later be picked off by a worker role which inserts them into the table store. While running on my machine, this works just fine (since I'm the only one using it), but once I put it up on a server to test, it produced the following error:
HTTP_Request2_MessageException: Malformed response: in D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php on line 1013
0 HTTP_Request2_Response->__construct('', true, Object(Net_URL2)) D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php:1013
1 HTTP_Request2_Adapter_Socket->readResponse() D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php:139
2 HTTP_Request2_Adapter_Socket->sendRequest(Object(HTTP_Request2)) D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2.php:939
3 HTTP_Request2->send() D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\Http\HttpClient.php:262
4 WindowsAzure\Common\Internal\Http\HttpClient->send(Array, Object(WindowsAzure\Common\Internal\Http\Url)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\RestProxy.php:141
5 WindowsAzure\Common\Internal\RestProxy->sendContext(Object(WindowsAzure\Common\Internal\Http\HttpCallContext)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\ServiceRestProxy.php:86
6 WindowsAzure\Common\Internal\ServiceRestProxy->sendContext(Object(WindowsAzure\Common\Internal\Http\HttpCallContext)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\ServiceBus\ServiceBusRestProxy.php:139
7 WindowsAzure\ServiceBus\ServiceBusRestProxy->sendMessage('<queuename>/mes…', Object(WindowsAzure\ServiceBus\Models\BrokeredMessage)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\ServiceBus\ServiceBusRestProxy.php:155
⋮
I've seen previous posts that describe similar issues, namely:
Windows Azure PHP Queue REST Proxy Limit (Stack Overflow)
Operations on HTTPS do not work correctly (GitHub)
These imply that this is a known issue with the PHP Azure Storage libraries, where only a limited number of HTTPS connections is allowed. Before the requirements changed, I was accessing the table store directly, ran into this same issue, and fixed it in the way the first link describes.
The problem is that the Service Bus endpoint in the connection string, unlike Table Store (etc.) connection string endpoints, MUST be 'HTTPS'. Trying to use it with 'HTTP' will return a 400 - Bad Request error.
I was wondering if anyone had any ideas on a potential workaround. Any advice would be greatly appreciated.
Thanks!
EDIT (After Gary Liu's Comment):
Here's the code I use to add items to the queue:
private function logToAzureSB($source, $msg, $severity, $machine)
{
    // Gather all relevant information
    $msgInfo = array(
        "Severity" => $severity,
        "Message"  => $msg,
        "Machine"  => $machine,
        "Source"   => $source
    );
    // Encode it to a JSON string, and add it to a Brokered message.
    $encoded = json_encode($msgInfo);
    $message = new BrokeredMessage($encoded);
    $message->setContentType("application/json");
    // Attempt to push the message onto the Queue
    try
    {
        $this->sbRestProxy->sendQueueMessage($this->azureQueueName, $message);
    }
    catch(ServiceException $e)
    {
        throw new \DatabaseException($e->getMessage(), $e->getCode(), $e->getPrevious());
    }
}
Here, $this->sbRestProxy is a Service Bus REST Proxy, set up when the logging class initializes.
On the receiving end of things, here's the code on the worker role side:
public override void Run()
{
    // Initiates the message pump; the callback is invoked for each message received.
    // Calling Close on the client will stop the pump.
    Client.OnMessage((receivedMessage) =>
    {
        try
        {
            // Pull the message body from the received object.
            Stream stream = receivedMessage.GetBody<Stream>();
            StreamReader reader = new StreamReader(stream);
            string message = reader.ReadToEnd();
            LoggingMessage mMsg = JsonConvert.DeserializeObject<LoggingMessage>(message);
            // Create an entry with the information given.
            LogEntry entry = new LogEntry(mMsg);
            // Set the Logger to the appropriate table store, and insert the entry into the table.
            Logger.InsertIntoLog(entry, mMsg.Service);
        }
        catch
        {
            // Handle any message processing specific exceptions here
        }
    });
    CompletedEvent.WaitOne();
}
Here LoggingMessage is a simple object that basically contains the same fields as the message logged in PHP (used for JSON deserialization), LogEntry is a TableEntity that contains these fields as well, and Logger is an instance of a table store logger, set up during the worker role's OnStart method.
This was a known issue with the Windows Azure PHP SDK, which hasn't been looked at in a long time, nor has it been fixed. Between when I posted this and now, we ended up writing a separate API web service for logging and had our PHP code send JSON strings to it over cURL, which works well enough as a temporary workaround. We're moving off of PHP now, so this won't be an issue for much longer anyway.
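For reference, that workaround amounts to something like this sketch (the endpoint URL is hypothetical; $msgInfo is the same array built in logToAzureSB above):
// POST the JSON log payload to the internal logging web service instead of
// pushing it onto the Service Bus queue.
$ch = curl_init('https://logging.example.internal/api/log'); // hypothetical endpoint
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode($msgInfo),
    CURLOPT_HTTPHEADER     => array('Content-Type: application/json'),
    CURLOPT_RETURNTRANSFER => true,
));
$result = curl_exec($ch);
curl_close($ch);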
