API endpoint accepting a changeable set of data - PHP

One endpoint of my API must accept a changeable data set depending on the context (some option, let's say a domain). How (apart from API documentation, of course) can we inform the frontend that we expect such a data set? One idea that comes to mind is to write an endpoint whose response lists the fields I expect (more specifically, the entire form with the particular input types, their placeholders, default values, etc.).

The self-describing API you are looking for is the HTTP OPTIONS request.
The OPTIONS method requests information about the communication
options available for the target resource, at either the origin
server or an intervening intermediary. This method allows a client
to determine the options and/or requirements associated with a
resource, or the capabilities of a server, without implying a
resource action.
http://zacstewart.com/2012/04/14/http-options-method.html
For different possible datasets you can use a custom Content-Type header, e.g. Content-Type: application/vnd.some.payload+json.
A good example of using such media types is the GitHub API: https://developer.github.com/v3/media/
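As a sketch of the idea (the describeForm() helper and all field definitions below are invented for illustration), a plain-PHP front controller could answer OPTIONS requests with a description of the form the endpoint expects:

```php
<?php
// Hypothetical sketch: answer OPTIONS with the expected form fields.
// The field definitions are invented; in a real app they would come
// from your form/validation configuration.

function describeForm(string $domain): array
{
    $fields = [
        'domain1' => [
            ['name' => 'content', 'type' => 'text', 'placeholder' => 'Enter text', 'default' => ''],
        ],
        'domain2' => [
            ['name' => 'content', 'type' => 'number', 'placeholder' => 'Enter a number', 'default' => 0],
        ],
    ];
    return ['fields' => $fields[$domain] ?? []];
}

if (($_SERVER['REQUEST_METHOD'] ?? '') === 'OPTIONS') {
    header('Content-Type: application/json');
    echo json_encode(describeForm($_GET['domain'] ?? 'domain1'));
    exit;
}
```

The frontend can then issue an OPTIONS request first and render the form from the JSON it gets back.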

Return HTTP 400 Bad Request with a JSON response containing a URL to the JSON Schema with which the payload should comply.
A separate endpoint returning the JSON Schema is also an option.
I don't know your case, but I dislike the idea of a "changeable set of data". Maybe you can consider redesigning your API, or wrapping your data sets in a kind of container so that all of them can be described by a single JSON Schema, as in the sample below:
{
    "oneOf": [
        {
            "type": "object",
            "required": [ "domain", "content" ],
            "properties": {
                "domain": { "type": "string", "enum": ["domain1"] },
                "content": { "type": "string" }
            }
        },
        {
            "type": "object",
            "required": [ "domain", "content" ],
            "properties": {
                "domain": { "type": "string", "enum": ["domain2"] },
                "content": { "type": "number" }
            }
        }
    ]
}
Depending on the "domain" value, "content" can be either a string or a number.
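A hand-rolled check mirroring that schema could look like the following. This is only to illustrate the branching; in practice a JSON Schema validator library would consume the schema above directly:

```php
<?php
// Minimal sketch mirroring the "oneOf" schema: both branches require
// "domain" and "content", and the type of "content" depends on the
// "domain" value. Not a general validator - illustration only.

function validatePayload(array $payload): bool
{
    if (!isset($payload['domain'], $payload['content'])) {
        return false;
    }
    switch ($payload['domain']) {
        case 'domain1':
            return is_string($payload['content']);
        case 'domain2':
            // JSON "number" covers both integers and floats
            return is_int($payload['content']) || is_float($payload['content']);
        default:
            return false;
    }
}
```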

Related

What is the correct way to read a large JSON API response in Guzzle 6?

I currently have the following Guzzle 6 implementation returning a stream of JSON data containing user data:
$client = new GuzzleHttp\Client([
    'base_uri' => 'https://www.apiexample.com',
    'handler' => $oauthHandler,
    'auth' => 'oauth',
    'headers' => [
        'Authorization' => 'Bearer xxxxxxxxxxxxxx',
        'Content-Type' => 'application/json',
        'Accept' => 'application/json',
    ],
]);
$res = $client->post('example');
$stream = GuzzleHttp\Psr7\stream_for($res->getBody());
The JSON responses look like:
{
    "name": "Users",
    "record": [
        {
            "title": "Consulting",
            "_name": "Users",
            "end date": "07/03/2020",
            "session number": "1",
            "start date": "09/02/2019",
            "course id": "2900",
            "first name": "John",
            "unique user number": "123456",
            "time": "08 AM",
            "last name": "Doe",
            "year": "19-20",
            "location name": "SD"
        },
        .........
    ],
    "#extensions": "activities,corefields,u_extension,u_stu_x,s_ncea_x,s_stu_crdc_x,c_locator,u_userfields,s_edfi_x"
}
This is being run for a number of clients using different API endpoints. Many of them return too many users for the entire JSON response to be loaded into RAM at once, which is why I am using a stream.
There may be a way to get the API to return chunks incrementally, through multiple calls. But from everything I have gotten from the developers of the API it appears that this is intended to be consumed as one streamed response.
I am new to having to stream an API response like this and am wondering what the correct approach is to iterate through the records. Looking at the Guzzle 6 docs, it appears that iteration happens by reading a given number of characters at a time from the string:
http://docs.guzzlephp.org/en/stable/psr7.html#streams
use GuzzleHttp\Psr7;
$stream = Psr7\stream_for('string data');
echo $stream;
// string data
echo $stream->read(3);
// str
echo $stream->getContents();
// ing data
var_export($stream->eof());
// true
var_export($stream->tell());
// 11
I could potentially write something that parses the string in subsections through pattern matching and incrementally writes the data to disk as I move through the response. But that seems error prone, and like something that would already be part of Guzzle 6.
Can you provide an example of how something like this should work or point out where I might be missing something?
I appreciate it, thanks!
But it seems like that would be error prone and something that would be part of Guzzle 6.
Nope, Guzzle is an HTTP client; it has nothing to do with parsing different response formats.
What you need is a JSON streaming parser. Please take a look at this SO question, and also at these libraries: https://github.com/salsify/jsonstreamingparser, https://github.com/clue/php-json-stream, https://github.com/halaxa/json-machine.
In Guzzle you have two possibilities:
read the response stream manually (as you do currently), but this probably requires manual integration with a JSON streaming parser;
stream the whole response to a temporary file (see the "sink" request option) and read that file later with a JSON streaming parser; this should be supported by all of the libraries.
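To make concrete what such a parser does internally, here is a deliberately naive sketch: it scans the stream chunk by chunk, cuts out each complete object nested one level below the root (the entries of "record"), and decodes them one at a time. It would break on string values containing braces, which is exactly the error-prone territory the libraries above handle properly:

```php
<?php
// Naive illustration of streaming record extraction. Counts brace
// depth; each object at depth 2 (an entry of the "record" array) is
// accumulated and json_decode()d on its own, so the full response
// never has to fit in RAM. String values containing braces would
// break this - use a real streaming parser in production.

function extractRecords($handle, callable $onRecord, int $chunkSize = 8192): void
{
    $depth  = 0;   // current brace depth
    $object = '';  // accumulator for the record being read

    while (($chunk = fread($handle, $chunkSize)) !== false && $chunk !== '') {
        $len = strlen($chunk);
        for ($i = 0; $i < $len; $i++) {
            $ch = $chunk[$i];
            if ($ch === '{') {
                $depth++;
                if ($depth === 2) {
                    $object = ''; // a new record object starts here
                }
            }
            if ($depth >= 2) {
                $object .= $ch;
            }
            if ($ch === '}') {
                if ($depth === 2) {
                    $onRecord(json_decode($object, true)); // one full record
                }
                $depth--;
            }
        }
    }
}

// Usage, e.g. after streaming the response to disk via Guzzle's "sink":
// $fh = fopen('/tmp/response.json', 'rb');
// extractRecords($fh, function (array $record) { /* handle one user */ });
```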

Laravel: checking data from one API through another API

I have some data that is compressed with gzip in an application, available here:
app.myaddress.com/data/api/1.
The data contains several parameters in JSON format, as follows:
{
    "id": 1,
    "data": "abcabcabcabcabc" //this is the compressed data
}
I need to check the compressed data against a 3rd-party service; let's say the address is app2.myaddress.com/check_data/abcabc, called via an API request that requires authentication headers:
{
    "content-type": "application/json",
    "api-key": 123456
}
app2.myaddress.com will return JSON data like the following:
{
    "name": "hello",
    "address": "australia"
}
What I need to do is check the data by accessing a URL like:
app.myaddress.com/data/api/checked/1
The controller will then process the data, including the check through app2.myaddress.com, and return the value to app.myaddress.com.
Solved by #rafitio:
You can use cURL or Guzzle to access both URL inside your function. Here is the documentation of Guzzle: http://docs.guzzlephp.org/en/stable/request-options.html
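As a sketch of what the controller could do, here is the same flow with plain PHP streams (no Guzzle dependency) just to show the shape; with Guzzle the same headers go into the 'headers' request option. The URLs and the api-key value come from the question; the function names are invented:

```php
<?php
// Hypothetical sketch: call the 3rd-party check endpoint with the
// required headers using PHP's stream wrapper. Function names are
// invented; URLs and the api-key value are from the question.

function buildCheckContext(string $apiKey): array
{
    return [
        'http' => [
            'method' => 'GET',
            'header' => implode("\r\n", [
                'Content-Type: application/json',
                'Api-Key: ' . $apiKey,
            ]),
        ],
    ];
}

function checkData(string $compressed, string $apiKey): ?array
{
    $url = 'https://app2.myaddress.com/check_data/' . rawurlencode($compressed);
    $context = stream_context_create(buildCheckContext($apiKey));
    $body = @file_get_contents($url, false, $context);
    return $body === false ? null : json_decode($body, true);
}
```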

How to ?$filter data using OData through WebAPI (PHP / cURL)

I'm GETting data in JSON through a WebAPI by sending cURL requests to the server - this basically works. The problem starts when I try to filter for a field in (sorry, layman's terms) the 2nd layer of the JSON data.
Here's some example JSON data:
{
    "id": "SOMEHASH",
    "name": "OPP-Name",
    "actualCloseDate": "9999-12-31T00:00:00+01:00",
    "companies": [
        {
            "id": "SOMEOTHERHASH",
            "name": "Company Name"
        }
    ],
I'm trying to filter for the field "name" in the array "companies" (that's what I meant by 2nd level), but so far no luck.
My URL to request the data including an OData filter looks like that right now:
https://www.ourserver.tld/Opportunities/?$filter=Company/any(d:d/name eq 'Company Name')
but that only results in an empty response, as if no data were found (the name is a match for sure). To be honest, I don't understand the "d:d" part; I just took it straight from the OData docs.
I tried
https://www.ourserver.tld//Opportunities/?$filter=companies/any(name eq 'Company Name')
instead, but that throws a syntax error.
So, how do I form a correct request URL?
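For reference, OData's any() lambda requires a range variable (that is what "d:d" declares) and the collection's actual property name. A sketch of building such a URL in PHP, under the assumption that the collection property is "companies" rather than "Company"; whether the server accepts it depends on its OData version and metadata:

```php
<?php
// Sketch: "d" in "d:d/name" is a range variable you declare yourself
// (any identifier works); the segment before any() must match the
// JSON property name, here "companies". http_build_query() takes care
// of percent-encoding the $filter value.

$base   = 'https://www.ourserver.tld/Opportunities/';
$filter = "companies/any(d:d/name eq 'Company Name')";
$url    = $base . '?' . http_build_query(['$filter' => $filter]);
```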

Send gzip request with Guzzle

I have to make an HTTP call to send compressed data. I'm developing in Symfony2. For HTTP calls I'm using the Guzzle client (version 3.8.1). I am also using Guzzle Service Descriptions to describe the operations allowed on each command.
I know that I have to add the header "Content-Encoding: gzip" to the request, but the request body is not compressed.
Is there a way to specify in the Guzzle client that the request needs to be compressed (maybe by specifying this in the Service Description)?
Thank you!
To tell the server to give you a compressed version, you have to inform it that you understand how to decompress the data.
For that purpose, you send a request header called Accept-Encoding.
Example of an Accept-Encoding header and values (these are the compression schemes your client knows how to use):
accept-encoding: gzip, deflate, sdch, br
The RESPONSE header Content-Encoding is sent by the server. If that header is set, your client knows the content is compressed and uses the algorithm the server sent as the value of Content-Encoding.
The server doesn't have to respond with a compressed page.
Therefore, these are the steps:
Tell the server you know how to deal with compressed pages: send the Accept-Encoding header, specifying which compression algorithms your client can handle.
Inspect whether the server sent the Content-Encoding header. If not, the content isn't compressed.
If yes, check the value of the header. That tells you which algorithm was used for compression; it doesn't have to be gzip, but it usually is.
The server doesn't have to respond with a compressed page. You are merely informing the server that you understand how to deal with compressed pages.
So what you should do is verify that your server sends gzipped responses, and then set the request header Accept-Encoding. You have it the wrong way around.
I found a solution for sending compressed data using the Guzzle client with an Operation Command and a Service Description.
In the JSON file containing the service description, I specified that the data sent in the body is a string:
{
    ...
    "operations": {
        "sendCompressedData": {
            "httpMethod": "POST",
            "uri": ...,
            "parameters": {
                "Content-Type": {
                    "location": "header",
                    "required": true,
                    "type": "string",
                    "default": "application/json"
                },
                "Content-Encoding": {
                    "location": "header",
                    "required": true,
                    "type": "string",
                    "default": "gzip"
                },
                "data": {
                    "location": "body",
                    "required": true,
                    "type": "string"
                }
            }
        }
    }
}
As mentioned by #Mjh, Guzzle doesn't compress data automatically when the "Content-Encoding" header is set, so the data needs to be compressed before it is passed to the Guzzle command. I serialized the object and used gzencode($string) for compression.
$serializedData = SerializerBuilder::create()->build()->serialize($request, 'json');
$compressedData = gzencode($serializedData);
...
$command = $this->client->getCommand('sendCompressedData', array('data' => $compressedData));
$result = $command->execute();
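The compression step itself is plain PHP and easy to sanity-check in isolation before anything touches the wire (the sample payload below is invented):

```php
<?php
// Quick sanity check of the compression step: gzencode() produces a
// gzip stream (header + deflate data + CRC), which is exactly what
// "Content-Encoding: gzip" announces; gzdecode() reverses it. This is
// independent of Guzzle.

$serializedData = json_encode(['title' => 'Consulting', 'year' => '19-20']);
$compressedData = gzencode($serializedData);

// A gzip stream starts with the magic bytes 0x1f 0x8b:
var_dump(substr($compressedData, 0, 2) === "\x1f\x8b"); // true
var_dump(gzdecode($compressedData) === $serializedData); // true
```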

Modifying User-Attribute on SharePoint data item using HTTP

I'm using the (JSON) HTTP interface to communicate with SharePoint. The communication itself is done via cURL and a convenience wrapper in PHP. Problems arise when I want to push data to SP.
Since I'm no Microsoft / SharePoint guy, I'm missing the proper vocabulary to explain my needs. I'll therefore demonstrate using data I received from SharePoint.
GET http://example.org/mytest/_vti_bin/listdata.svc/Aufgaben(2) returns the following (truncated by me) data:
{
    "d": {
        "__metadata": {
            "uri": "http://example.org/mytest/_vti_bin/listdata.svc/Aufgaben(2)",
            "etag": "W/\"5\"",
            "type": "Microsoft.SharePoint.DataService.AufgabenItem"
        },
        "ID": 2,
        "InhaltstypID": "0x010800821BC29B80192B4C960A688416597526",
        "Inhaltstyp": "Aufgabe",
        "Titel": "Neuer Titel",
        "ZugewiesenAn": {
            "__deferred": {
                "uri": "http://example.org/mytest/_vti_bin/listdata.svc/Aufgaben(2)/ZugewiesenAn"
            }
        },
        "ZugewiesenAnId": 29,
        "F\u00e4lligkeitsdatum": "\/Date(1323993600000)\/"
    }
}
"ZugewiesenAn" is a user. If I query the deferred values, I get (truncated by me, again)
{
    "d": {
        "__metadata": {
            "uri": "http://example.org/mytest/_vti_bin/listdata.svc/Benutzerinformationsliste(29)",
            "etag": "W/\"1\"",
            "type": "Microsoft.SharePoint.DataService.BenutzerinformationslisteItem"
        },
        "InhaltstypID": "0x010A000719C31710976A48867763D86F6586E0",
        "Name": "Rehm Rodney",
        "Konto": "EXT\\rodney.rehm",
        "ID": 29,
        "Inhaltstyp": "Person"
    }
}
So I can see that the value of "ZugewiesenAn" should be "EXT\rodney.rehm" (as I need the username). Thus far, no problem.
The question is: how do I create a new object, or update an existing one, with a different user for "ZugewiesenAn" (a User/Group field)?
I've tried:
Sending the username as the value of "ZugewiesenAn" or "ZugewiesenAnId" results in a Bad Request.
Querying http://example.org/_vti_bin/People.asmx (SOAP: SearchPrincipals) only yields numeric IDs for people that have actually worked with the list. If I query a username that hasn't logged into that SharePoint list before, I get ID -1.
I could not find out how to add users to the userlist via REST. You can, however, use the SOAP ResolvePrincipal request (example) - which does the job!
I am not a SharePoint guy and focus mostly on REST and OData, but I think the REST OData API for SharePoint follows the common rules for REST and OData.
The common rule for REST and OData is to use different HTTP verbs for different operations: read, create, update, and delete map directly to the GET, POST, PUT, and DELETE HTTP verbs.
So you are getting your user with a GET on the URI http://example.org/mytest/_vti_bin/listdata.svc/Benutzerinformationsliste(29).
To delete this user, use the DELETE verb on the same URI with an empty HTTP message body.
To create a user, use the POST verb on the collection URI with JSON in the message body. While creating, the ID shouldn't be specified (except when the ID isn't auto-incremented in the database). The Content-Type header should be set to application/json.
The same goes for an update: PUT on the same URI
http://example.org/mytest/_vti_bin/listdata.svc/Benutzerinformationsliste(29)
with JSON in the HTTP message body and Content-Type: application/json.
The format of the JSON should be the same as what you received:
{
    "InhaltstypID": "0x010A000719C31710976A48867763D86F6586E0",
    "Name": "Rehm Rodney",
    "Konto": "EXT\\rodney.rehm",
    "ID": 29,
    "Inhaltstyp": "Person"
}
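A sketch of such an update with PHP's cURL extension, since the question mentions using cURL. The URI and payload come from the answer above; note that some SharePoint versions expect POST with an X-HTTP-Method: MERGE header and an If-Match etag instead of a plain PUT, so verify against your instance:

```php
<?php
// Sketch: PUT an updated item as JSON via PHP's cURL extension.
// buildUpdateRequest() is a hypothetical helper that assembles the
// cURL options; the actual headers SharePoint expects (If-Match,
// X-HTTP-Method) vary by version.

function buildUpdateRequest(string $uri, array $fields): array
{
    return [
        CURLOPT_URL            => $uri,
        CURLOPT_CUSTOMREQUEST  => 'PUT',
        CURLOPT_POSTFIELDS     => json_encode($fields),
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    ];
}

$options = buildUpdateRequest(
    'http://example.org/mytest/_vti_bin/listdata.svc/Benutzerinformationsliste(29)',
    [
        'Name'  => 'Rehm Rodney',
        'Konto' => 'EXT\\rodney.rehm',
    ]
);

// To actually send it:
// $ch = curl_init();
// curl_setopt_array($ch, $options);
// $response = curl_exec($ch);
```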
