Best Way to Handle a JSON Data Fetch - php

I'm building an app for iOS devices and would like the app to be able to fetch potentially large amounts of data from a MySQL database using a PHP file that returns a JSON object.
For testing purposes I had model data inside my controller. To move to an MVC architecture, I want to come up with a way for my model to fetch this data from the database and then let my controller display it once it's fetched.
For example, say I have a collection of groceries that I build within my Groceries object using my initWithJSON method, and I want to fetch the data using AFNetworking.
[
    {
        "item": "eggs",
        "color": "white",
        "shape": "oval"
    },
    {
        "item": "bread",
        "color": "brown",
        "shape": "rectangle"
    },
    {
        "item": "cheerios",
        "color": "green/orange",
        "shape": "circle"
    }
]
Would I fetch the JSON inside my Groceries object and push each grocery item into an array? Then, in my controller, would I create my Groceries object, call initWithJSON, and use the resulting Groceries.groceryList array (filled with the JSON data) to populate the UITableView in my view?
What's the industry-standard way of approaching this?

AFNetworking has an AFJSONResponseSerializer, so you can have a parser class that receives the response from AFNetworking, iterates over all of the data received, and hands each grocery JSON dictionary to the Grocery class, which can have an update method to check whether that grocery exists and update it, and an insert method for new groceries.
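On the server side, the PHP file the question mentions can be a small endpoint that queries MySQL and emits the grocery list as JSON. A minimal sketch, assuming a `groceries` table with `item`, `color`, and `shape` columns (the PDO connection details are placeholders):

```php
<?php
// groceries.php - emit the grocery list as the JSON array the app expects.
// Table/column names and credentials are assumptions for illustration.
function groceriesToJson(array $rows): string
{
    return json_encode($rows);
}

// In the real endpoint, $rows would come from MySQL via PDO, e.g.:
// $pdo  = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', $user, $pass);
// $rows = $pdo->query('SELECT item, color, shape FROM groceries')
//             ->fetchAll(PDO::FETCH_ASSOC);
$rows = [
    ['item' => 'eggs', 'color' => 'white', 'shape' => 'oval'],
];

header('Content-Type: application/json');
echo groceriesToJson($rows);
```

The app then only needs to deserialize this array and feed it to initWithJSON.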

Related

Re-arrange a CakePHP JSON response on object prior to render

I am working on a CakePHP REST/CRUD-based API. It uses $routes->setExtensions(['json']); within /config/routes.php to turn its responses into JSON objects.
I am working with several objects that have complex schemas which I need to pre-process prior to submitting to the Cake ORM, in order to simplify the API integration for the end user.
For instance, the following is the JSON blob that would need to be patched into the ORM using $this->ImportSettings->patchEntity($importSetting, $requestData[2]):
{
    "id": 2,
    "generic_setting": "Hello World",
    "import_source_google_setting": null,
    "import_source_csv_ordered_setting": null,
    "import_source_csv_headed_setting": {
        "id": 1,
        "import_settings_id": 2,
        "delimiterId": 1
    },
    "import_destination_user_setting": null,
    "import_destination_asset_setting": {
        "id": 2,
        "import_settings_id": 2,
        "defaultValueId": 1
    }
}
There can be one of many source settings and one of many destination settings defined on an import setting. To simplify this for the API user, I am allowing them to submit the following:
{
    "id": 2,
    "generic_setting": "Hello World",
    "import_source_setting": {
        "id": 1,
        "import_settings_id": 2,
        "delimiterId": 1
    },
    "import_destination_setting": {
        "id": 2,
        "import_settings_id": 2,
        "defaultValueId": 1
    }
}
I have written code in an event listener on beforeMarshal for ImportSettings that is able to tell whether the index "import_source_setting" belongs in the table "import_source_csv_headed_setting", "import_source_csv_ordered_setting", or "import_source_google_setting", and likewise with asset and user settings arriving under "import_destination_setting".
This works well for processing and re-organizing data in a request before it enters the ORM. However, I would now like to do the same thing with the data before it is displayed, so the API user does not need to look at the additional source and destination settings.
I have accomplished this through the use of middleware in a similar use case in another part of the system. However, middleware seems to be made to attach to routes; my use case seems more like something that should be tied to the model life cycle, so it runs whenever an import setting is returned and properly modifies the output, even when nested.
Given what I am looking for, in which part of Cake should I place this logic that re-organizes the JSON response on the ORM query result for a table? Can you point me to documentation on this?
I came across an answer to this in another forum, using CakePHP's calculated fields. It looks like the formatResults() function can be attached to a query with a callback to re-organize the results after the query is run. I went ahead and attached it to the query in the beforeFind() event, which seems to work.
See example below:
<?php
use Cake\Collection\CollectionInterface;
use Cake\Event\Event;
use Cake\Event\EventListenerInterface;
use Cake\ORM\Query;

class ImportSettingsListener implements EventListenerInterface
{
    public function implementedEvents(): array
    {
        return [
            'Model.beforeFind' => 'generateQuery',
        ];
    }

    public function generateQuery(Event $event, Query $query, ArrayObject $options, bool $primary): void
    {
        $query->formatResults(function (CollectionInterface $results) {
            return $results->map(function ($setting) {
                // Re-format $setting here
                return $setting;
            });
        });
    }
}
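For completeness, the callback passed to map() could collapse the per-type settings into the simplified keys from the example. This is only a sketch: the field names come from the payloads above, and the null-coalescing logic is an assumption about how the concrete settings are populated.

```php
<?php
// Collapse the three concrete source keys and the two destination keys into
// the simplified shape, dropping the now-redundant per-type keys.
function simplifyImportSetting(array $setting): array
{
    $setting['import_source_setting'] =
        $setting['import_source_csv_headed_setting']
        ?? $setting['import_source_csv_ordered_setting']
        ?? $setting['import_source_google_setting'];

    $setting['import_destination_setting'] =
        $setting['import_destination_user_setting']
        ?? $setting['import_destination_asset_setting'];

    unset(
        $setting['import_source_csv_headed_setting'],
        $setting['import_source_csv_ordered_setting'],
        $setting['import_source_google_setting'],
        $setting['import_destination_user_setting'],
        $setting['import_destination_asset_setting']
    );

    return $setting;
}
```

Inside the listener, something like $results->map(fn ($s) => simplifyImportSetting($s->toArray())) (or an entity-aware equivalent) would apply this per row.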

How to implement API resource expansion with PHP and MySQL

There are tons of articles, blog posts, and API docs about REST API resource field expansion, but none about how to implement an expansion correctly in terms of technique and data querying.
Simple example for a flat resource response:
GET /api/v1/customers
{
    "data": [
        {
            "id": "209e5d80-c459-4d08-b53d-71d11e216a5d",
            "contracts": null
        },
        {
            "id": "c641cb83-af29-485d-9be2-97925057e3b2",
            "contracts": null
        }
    ],
    "expandable": ["contract"]
}
Simple example for expanded resource:
GET /api/v1/customers?expand=contract
{
    "data": [
        {
            "id": "209e5d80-c459-4d08-b53d-71d11e216a5d",
            "contracts": [
                {......},
                {......},
                {......}
            ]
        },
        {
            "id": "c641cb83-af29-485d-9be2-97925057e3b2",
            "contracts": [
                {......},
                {......},
                {......}
            ]
        }
    ],
    "expandable": ["contract"]
}
Let's assume we use an API REST controller class which handles the endpoints, and a read service (maybe CQS/CQRS based) which uses plain SQL for read performance. At which point does the expansion logic start, and what is the right way of handling queries without an explosion in query count?
AFAIK, this is not possible in one or a few SQL queries (except the dirty way of GROUP_CONCAT and cramming all the data into one field). Should I query all customers and then iterate over them, querying the expanded data for each customer? That would cause the classic N+1 query problem (one extra query per customer).
I was looking for the same thing, and I would say that the correct answer is: it depends on the infrastructure used.
In general, the implementation should receive the `expand` parameter in your endpoint and respond accordingly. So, with something like your example, when `/api/v1/customers` is requested via `GET` with the `expand` parameter set, you should run another query against the database to get, as in the example, the customer contracts.
There is no unique, one-size-fits-all answer to this question, especially if we are not talking about a specific language + framework.
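To make the "another query" part concrete: the query count can stay constant (two queries per page, not one per customer) by fetching contracts for the whole page with an IN (...) clause and grouping them in PHP. A sketch, with table and column names assumed from the example payloads:

```php
<?php
// Avoid per-customer queries: fetch contracts for the whole page of customers
// in one IN (...) query, then group them by customer in PHP. Two queries
// total, regardless of how many customers are on the page.
function groupContractsByCustomer(array $customers, array $contracts): array
{
    $byCustomer = [];
    foreach ($contracts as $contract) {
        $byCustomer[$contract['customer_id']][] = $contract;
    }
    foreach ($customers as &$customer) {
        $customer['contracts'] = $byCustomer[$customer['id']] ?? [];
    }
    unset($customer);
    return $customers;
}

// In the read service, the two queries might look like (names are assumptions):
// $customers = $pdo->query('SELECT id FROM customers LIMIT 50')->fetchAll(PDO::FETCH_ASSOC);
// $ids       = array_column($customers, 'id');
// $in        = implode(',', array_fill(0, count($ids), '?'));
// $stmt      = $pdo->prepare("SELECT * FROM contracts WHERE customer_id IN ($in)");
// $stmt->execute($ids);
// $customers = groupContractsByCustomer($customers, $stmt->fetchAll(PDO::FETCH_ASSOC));
```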

Custom map keys in GraphQL response

I've been looking into GraphQL as a replacement for some REST APIs of mine, and while I think I've wrapped my head around the basics and like most of what I see so far, there's one important feature that seems to be missing.
Let's say I've got a collection of items like this:
{
"id": "aaa",
"name": "Item 1",
...
}
An application needs a map of all those objects, indexed by ID as such:
{
"allItems": {
"aaa": {
"name": "Item 1",
...
},
"aab": {
"name": "Item 2",
...
}
}
}
Every API I've ever written has been able to give results back in a format like this, but I'm struggling to find a way to do it with GraphQL. I keep running across issue 101, but that deals more with unknown schemas. In my case, I know exactly what all the fields are; this is purely about output format.

I know I could simply return all the items in an array and reformat it client-side, but that seems like overkill given that it's never been needed in the past, and would make GraphQL feel like a step backwards.

I'm not sure if what I'm trying to do is impossible, or if I'm just using all the wrong terminology. Should I keep digging, or is GraphQL just not suited to my needs? If this is possible, what might a query look like to retrieve data like this?
I'm currently working with graphql-php on the server, but I'm open to higher-level conceptual responses.
Unfortunately returning objects with arbitrary and dynamic keys like this is not really a first-class citizen in GraphQL. That is not to say you can't achieve the same thing, but in doing so you will lose many of the benefits of GraphQL.
If you are set on returning an object with id keys, instead of returning a collection/list of objects containing the ids and then doing the transformation on the client, then you can create a special GraphQLScalarType.
const { GraphQLScalarType, GraphQLError, Kind } = require('graphql');

const GraphQLAnyObject = new GraphQLScalarType({
  name: 'AnyObject',
  description: 'Any JSON object. This type bypasses type checking.',
  serialize: value => {
    return value;
  },
  parseValue: value => {
    return value;
  },
  parseLiteral: ast => {
    if (ast.kind !== Kind.OBJECT) {
      throw new GraphQLError('Query error: Can only parse object but got a: ' + ast.kind, [ast]);
    }
    return ast.value;
  }
});
The problem with this approach is that, since it is a scalar type, you cannot supply a selection set to query it. E.g., if you had a type
type MyType implements Node {
id: ID!
myKeyedCollection: AnyObject
}
Then you would only be able to query it like so
query {
  getMyType(id: "abc") {
    myKeyedCollection # note there is no { ... }
  }
}
As others have said, I wouldn't recommend this because you are losing a lot of the benefits of GraphQL but it goes to show that GraphQL can still do pretty much anything REST can.
Hope this helps!
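If you go the recommended route instead (return a plain list and re-key it on the consumer side), the transformation is trivial. Since the question mentions graphql-php, here is a sketch in PHP that assumes the items arrive as associative arrays with an `id` field:

```php
<?php
// Re-key a list of items by their "id" field, producing the map shape the
// question asks for: "allItems": { "aaa": {...}, "aab": {...} }.
function keyById(array $items): array
{
    $map = [];
    foreach ($items as $item) {
        $id = $item['id'];
        unset($item['id']);
        $map[$id] = $item;
    }
    return $map;
}
```

The same few lines work just as well in client-side JavaScript after the response arrives.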

Converting JSON to Doctrine Entity in Symfony

I'm currently working on building an API with the Symfony framework. I've done enough reading to know to use the Serializer component, and I've built some custom normalizers for my entities. The way it currently works is:
JSON -> Array(Decode) -> User Entity(Denormalize)
This was working fine as long as the request content was a JSON representation of the user, for example:
{
    "email": "demouser@email.com",
    "plainPassword": "demouser",
    "first_name": "Demo",
    "last_name": "User"
}
A user entity is created using the following code in my controller:
$newuser = $this->get('api.serializer.default')->deserialize($request->getContent(), WebsiteUser::class, 'json');
However, I'd like to nest the user JSON in the 'data' property of a JSON object, which will allow consumers to pass additional metadata with the request, example:
{
    "options": [
        {
            "foo": "bar"
        }
    ],
    "data": [
        {
            "email": "demouser@email.com",
            "plainPassword": "demouser",
            "first_name": "Demo",
            "last_name": "User"
        }
    ]
}
The main issue this causes is that the deserialization does not succeed because the JSON format has changed.
The only solution I've considered so far is to json_decode the whole request body, grab the 'data' element of that array, and pass the contents of the data element to the denormalizer (instead of the deserializer).
Is there a better way to solve this problem?
You should be able to pull the 'data' element out of the request body and run it back through the serializer (note that $request->request->get('data') won't work here, since Symfony does not populate $request->request from a JSON body):
$body = json_decode($request->getContent(), true);
$newuser = $this->get('api.serializer.default')->deserialize(
    json_encode($body['data'][0]), WebsiteUser::class, 'json'
);
If you are not able to retrieve the data by key without decoding the request body yourself, look at this bundle; it consists of a single EventListener that replaces the request body after decoding it.
You can easily integrate the same logic into your application, or require the bundle directly (which works well).
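The json_decode route the question already considered is also perfectly workable: extract the data element and hand it to the denormalizer. A sketch; the service name and the single-element data array follow the question's own example:

```php
<?php
// Extract the user payload from the wrapped request body.
function extractUserData(string $json): array
{
    $body = json_decode($json, true);
    return $body['data'][0];
}

// In the controller, the extracted array is then denormalized directly, e.g.:
// $newuser = $this->get('api.serializer.default')
//     ->denormalize(extractUserData($request->getContent()), WebsiteUser::class);
```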

One object | Design considerations

I view my PHP code and JS code as one cohesive unit. I want to begin their interaction by creating an object on the client that looks like the structure below.
By doing this I only have to pass around one object. Sometimes all of the fields are populated, sometimes only 2 or more of the fields are populated.
So, by trading off some wasted object properties, I only have to concern myself with passing o_p to different modules within the MVC on the client and server.
I have functions to convert JavaScript to JSON to PHP.
Is this a valid approach?
Mo.o_p = function (type) {
  return {
    // current result or data about the data
    result : 0,
    // send client data
    client : {
      model : type,
      page : {},
      args : {}
    },
    // returned server data
    server : {
      bookmarks : {},
      tweets : {},
      smalls : {}
    }
  };
};
If your model requires these attributes and being empty is important information for your application, I see no problem there. On the other hand, if your client and server objects are not necessarily connected and are handled by different processes, there would be no need to couple them. Just passing some empty attributes should not be a performance problem.
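On the PHP side of the JavaScript → JSON → PHP conversion the question mentions, the round trip is just a decode/encode pair. A sketch; the key names follow the o_p structure above, and the server-side population is a placeholder:

```php
<?php
// Decode the client's o_p object, let the server fill in its section,
// and send the same single object back to the client.
function handleOp(string $json): string
{
    $o_p = json_decode($json, true);
    $o_p['server']['bookmarks'] = []; // placeholder: filled by the model layer
    $o_p['result'] = 1;
    return json_encode($o_p);
}
```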
