I have read about json_encode, but I still can't work out the logic for applying it to this particular JSON structure.
Assume the JSON structure is as follows:
{
"_id": "23441324",
"api_rev": "1.0",
"type": "router",
"hostname": "something",
"lat": -31.805412,
"lon": -64.424677,
"aliases": [
{
"type": "olsr",
"alias": "104.201.0.29"
}
],
"site": "roof town hall",
"community": "Freifunk/Berlin",
"attributes": {
"firmware": {
"name": "meshkit",
"url": "http:k.net/"
}
}
}
Some of the values will be taken from the database, while others such as "type" and "api_rev" are hardcoded (static). I was thinking of just using string concatenation to build the structure, but I've learnt that's a bad idea. So if I am to use json_encode, how do I handle this structure, i.e. the array dimensions and so on?
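Here is a minimal sketch of how json_encode can produce that structure, assuming $row is an associative array fetched from your database (the column names are illustrative, not from your actual schema):

$node = array(
    '_id'       => $row['node_id'],       // from the database
    'api_rev'   => '1.0',                 // hardcoded
    'type'      => 'router',              // hardcoded
    'hostname'  => $row['hostname'],
    'lat'       => (float) $row['lat'],
    'lon'       => (float) $row['lon'],
    'aliases'   => array(                 // sequential integer keys => JSON array
        array(
            'type'  => 'olsr',
            'alias' => $row['alias_ip'],
        ),
    ),
    'site'      => $row['site'],
    'community' => $row['community'],
    'attributes' => array(                // string keys => JSON object
        'firmware' => array(
            'name' => $row['firmware_name'],
            'url'  => $row['firmware_url'],
        ),
    ),
);

echo json_encode($node, JSON_PRETTY_PRINT);

The rule of thumb: a PHP array with sequential integer keys becomes a JSON array, while an array with string keys becomes a JSON object, so you can build arbitrarily deep structures by nesting arrays and let json_encode handle all quoting and escaping.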
Does anybody have experience using Strapi.io in combination with CodeIgniter?
I'm having difficulty building the JSON in my controller in the shape Strapi expects.
My JSON currently looks like this:
string(102) "{"id":"3","Name":"Paket Anak-anak","Description":"Loren ipsum","Price":"11000","Image":"Capture3.PNG"}"
Whereas what Strapi needs for a POST request looks like this:
{
"data": [
{
"id": 2,
"attributes": {
"Name": "Paket Medium",
"Description": "Loren ipsum qqiqqfqqv",
"Price": 120000,
"Image": null,
"createdAt": "2022-09-17T05:44:18.713Z",
"updatedAt": "2022-09-17T05:44:20.723Z",
"publishedAt": "2022-09-17T05:44:20.720Z"
}
}
]
}
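For what it's worth, a rough sketch of reshaping that flat record into the envelope shown above before encoding, assuming $row is the associative array behind your current output (the Strapi-managed timestamp fields are omitted here):

$payload = [
    'data' => [
        [
            'id'         => (int) $row['id'],
            'attributes' => [
                'Name'        => $row['Name'],
                'Description' => $row['Description'],
                'Price'       => (int) $row['Price'],
                'Image'       => $row['Image'],
            ],
        ],
    ],
];

echo json_encode($payload);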
In DynamoDB I have a table with the following structure.
The "actions" field contains all the information (and this is the field I would like to search in), and orderId is the primary key:
{
"actions": [
{
"actionDescription": "8f23029def1d6baa4",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533730680,
"user": {
"fullName": "XXXXX",
"userName": "xxxxx#xxxx.xxx",
}
},
{
"actionDescription": "21857e61037bc29ec",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731788,
"user": {
"fullName": "XXXXX",
"userName": "xxxxx#xxxx.xxx",
}
},
{
"actionDescription": "cf10abd44e24cef56",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731788,
"user": {
"fullName": "XXXXX",
"userName": "xxxxx#xxxx.xxx",
}
},
{
"actionDescription": "7787fe7a5bf4d22de",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731789,
"user": {
"fullName": "OOOOOO",
"userName": "ooooo#oooo.ooo",
}
},
{
"actionDescription": "9528c439021f504bf",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731789,
"user": {
"fullName": "XXXXX",
"userName": "xxxxx#xxxx.xxx",
}
},
{
"actionDescription": "bfba100e0e54934b2",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731789,
"user": {
"fullName": "XXXXX",
"userName": "xxxxx#xxxx.xxx",
}
},
{
"actionDescription": "f789dc12f1dbe3be2",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731789,
"user": {
"fullName": "OOOOOO",
"userName": "ooooo#oooo.ooo",
}
},
{
"actionDescription": "4cd6b68dfea7cf8ee",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731789,
"user": {
"fullName": "XXXXX",
"userName": "xxxxx#xxxx.xxx",
}
},
{
"actionDescription": "1e3a0e95f8e5106d7",
"actionTitle": "UNDEFINED_ACTION",
"timestamp": 1533731790,
"user": {
"fullName": "OOOOOO",
"userName": "ooooo#oooo.ooo",
}
}
],
"orderId": "13aae31"
}
What I would like to do is build the scan parameters in PHP so that I can search by userName, or by any other field inside the actions array (timestamp, actionTitle, etc.).
Below is one of the many variations I tried, but I was unable to get any results:
$params = [
'TableName' => $this->tableName,
'FilterExpression' => "userName = :searchTerm",
'ExpressionAttributeValues' => [
':searchTerm' => 'ooooo#oooo.ooo',
],
'ReturnConsumedCapacity' => 'TOTAL',
];
$results = $this->dynamoDbClient->scan($params);
Can you please tell me what I'm missing?
Also, please note: I don't want to fetch a specific orderId; I would like to get ALL the orderIds whose actions contain the search term (in this case the userName).
Your best bet with this item schema is to filter the table items yourself. That is to say, scan the table with no filter expression and write your own code to filter the results. Scanning without a filter expression consumes the same number of read capacity units as a filtered scan, because the filter is applied after the items are read.
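A rough sketch of that approach, assuming the AWS SDK for PHP v3 (DynamoDbClient plus its Marshaler) and the item layout shown in the question:

$marshaler = new \Aws\DynamoDb\Marshaler();
$matchingOrderIds = [];
$params = ['TableName' => $this->tableName];

do {
    $result = $this->dynamoDbClient->scan($params);

    foreach ($result['Items'] as $item) {
        $order = $marshaler->unmarshalItem($item);   // convert to a plain PHP array

        foreach ($order['actions'] as $action) {
            if ($action['user']['userName'] === 'ooooo#oooo.ooo') {
                $matchingOrderIds[] = $order['orderId'];
                break;                               // this order already matches
            }
        }
    }

    // Scan results are paginated; keep going until there is no LastEvaluatedKey.
    $params['ExclusiveStartKey'] = $result['LastEvaluatedKey'] ?? null;
} while (!empty($params['ExclusiveStartKey']));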
You can set the filter expression to something like the following, but this isn't scalable and only works if you have a fixed number of items in the actions list:
actions[0].user.userName = :searchTerm OR actions[1].user.userName = :searchTerm OR actions[2].user.userName = :searchTerm OR ...
If you need complex search abilities you are probably better off using a dedicated search database. AWS provides two services around this, AWS CloudSearch and AWS ElasticSearch. You can use DynamoDB streams to keep your search indexes up to date.
If you are set on scanning the DynamoDB table with a filter, you can refactor your item structure to include additional attributes that hold all the searchable information in a set (or a concatenated string):
{
"actions": [....],
"actionsDescriptions": Set["8f23029def1d6baa4", "21857e61037bc29ec", "cf10abd44e24cef56", "7787fe7a5bf4d22de", "9528c439021f504bf", "bfba100e0e54934b2", "f789dc12f1dbe3be2", "4cd6b68dfea7cf8ee", "1e3a0e95f8e5106d7"],
"actionTitles": Set["UNDEFINED_ACTION"],
"timestamps": Set[1533730680, 1533731788, 1533731789, 1533731790],
"user_fullNames": Set["XXXXX"],
"user_userNames": Set["ooooo#oooo.ooo", "xxxxx#xxxx.xxx"],
"orderId": "13aae31"
}
Notice you have to use a set (or concatenate all the values into a string), since the contains function only works on strings and sets.
Then you can use a filter expression like this:
contains(user_userNames, :searchTerm)
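With the low-level client, the scan would then look roughly like this (note that with the raw client the expression attribute values have to be given in DynamoDB's typed format):

$params = [
    'TableName'                 => $this->tableName,
    'FilterExpression'          => 'contains(user_userNames, :searchTerm)',
    'ExpressionAttributeValues' => [
        ':searchTerm' => ['S' => 'ooooo#oooo.ooo'],   // 'S' marks a string value
    ],
    'ReturnConsumedCapacity'    => 'TOTAL',
];
$results = $this->dynamoDbClient->scan($params);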
The DynamoDB QueryFilter and ScanFilter options do not currently support the CONTAINS operator for maps. You'll need to build another lookup table indexed by userName to avoid scanning the entire table.
E.g. new table schema:
{
"userName": "xxxxx#xxxx.xxx"
"orderId": "13aae31"
}
Where the hash key is userName and orderId is the ID of an order in the other table.
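A sketch of what querying that lookup table might look like, assuming userName as the hash key and orderId as the range key so that one user can appear on many orders (the table name is illustrative):

$result = $this->dynamoDbClient->query([
    'TableName'                 => 'OrdersByUserName',   // hypothetical lookup table
    'KeyConditionExpression'    => 'userName = :u',
    'ExpressionAttributeValues' => [
        ':u' => ['S' => 'ooooo#oooo.ooo'],
    ],
]);
// Each returned item carries the orderId of an order involving that user.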
The closest you can get with the current schema is to follow @cementblocks's suggestions: scan the whole table and filter application-side, or query each element in the list individually.
If you are adding a search-like feature to your application, then scanning may not be the best approach.
A DynamoDB scan can be expensive and slow, especially when you have many items.
So if you intend to add a search feature, consider AWS CloudSearch. It is a scalable, managed search service, and you can quickly index a DynamoDB table into it.
I am getting the following JSON array after applying my query logic:
[
{
"id": "3",
"diag_name": "LT Diagnostics",
"test_name": "Alk PO4",
"booking_date": "2018-05-20"
},
{
"id": "3",
"diag_name": "LT Diagnostics",
"test_name": "CRP",
"booking_date": "2018-05-20"
},
{
"id": "4",
"diag_name": "Seepz Diagnostics",
"test_name": "Alk PO4",
"booking_date": "2018-05-21"
}
]
But I want the JSON grouped as shown below:
[
{
"diag_name": "LT Diagnostics",
"test_name": [
{
"id": "3",
"name" : "Alk PO4"
},
{
"id": "3",
"name" : "CRP"
}
],
"booking_date": "2018-05-20"
},
{
"diag_name": "Seepz Diagnostics",
"test_name": [
{
"id": "4",
"name" : "Alk PO4"
}
],
"booking_date": "2018-05-21"
}
]
I can't work out how to do this in PHP. I want a more consolidated JSON format.
Have you tried changing your SQL query to group by diag_name and booking_date? That would be the first step I’d employ to get the outer data.
Formatting the data in the nested manner you’re after could be a function of whatever record serializer you’re using — does it support nested JSON as a return type, or only flat JSON as your example return value shows?
If the record set -> JSON serializer only ever returns flat data, the comments above are correct that you will have to write your own formatter to change the shape of the JSON yourself...
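A minimal sketch of such a formatter, assuming $rows is the flat array already decoded from the query result shown in the question:

$grouped = [];

foreach ($rows as $row) {
    // One output entry per diagnostic centre + booking date combination.
    $key = $row['diag_name'] . '|' . $row['booking_date'];

    if (!isset($grouped[$key])) {
        $grouped[$key] = [
            'diag_name'    => $row['diag_name'],
            'test_name'    => [],
            'booking_date' => $row['booking_date'],
        ];
    }

    $grouped[$key]['test_name'][] = [
        'id'   => $row['id'],
        'name' => $row['test_name'],
    ];
}

// array_values() drops the temporary string keys so json_encode emits a JSON array.
echo json_encode(array_values($grouped), JSON_PRETTY_PRINT);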
The accepted answer of this other question may be of help:
Create multi-level JSON with PHP and MySQL
I'm not a PHP guy, but this is a typical scenario for functional programming: map/reduce over the rows to build the grouped structure.
Looking online, I found this article that could help you.
Changing the data source's output is not always (indeed, seldom) a viable option.
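In PHP, that map/reduce idea might look roughly like this (a sketch, independent of the linked article, again assuming $rows is the decoded flat array):

$grouped = array_reduce($rows, function (array $acc, array $row) {
    $key = $row['diag_name'] . '|' . $row['booking_date'];
    $acc[$key] = $acc[$key] ?? [
        'diag_name'    => $row['diag_name'],
        'test_name'    => [],
        'booking_date' => $row['booking_date'],
    ];
    $acc[$key]['test_name'][] = ['id' => $row['id'], 'name' => $row['test_name']];
    return $acc;
}, []);

echo json_encode(array_values($grouped));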
Enjoy coding
So this is my first time using JSON Schema and I have a fairly basic question about requirements.
My top level schema is as follows:
schema.json:
{
"id": "http://localhost/srv/schemas/schema.json",
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"properties": {
"event": { "$ref": "events_schema.json#" },
"building": { "$ref": "buildings_schema.json#" }
},
"required": [ "event" ],
"additionalProperties": false
}
I have two other schema definition files (events_schema.json and buildings_schema.json) that have object field definitions in them. The one of particular interest is buildings_schema.json.
buildings_schema.json:
{
"id": "http://localhost/srv/schemas/buildings_schema.json",
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "buildings table validation definition",
"type": "object",
"properties": {
"BuildingID": {
"type": "integer",
"minimum": 1
},
"BuildingDescription": {
"type": "string",
"maxLength": 255
}
},
"required": [ "BuildingID" ],
"additionalProperties": false
}
I am using this file to test my validation:
test.json:
{
"event": {
"EventID": 1,
"EventDescription": "Some description",
"EventTitle": "Test title",
"EventStatus": 2,
"EventPriority": 1,
"Date": "2007-05-05 12:13:45"
},
"building": {
"BuildingID": 1,
}
}
Which passes validation fine. But when I use the following:
test2.json:
{
"event": {
"EventID": 1,
"EventDescription": "Some description",
"EventTitle": "Test title",
"EventStatus": 2,
"EventPriority": 1,
"Date": "2007-05-05 12:13:45"
}
}
I get the error: [building] the property BuildingID is required
Inside my buildings_schema.json file I have the line "required": [ "BuildingID" ], which is what causes the error. It appears that schema.json is traversing down the property definitions and enforcing all the requirements. This is counterintuitive, and I would like it to ONLY enforce a requirement if its parent property is present.
I have a few ways around this that involve arrays and fundamentally changing the structure of the JSON, but that kind of defeats the purpose of my attempt to validate existing JSON. I have read over the documentation (/sigh) and have not found anything relating to this issue. Is there some simple requirement-inheritance setting I am missing?
I am using the Json-Schema for PHP implementation from here: https://github.com/justinrainbow/json-schema
After messing with different validators, it appears to be an issue with this particular validator: it enforces required properties through references even when the referencing property is absent. I fixed this by simply breaking the main schema apart into subschemas and only validating against the buildings subschema when necessary.
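For reference, a rough sketch of that workaround with justinrainbow/json-schema, assuming a version where the entry point is Validator::validate() (file paths are illustrative):

use JsonSchema\Validator;

$data = json_decode(file_get_contents('test2.json'));

// Always validate the event part against its own subschema.
$eventValidator = new Validator();
$eventValidator->validate($data->event, (object) ['$ref' => 'file://' . realpath('events_schema.json')]);

// Only bring in the buildings subschema when the property is actually present.
if (isset($data->building)) {
    $buildingValidator = new Validator();
    $buildingValidator->validate($data->building, (object) ['$ref' => 'file://' . realpath('buildings_schema.json')]);
}

foreach ($eventValidator->getErrors() as $error) {
    printf("[%s] %s\n", $error['property'], $error['message']);
}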
I have the following MongoDB document structure:
{
"geo": {
"0": {
"deliveryArea": {
"0": {
"lat": 50.449234773334,
"lng": 30.52300029031
},
"1": {
"lat": 50.449234773334,
"lng": 30.52542980606
},
"2": {
"lat": 50.45154573136,
"lng": 30.52542980606
},
"3": {
"lat": 50.45154573136,
"lng": 30.52300029031
}
},
"title": "Kiev, ....",
"coords": {
"lat": "50.4501",
"lngt": "30.523400000000038"
},
"wholeCityDelivery": "false"
}
}
}
Questions:
How do I ensure_index for every element in deliveryArea [lat, lng] for geolocation?
How do I build a query to find documents where a point N[lat, lng] lies inside (belongs to) the deliveryArea polygon?
A PHP example would be appreciated.
Thanks!
First of all, you need to change your data structure. MongoDB wants "longitude, latitude" order, and it doesn't care which names you give to the fields; in general, I would store the coordinates without field names at all. You also need to store your document without the "0" keys.
So like:
{
"geo": {
"deliveryArea": [
[ 30.52300029031, 50.449234773334 ],
[ 30.52542980606, 50.449234773334 ],
[ 30.52542980606, 50.45154573136 ],
[ 30.52300029031, 50.45154573136 ]
]
}
}
Then you need to set a 2D index on "geo.deliveryArea":
$collection->ensureIndex( array( "geo.deliveryArea" => "2d" ) );
Information on how to build queries can be found at http://www.mongodb.org/display/DOCS/Geospatial+Indexing#GeospatialIndexing-BoundsQueries. However, in your case you want to find a delivery area for a given point, instead of checking whether stored points fall inside a given area, and MongoDB cannot do that directly, as its 2d index can only store points.
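For reference, a bounds query against the indexed points looks roughly like this with the legacy PHP driver (the box coordinates are illustrative, in longitude/latitude order):

$points = $collection->find( array(
    "geo.deliveryArea" => array(
        '$within' => array(
            '$box' => array(
                array( 30.52, 50.44 ),   // lower-left corner
                array( 30.53, 50.46 ),   // upper-right corner
            ),
        ),
    ),
) );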
Please file a feature request at http://jira.mongodb.org