DynamoDB updateItem creates new record - php

It updates the data fine if the session exists.
But when the session does not exist, it creates a new row. In that case I don't want any action to be taken.
$sessionId = $params['session_id'];
$response = $this->dbo->updateItem([
    'TableName' => $this->tableName,
    'Key' => [
        'id' => ['S' => $sessionId]
    ],
    'ExpressionAttributeValues' => [
        ':val1' => ['S' => date('Y-m-d H:i:s')]
    ],
    'UpdateExpression' => 'set sessionEnd = :val1',
    'attributeExists' => 'id',
    'ReturnValues' => 'UPDATED_NEW'
]);

The UpdateItem API creates a new item if one doesn't exist. You can use a ConditionExpression to stop it from creating the new item. From the documentation:
Edits an existing item's attributes, or adds a new item to the table
if it does not already exist.
With the condition below, the update only happens if an item with that id is already present, and no new item is created.
'ConditionExpression' => 'attribute_exists(id)',
If the key is not found, the API throws the following exception:
"code": "ConditionalCheckFailedException",

Related

How to add a value into an existing array without replacing the existing array data? [duplicate]

Here I used the updateOrCreate method to post data, but when I use this method the old data is replaced by the new data. I want to add new data without replacing the existing data.
Here is the code that inserts the data:
$booking = Bookings::updateOrCreate(
    ['schedules_id' => $schedules_id], // match the row based on this array
    [ // update these columns
        'buses_id' => $buses_id,
        'routes_id' => $routes_id,
        'seat' => json_encode($seat),
        'price' => $request->price,
        'profile' => 'pending',
    ]
);
I solved it myself: I stored the old data in $extSeat and then merged it with the new data. I don't know whether the logic is correct, but it works.
$extSeat = DB::table('bookings')
    ->where('schedules_id', $schedules_id) // fetch the seats of this schedule, not just the first row in the table
    ->select('seat')
    ->first();
$extSeat = explode(",", $extSeat->seat);
$booking = Bookings::updateOrCreate(
    ['schedules_id' => $schedules_id], // row to test if the schedule id matches an existing schedule id
    [ // update these columns
        'buses_id' => $buses_id,
        'routes_id' => $routes_id,
        'seat' => implode(",", array_merge($seat, $extSeat)),
        'price' => $request->price,
        'profile' => 'pending',
    ]
);
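A slightly tidier variant, as a sketch (assuming the Bookings model keeps seat as a comma-separated string and these columns are mass assignable), is to load the matching booking first and only merge when it already exists:
// Find the booking for this schedule, or start a new one if none exists yet
$booking = Bookings::firstOrNew(['schedules_id' => $schedules_id]);
$existingSeats = $booking->exists ? explode(',', $booking->seat) : [];
$booking->fill([
    'buses_id' => $buses_id,
    'routes_id' => $routes_id,
    // array_unique avoids storing the same seat twice
    'seat' => implode(',', array_unique(array_merge($existingSeats, $seat))),
    'price' => $request->price,
    'profile' => 'pending',
]);
$booking->save();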

Dynamodb update table data not working

I want to update data using 'typeId', but 'typeId' is not the table's primary key.
The code works if we use the actual primary key; with 'typeId' I am unable to update the record and get the following error:
{"__type":"com.amazon.coral.validate#ValidationException","message":"The provided key element does not match the schema"}
$response = $this->dbo->updateItem([
    'TableName' => $this->tableName,
    'Key' => [
        'typeId' => ['S' => "qtwr234"]
    ],
    'ExpressionAttributeValues' => [
        ':val1' => ['N' => '1']
    ],
    'UpdateExpression' => 'set count = :val1',
    'ReturnValues' => 'ALL_NEW'
]);
As the error message says, this happens when the key you pass in the update parameters does not match the table's key schema (its hash/primary key).
Solution:
Run describeTable (listTables only returns table names) and check the key schema that was defined when the table was created, then use that key attribute in your query parameters. To look items up by a non-key attribute such as typeId, you would need a global secondary index (or a scan) to find the item's primary key first.
Thanks
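To inspect the actual key schema from code, something along these lines should work (a sketch, reusing the $this->dbo client from the question):
// List the attribute(s) that make up the table's primary key
$result = $this->dbo->describeTable([
    'TableName' => $this->tableName,
]);
foreach ($result['Table']['KeySchema'] as $keyElement) {
    // e.g. "id => HASH" for a simple hash key
    echo $keyElement['AttributeName'] . ' => ' . $keyElement['KeyType'] . PHP_EOL;
}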

Elasticsearch search delays pulling latest data after indexing

I am using the official PHP driver to connect to Elasticsearch (v2.3). Every time I index a new document, it takes between 5 and 60 seconds before it shows up in my filter results. How can I cut the delay down to zero?
Here is my index query:
# Document Body
$data = [];
$data['time'] = $time;
$data['unique'] = 1;
$data['lastACtivity'] = $time;
$data['bucket'] = 20;
$data['permission'] = $this->_user->permission; # Extracts User Permission
$data['ipaddress'] = $this->_client->ipaddress(); # Extracts User IP Address
# Construct Index
$indexRequest = [
    'index' => 'gorocket',
    'type' => 'log',
    'refresh' => true,
    'body' => $data
];
# Indexing Document
$confirmation = $client->index($indexRequest);
And here is my search filter query:
# Query array
$query = [
    'query' => [
        'filtered' => [
            'filter' => [
                'bool' => [
                    'must' => [
                        [
                            'match' => [ 'unique' => 1 ]
                        ],
                        [
                            'range' => [
                                'lastACtivity' => [
                                    'gte' => $from,
                                    'lte' => $to
                                ],
                                '_cache' => false
                            ]
                        ]
                    ],
                    'must_not' => [
                        [ 'match' => [ 'type' => 'share' ] ],
                    ]
                ]
            ]
        ]
    ]
];
# Prepare filter parameters
$filterParams = [
    'index' => 'gorocket',
    'type' => 'log',
    'size' => 20,
    'query_cache' => false,
    'body' => $query
];
$client->search($filterParams);
Thank you.
When you index a new document you can specify the refresh parameter in order to make the new document available immediately for your next search operation.
$params = [
    'index' => 'my-index',
    'type' => 'my-type',
    'id' => 123,
    'refresh' => true, // <-- add this
    'body' => $data    // your document body, as before
];
$response = $client->index($params);
The refresh parameter is also available on the bulk operation if you're using it.
Be aware, though, that refreshing too often can have negative impacts on performance.
There is also an index-level refresh_interval setting, which takes a value (in seconds) controlling how often the index is refreshed. When you write something to the index, it is not visible to searches until the next refresh.
The refresh parameter can be set to true to refresh the index as soon as a change happens. This needs to be thought through carefully, because it often degrades performance: it is overkill to refresh for every small operation, and many refreshes during bulk writes can keep the index busy.
Tip: use an Elasticsearch plugin such as kopf to inspect and configure options like the refresh interval.
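If you would rather tune the interval than force a refresh on every write, a sketch like this should work with the official PHP client (assuming the 'gorocket' index from the question):
// Lower the refresh interval so new documents become searchable sooner.
// '1s' is the Elasticsearch default; raise it (e.g. '30s') to favour indexing throughput.
$client->indices()->putSettings([
    'index' => 'gorocket',
    'body' => [
        'index' => [
            'refresh_interval' => '1s'
        ]
    ]
]);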

Using elasticsearch, how to create an index for a document that contains an array, and append to that array in the future

In my example code I am using the php client library, but it should be understood by anyone familiar with elasticsearch.
I'm using elasticsearch to create an index where each document contains an array of nGram indexed authors. Initially, the document will have a single author, but as time progresses, more authors will be appended to the array. Ideally, a search could be executed by an author's name, and if any of the authors in the array get matched, the document will be found.
I have been trying to use the documentation here for appending to the array and here for using the array type - but I have not had success getting this working.
First, I want to create an index for documents, with a title, array of authors, and an array of comments.
$client = new Client();
$params = [
    'index' => 'document',
    'body' => [
        'settings' => [
            // Simple settings for now, single shard
            'number_of_shards' => 1,
            'number_of_replicas' => 0,
            'analysis' => [
                'filter' => [
                    'shingle' => [
                        'type' => 'shingle'
                    ]
                ],
                'analyzer' => [
                    'my_ngram_analyzer' => [
                        'tokenizer' => 'my_ngram_tokenizer',
                        'filter' => 'lowercase',
                    ]
                ],
                // Allow searching for partial names with nGram
                'tokenizer' => [
                    'my_ngram_tokenizer' => [
                        'type' => 'nGram',
                        'min_gram' => 1,
                        'max_gram' => 15,
                        'token_chars' => ['letter', 'digit']
                    ]
                ]
            ]
        ],
        'mappings' => [
            '_default_' => [
                'properties' => [
                    'document_id' => [
                        'type' => 'string',
                        'index' => 'not_analyzed',
                    ],
                    // The name, email, or other info related to the person
                    'title' => [
                        'type' => 'string',
                        'analyzer' => 'my_ngram_analyzer',
                        'term_vector' => 'yes',
                        'copy_to' => 'combined'
                    ],
                    'authors' => [
                        'type' => 'list',
                        'analyzer' => 'my_ngram_analyzer',
                        'term_vector' => 'yes',
                        'copy_to' => 'combined'
                    ],
                    'comments' => [
                        'type' => 'list',
                        'analyzer' => 'my_ngram_analyzer',
                        'term_vector' => 'yes',
                        'copy_to' => 'combined'
                    ],
                ]
            ],
        ]
    ]
];
// Create index 'document' with nGram indexing
$client->indices()->create($params);
Right off the bat, I can't even create the index, due to this error:
{"error":"MapperParsingException[mapping [_default_]]; nested: MapperParsingException[No handler for type [list] declared on field [authors]]; ","status":400}
Had that gone through successfully, though, I would plan to index a document, starting with empty arrays for authors and comments, something like this:
$client = new Client();
$params = array();
$params['body'] = array('document_id' => 'id_here', 'title' => 'my_title', 'authors' => [], 'comments' => []);
$params['index'] = 'document';
$params['type'] = 'example_type';
$params['id'] = 'id_here';
$ret = $client->index($params);
return $ret;
This seems like it should work if I had the desired index to add this structure of information to, but what concerns me is appending something to the array using an update. For example:
$client = new Client();
$params = array();
//$params['body'] = array('person_id' => $person_id, 'emails' => [$email]);
$params['index'] = 'document';
$params['type'] = 'example_type';
$params['id'] = 'id_here';
$params['script'] = 'NO IDEA WHAT THIS SCRIPT SHOULD BE TO APPEND TO THE ARRAY';
$ret = $client->update($params);
return $ret;
I am not sure how I would go about actually appending an item to the array and making sure it gets indexed.
Finally, another thing that confuses me is how I could search based on any author in the array, but I'm not 100% sure how that would work. Maybe there is something fundamental about Elasticsearch that I am not understanding. I am completely new to it, so any resources that will get me to a point where these little details don't hang me up would be appreciated.
Also, any direct advice on how to use elasticsearch to solve these problems would be appreciated.
Sorry for the big wall of text. To recap, I am looking for advice on how to:
create an index that supports nGram analysis on all elements of an array,
update that index to append to the array, and
search the now-updated index.
Thanks for any help.
EDIT: thanks to @astax, I am now able to create the index and append to the value as a string. HOWEVER, there are two problems with this:
the array is stored as a string value, so a script like
$params['script'] = 'ctx._source.authors += [\'hello\']';
actually appends a STRING containing [] rather than an array containing a value.
the value inserted does not appear to be nGram analyzed, so a search like this:
$client = new Client();
$searchParams['index'] = 'document';
$searchParams['type'] = 'example_type';
$searchParams['body']['query']['match']['_all'] = 'hello';
$queryResponse = $client->search($searchParams);
print_r($queryResponse); // SUCCESS
will find the new value but a search like this:
$client = new Client();
$searchParams['index'] = 'document';
$searchParams['type'] = 'example_type';
$searchParams['body']['query']['match']['_all'] = 'hel';
$queryResponse = $client->search($searchParams);
print_r($queryResponse); // NO RESULTS
does not
There is no "list" type in Elasticsearch, but you can use the "string" field type and store an array of values.
....
'comments' => [
    'type' => 'string',
    'analyzer' => 'my_ngram_analyzer',
    'term_vector' => 'yes',
    'copy_to' => 'combined'
],
....
And index a document this way:
....
$params['body'] = array(
    'document_id' => 'id_here',
    'title' => 'my_title',
    'authors' => [],
    'comments' => ['comment1', 'comment2']
);
....
As for the script for appending an element to the array, this answer may help you: Elasticsearch upserting and appending to array.
However, do you really need to update the document? It might be easier to just reindex it, as this is exactly what Elasticsearch does internally: it reads the "_source" property, makes the required modification, and reindexes the document. BTW, this means that "_source" must be enabled and all properties of the document should be included in it.
You may also consider storing comments and authors (as I understand it, these are the authors of the comments, not the document authors) as child documents in ES and using a "has_child" filter.
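For reference, an append along the lines of the linked answer might look roughly like this (a sketch, assuming Elasticsearch 1.x with dynamic Groovy scripting enabled and the index/type/id from the question):
// Scripted update: appends a value to the existing authors array (not to a string)
$params = [
    'index' => 'document',
    'type'  => 'example_type',
    'id'    => 'id_here',
    'body'  => [
        'script' => 'ctx._source.authors += author',
        'params' => ['author' => 'New Author Name']
    ]
];
$ret = $client->update($params);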
I can't really give you a specific solution, but I strongly recommend installing the Marvel plugin for Elasticsearch and using its "sense" tool to check how your overall process works, step by step.
So check whether your tokenizer is properly configured by running tests as described at http://www.elastic.co/guide/en/elasticsearch/reference/1.4/indices-analyze.html.
Then check whether your update script is doing what you expect by retrieving the document with GET /document/example_type/some_existing_id.
The authors and comments should be arrays, not strings.
Finally, perform the search:
GET /document/_search
{
  "query": {
    "match": { "_all": "hel" }
  }
}
If you're building the query yourself rather than getting it from the user, you may use query_string with wildcards:
GET /document/_search
{
  "query": {
    "query_string": {
      "fields": ["_all"],
      "query": "hel*"
    }
  }
}
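With the PHP client, the equivalent call would look roughly like this (a sketch using the index and type from the question):
$searchParams = [
    'index' => 'document',
    'type'  => 'example_type',
    'body'  => [
        'query' => [
            'query_string' => [
                'fields' => ['_all'],
                'query'  => 'hel*'  // wildcard query for the partial term
            ]
        ]
    ]
];
$queryResponse = $client->search($searchParams);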
