Elasticsearch geo_point query filter - PHP

I am trying to run a geo-distance query, but the mapping always shows my location field as type double rather than geo_point. How can I get location mapped as geo_point? I need to find all records within 5 km, sorted by distance. This is my current mapping:
"pin" : {
"properties" : {
"location" : {
"properties" : {
"lat" : {
"type" : "double"
},
"lon" : {
"type" : "double"
}
}
},
"type" : {
"type" : "string"
}
}
},
I am searching with the query below to find results within 5 km of a Delhi lat/lon:
{
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "pin": {
          "distance": "5",
          "distance_unit": "km",
          "coordinate": {
            "lat": 28.5402707,
            "lon": 77.2289643
          }
        }
      }
    }
  }
}
It fails with a query_parsing_exception: "No query registered for [pin]". I cannot figure out the problem; this is the full response:
{
  "error": {
    "root_cause": [
      {
        "type": "query_parsing_exception",
        "reason": "No query registered for [pin]",
        "index": "find_index",
        "line": 1,
        "col": 58
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "find_index",
        "node": "DtpkEdLCSZCr8r2bTd8p5w",
        "reason": {
          "type": "query_parsing_exception",
          "reason": "No query registered for [pin]",
          "index": "find_index",
          "line": 1,
          "col": 58
        }
      }
    ]
  },
  "status": 400
}
Please help me figure this out. How can I map location as geo_point and resolve this query_parsing_exception, the 400 status, and the "all shards failed" error?

You simply need to map your pin type like this, i.e. using the geo_point data type:
# 1. first delete your index
DELETE shouut_find_index

# 2. create a new one with the proper mapping
PUT shouut_find_index
{
  "mappings": {
    "search_shouut": {
      "properties": {
        "location": {
          "type": "geo_point"
        },
        "type": {
          "type": "string"
        }
      }
    }
  }
}
Then you can index a document like this:
PUT shouut_find_index/search_shouut/1
{
  "location": {
    "lat": 28.5402707,
    "lon": 77.2289643
  },
  "type": "dummy"
}
And finally your query can look like this:
POST shouut_find_index/search_shouut/_search
{
  "query": {
    "filtered": {
      "filter": {
        "geo_distance": {
          "distance": "5km",
          "location": {
            "lat": 28.5402707,
            "lon": 77.2289643
          }
        }
      }
    }
  }
}
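Since the question is tagged PHP and also asks for the matches sorted, here is a minimal sketch of how the same request could be sent with the official elasticsearch-php client. The _geo_distance sort, the host and the client setup are assumptions on top of the answer above; the filtered syntax matches the 2.x-era cluster implied by the question:

<?php
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

// Connect to a local node (adjust the host as needed).
$client = ClientBuilder::create()->setHosts(['localhost:9200'])->build();

$point = ['lat' => 28.5402707, 'lon' => 77.2289643];

$params = [
    'index' => 'shouut_find_index',
    'type'  => 'search_shouut',
    'body'  => [
        // Keep only documents within 5 km of the point...
        'query' => [
            'filtered' => [
                'filter' => [
                    'geo_distance' => [
                        'distance' => '5km',
                        'location' => $point,
                    ],
                ],
            ],
        ],
        // ...and sort them by distance, nearest first.
        'sort' => [
            ['_geo_distance' => ['location' => $point, 'order' => 'asc', 'unit' => 'km']],
        ],
    ],
];

$response = $client->search($params);

With the _geo_distance sort, each hit's sort value contains its distance from the query point in the requested unit.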

Related

Searching for exact phrase with synonyms

I am trying to build a query that combines exact phrase matching with synonyms, and I can't figure it out. Also, when using the wildcard approach, I don't know how to apply fuzziness; is that even possible with wildcards? It would be great to get the same results for the terms "call of duty", "cod" or "call of dutz".
I have created this index:
PUT exact_search
{
  "settings": {
    "index": {
      "number_of_shards": "1",
      "number_of_replicas": "0",
      "analysis": {
        "analyzer": {
          "analyzer_exact": {
            "type": "custom",
            "tokenizer": "keyword",
            "filter": [
              "lowercase",
              "icu_folding",
              "synonyms"
            ]
          }
        },
        "filter": {
          "synonyms": {
            "type": "synonym",
            "synonyms_path": "synonyms.txt"
          }
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "name": {
        "type": "keyword",
        "fields": {
          "analyzer_exact": {
            "type": "text",
            "analyzer": "analyzer_exact"
          }
        }
      }
    }
  }
}
And I fill it with these items:
POST exact_search/_doc/1
{
  "name": "Hoodie Call of Duty"
}

POST exact_search/_doc/2
{
  "name": "Call of Duty 2"
}

POST exact_search/_doc/3
{
  "name": "Call of Duty: Modern Warfare 2"
}

POST exact_search/_doc/4
{
  "name": "COD: Modern Warfare 2"
}

POST exact_search/_doc/5
{
  "name": "Call of duty"
}

POST exact_search/_doc/6
{
  "name": "Call of the sea"
}

POST exact_search/_doc/7
{
  "name": "Heavy Duty"
}
synonyms.txt looks like this:
cod,call of duty
What I am trying to achieve is to get all the results (except "Call of the sea" and "Heavy Duty") when I search for "call of duty" or "cod".
So far I have constructed this query, but it does not work as expected with the "cod" search term (the term "call of duty" works fine):
GET exact_search/_search
{
  "explain": false,
  "query": {
    "bool": {
      "must": [
        {
          "wildcard": {
            "name.analyzer_exact": {
              "value": "*cod*"
            }
          }
        }
      ]
    }
  }
}
But the result is only two items:
{
  "took" : 2,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 2,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "exact_search",
        "_id" : "4",
        "_score" : 1.0,
        "_source" : {
          "name" : "COD: Modern Warfare 2"
        }
      },
      {
        "_index" : "exact_search",
        "_id" : "5",
        "_score" : 1.0,
        "_source" : {
          "name" : "Call of duty"
        }
      }
    ]
  }
}
It looks like the synonyms are working, because it returns the "Call of duty" game, but it ignores the wildcards; it won't return "Call of Duty 2", for example.
I need exact phrase matching, because I don't want results like "Heavy Duty" or "Call of the sea" (where only the words "call" and "duty" match).
Thank you for pointing me in the right direction.
I doubt that the analyzer_exact analyzer generates the synonym tokens while it uses "tokenizer": "keyword", since the keyword tokenizer emits the whole field value as a single token, so the multi-word synonyms will rarely match.
I would change a few things to make it work.
First, change the tokenizer from keyword to standard:
"analyzer_exact": {
"type": "custom",
"tokenizer": "standard",
"filter": [
"lowercase",
"synonyms"
]
}
Then I would use match_phrase to eliminate names other than "call of duty" and "cod":
{
  "match_phrase": {
    "name.analyzer_exact": "cod"
  }
}
Response after the changes:
{
  "hits": {
    "hits": [
      {
        "_source": {
          "name": "Call of duty"
        }
      },
      {
        "_source": {
          "name": "COD: Modern Warfare 2"
        }
      },
      {
        "_source": {
          "name": "Call of Duty 2"
        }
      },
      {
        "_source": {
          "name": "Hoodie Call of Duty"
        }
      },
      {
        "_source": {
          "name": "Call of Duty: Modern Warfare 2"
        }
      }
    ]
  }
}
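If you are sending this from PHP, as in the question at the top of this page, the same request maps directly onto an array body for the elasticsearch-php client. A minimal sketch, where the host and the client setup are assumptions:

<?php
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()->setHosts(['localhost:9200'])->build();

// Phrase match on the analyzed subfield, so the synonyms defined in
// analyzer_exact (cod <-> call of duty) are applied to the search term too.
$response = $client->search([
    'index' => 'exact_search',
    'body'  => [
        'query' => [
            'bool' => [
                'must' => [
                    ['match_phrase' => ['name.analyzer_exact' => 'cod']],
                ],
            ],
        ],
    ],
]);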

Deleting documents older than one hour is not working in Elasticsearch

I am new to Elasticsearch. I have a document in Elasticsearch that contains thousands of user views, and now I want to delete the views that are older than 3 hours. For this purpose I wrote the following query:
POST {INDEX}/_delete_by_query
{
  "query": {
    "bool": {
      "must": [
        {
          "term": {
            "type": "box_views"
          }
        },
        {
          "query": {
            "range": {
              "#created_at": {
                "gte": "now-3h"
              }
            }
          }
        }
      ]
    }
  }
}
When I execute this query I receive the following error:
{ "error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "no [query] registered for [query]",
"line": 1,
"col": 66
}
],
"type": "parsing_exception",
"reason": "no [query] registered for [query]",
"line": 1,
"col": 66 }, "status": 400 }
Your query should look like this:
POST {INDEX}/_delete_by_query
{
  "query": {
    "bool": {
      "must": [
        {
          "term": {
            "type": "box_views"
          }
        },
        {
          "range": {
            "#created_at": {
              "gte": "now-3h"
            }
          }
        }
      ]
    }
  }
}
Besides, since you want the older documents, I think you should use lte instead of gte: "gte": "now-3h" matches views from the last three hours, whereas "lte": "now-3h" matches those older than three hours.
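For completeness, the same corrected delete-by-query can be issued from PHP with the elasticsearch-php client. A minimal sketch, in which the index name, the host and the use of lte (as suggested above) are assumptions:

<?php
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()->setHosts(['localhost:9200'])->build();

// Delete box_views documents whose #created_at is older than three hours.
$response = $client->deleteByQuery([
    'index' => 'your_index',
    'body'  => [
        'query' => [
            'bool' => [
                'must' => [
                    ['term'  => ['type' => 'box_views']],
                    ['range' => ['#created_at' => ['lte' => 'now-3h']]],
                ],
            ],
        ],
    ],
]);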

Elasticsearch "Failed to JSON encode" issue

I am working with Elasticsearch and I have about 1K phone numbers. When I pass this array of phone numbers to Elasticsearch to search users by phone number, it throws this exception:
Failed to JSON encode /var/app/current/vendor/elasticsearch/elasticsearch/src/Elasticsearch/Serializers/SmartSerializer.php
Below is how I initialize my Elasticsearch client:
$client = ClientBuilder::create()->setHosts([$host])->build();
And here is the query I am working with in Elasticsearch:
{
  "_source": [
    "id"
  ],
  "query": {
    "bool": {
      "must": [
        {
          "term": {
            "type": "user"
          }
        },
        {
          "bool": {
            "should": [
              {
                "prefix": {
                  "phone": {
                    "value": "923047698099"
                  }
                }
              },
              {
                "prefix": {
                  "phone": {
                    "value": "92313730320"
                  }
                }
              },
              ...
            ]
          }
        }
      ],
      "must_not": [
        {
          "has_child": {
            "type": "blocked",
            "query": {
              "term": {
                "user_id": "u-2"
              }
            }
          }
        },
        {
          "has_child": {
            "type": "block",
            "query": {
              "term": {
                "user_id": "u-2"
              }
            }
          }
        },
        {
          "term": {
            "db_id": 2
          }
        }
      ]
    }
  }
}
I don't know where I am making the mistake, whether in the client initialization or in the Elasticsearch query. I searched for this issue but found no useful solution, or maybe I didn't understand it clearly. I am still stuck on how to solve this problem; please suggest any useful link or solution.
Thanks
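For reference, this is roughly how such a body can be assembled with the elasticsearch-php client before calling search(). It is only a sketch: $phones, the index name and the host are hypothetical. Since the SmartSerializer mentioned in the error ultimately calls json_encode(), one common cause of "Failed to JSON encode" is a value in the body that is not valid UTF-8, so the phone-number strings are worth checking:

<?php
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()->setHosts(['localhost:9200'])->build();

// $phones is assumed to hold the ~1K phone numbers as strings.
$phones = ['923047698099', '92313730320' /* , ... */];

// Build one prefix clause per phone number, skipping anything that
// json_encode() would reject because it is not valid UTF-8.
$should = [];
foreach ($phones as $phone) {
    if (!mb_check_encoding($phone, 'UTF-8')) {
        continue;
    }
    $should[] = ['prefix' => ['phone' => ['value' => $phone]]];
}

$response = $client->search([
    'index' => 'users_index',
    'body'  => [
        '_source' => ['id'],
        'query'   => [
            'bool' => [
                'must' => [
                    ['term' => ['type' => 'user']],
                    ['bool' => ['should' => $should]],
                ],
                'must_not' => [
                    ['has_child' => ['type' => 'blocked', 'query' => ['term' => ['user_id' => 'u-2']]]],
                    ['has_child' => ['type' => 'block', 'query' => ['term' => ['user_id' => 'u-2']]]],
                    ['term' => ['db_id' => 2]],
                ],
            ],
        ],
    ],
]);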

geo_point for location field within array field

I have data in Elasticsearch like this:
"id": "edff12sd3"
"main_array": [
{
"id": "2308",
"name": "Grey Area",
"location": {
"lat": 28.5696577,
"lon": 77.3229933
}
}
,
{
"id": "2274",
"name": "Tribute to The Beatles by Atul Ahuja- Live Music",
"location": {
"lat": 29.5696577,
"lon": 77.3229933
}
}
Now I want to map the location field as geo_point. I have tried it this way:
{
  "mappings": {
    "search_data": {
      "properties": {
        "main_array.location": {
          "type": "geo_point"
        }
      }
    }
  }
}
but it throws this error:
"type": "mapper_parsing_exception",
"reason": "Field name [main_array.location] cannot contain '.'"
Can you please help me out? Thanks.
location is a property of the objects you store in main_array, so try like this:
{
  "mappings": {
    "search_data": {
      "properties": {
        "main_array": {
          "properties": {
            "location": {
              "type": "geo_point"
            }
          }
        }
      }
    }
  }
}
Note that as of ES 2.0 field names may not contain dots
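If you are creating the index from PHP, the same mapping can be passed through the client's indices() namespace. A minimal sketch, where the index name and the host are assumptions:

<?php
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()->setHosts(['localhost:9200'])->build();

// Create the index with location mapped as geo_point inside the main_array objects.
$response = $client->indices()->create([
    'index' => 'search_index',
    'body'  => [
        'mappings' => [
            'search_data' => [
                'properties' => [
                    'main_array' => [
                        'properties' => [
                            'location' => ['type' => 'geo_point'],
                        ],
                    ],
                ],
            ],
        ],
    ],
]);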

Elasticsearch sorting only by alphabetical not by numeric

I am having a problem with sorting in PHP. Here is my mapping:
{
  "jdbc": {
    "mappings": {
      "jdbc": {
        "properties": {
          "admitted_date": {
            "type": "date",
            "format": "dateOptionalTime"
          },
          "et_tax": {
            "type": "string"
          },
          "jt_tax": {
            "type": "string"
          },
          "loc_cityname": {
            "type": "string"
          },
          "location_countryname": {
            "type": "string"
          },
          "location_primary": {
            "type": "string"
          },
          "pd_firstName": {
            "type": "string"
          }
        }
      }
    }
  }
}
When I sort the results, they come back in alphanumeric order, with the non-letter entries loaded first; I need them ordered purely by the first letter of the name. For example, this request:

http://localhost:9200/jdbc/_search?pretty=true&sort=pd_lawFirmName:asc

currently returns:

BM&A
Gomez-Acebo & Pombo
Addleshaw Goddard
How can I order the results like this?
Addleshaw Goddard
BM&A
Gomez-Acebo & Pombo
Here is the configuration I am using for indexing:
{
  "type" : "jdbc",
  "jdbc" : {
    "driver" : "com.mysql.jdbc.Driver",
    "url" : "jdbc:mysql://localhost:3306/dbname",
    "user" : "user",
    "password" : "pass",
    "sql" : "SQL QUERY",
    "poll" : "24h",
    "strategy" : "simple",
    "scale" : 0,
    "autocommit" : true,
    "bulk_size" : 5000,
    "max_bulk_requests" : 30,
    "bulk_flush_interval" : "5s",
    "fetchsize" : 100,
    "max_rows" : 149669,
    "max_retries" : 3,
    "max_retries_wait" : "10s",
    "locale" : "in",
    "digesting" : true,
    "mappings": {
      "sorting": {
        "properties": {
          "pd_lawFirmName": {
            "type": "string",
            "fields": {
              "raw": {
                "type": "string",
                "index": "not_analyzed"
              }
            }
          }
        }
      }
    }
  }
}
This happens because Elasticsearch tokenizes the text with the default analyzer, which is standard, and sorting on an analyzed field is done on those tokens (by default the lowest one for ascending order), not on the original string. For example, "McDermott Will Amery" is indexed as:
"amery",
"mcdermott",
"will"
If you want to sort the way you describe, I would suggest changing the mapping of your pd_lawFirmName field to something like this:
"pd_lawFirmName": {
"type": "string",
"fields": {
"raw": {
"type": "string",
"index": "not_analyzed"
}
}
}
and sort by the raw subfield:
http://localhost:9200/jdbc/_search?pretty=true&sort=pd_lawFirmName.raw:asc
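Since the question mentions sorting from PHP, here is a minimal sketch of the same sort with the elasticsearch-php client; the host and client setup are assumptions:

<?php
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()->setHosts(['localhost:9200'])->build();

// Sort on the not_analyzed raw subfield so the whole firm name is compared.
$response = $client->search([
    'index' => 'jdbc',
    'body'  => [
        // new \stdClass() encodes to the empty JSON object {} rather than [].
        'query' => ['match_all' => new \stdClass()],
        'sort'  => [
            ['pd_lawFirmName.raw' => ['order' => 'asc']],
        ],
    ],
]);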
