Elasticsearch Map case insensitive to not_analyzed documents

I have a type with the following mapping:

PUT /testindex
{
  "mappings": {
    "products": {
      "properties": {
        "category_name": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}

I want to search for an exact word. That's why I set this field as not_analyzed. But the problem is I also want the search to be case insensitive, so it should match regardless of lower case or upper case.

I searched and found a way to make it case insensitive:

curl -XPOST localhost:9200/testindex -d '
{
  "mappings": {
    "products": {
      "properties": {
        "category_name": {
          "type": "string",
          "index": "analyzed",
          "analyzer": "lowercase_keyword"
        }
      }
    }
  }
}'

Is there any way to apply both of these mappings to the same field?

Thanks.

asked Jun 04 '14 by user3683474

People also ask

How do you do a case insensitive search in Elasticsearch?

For this particular use case we can use the lowercase normalizer in Elasticsearch. This does two things: it analyzes the string and converts everything to lowercase, and it does not split the string into multiple tokens, so a single token is produced.
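
For illustration, here is a minimal sketch of that approach. The index and field names are assumed rather than taken from the question, it requires a keyword field with normalizer support (Elasticsearch 5.2+), and the request body shape below targets 7.x and later:

# Sketch only: define a custom lowercase normalizer and attach it to a keyword field.
# Indexed values and term-level queries on the field are lowercased, and the value
# is kept as a single token (no word splitting).
curl -XPUT 'localhost:9200/my_index' -H 'Content-Type: application/json' -d '
{
  "settings": {
    "analysis": {
      "normalizer": {
        "lowercase_normalizer": {
          "type": "custom",
          "filter": ["lowercase"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "category_name": {
        "type": "keyword",
        "normalizer": "lowercase_normalizer"
      }
    }
  }
}'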

Is Elasticsearch case insensitive?

Elasticsearch supports both case-sensitive and case-insensitive search, depending on how the field is analyzed.

Are Elasticsearch field names case sensitive?

Yes. Documents are JSON, and JSON is case sensitive to both field names and data, so Elasticsearch field names are case sensitive. Databases vary in this respect: SQL, by default, is case insensitive to identifiers and keywords but case sensitive to data. Keeping field names in a consistent case also helps prevent field explosions and makes querying and using the Elastic API easier.

Why do we need to use mapping on an Elasticsearch index?

Mapping is the process of defining how a document and its fields are indexed and stored. It defines the type and format of the fields in the documents. As a result, mapping can significantly affect how Elasticsearch searches and stores data.
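
As a small illustration (reusing the testindex name from the question above), you can inspect the mapping Elasticsearch has stored for an index with the _mapping API, which shows the type and analyzer chosen for each field:

# Show the stored mapping (field types, analyzers) for the index.
curl -XGET 'localhost:9200/testindex/_mapping?pretty'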


1 Answer

I think this example meets your needs:

$ curl -XPUT localhost:9200/testindex/ -d '
{
  "settings": {
    "index": {
      "analysis": {
        "analyzer": {
          "analyzer_keyword": {
            "tokenizer": "keyword",
            "filter": "lowercase"
          }
        }
      }
    }
  },
  "mappings": {
    "test": {
      "properties": {
        "title": {
          "analyzer": "analyzer_keyword",
          "type": "string"
        }
      }
    }
  }
}'

taken from here: How to setup a tokenizer in elasticsearch

It uses both the keyword tokenizer and the lowercase filter on a string field, which I believe does what you want.
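
As a rough usage sketch (the document value and query text below are made up, and the ES 1.x-style type URLs from the answer are assumed): index a mixed-case title, then search it with different casing. Both the stored value and the match query text go through analyzer_keyword, so they reduce to the same single lowercased token and the query matches.

# Index a document with a mixed-case title; refresh so it is searchable immediately.
curl -XPUT 'localhost:9200/testindex/test/1?refresh=true' -d '{"title": "Mobile Phones"}'

# The match query text is analyzed with the same analyzer_keyword
# (keyword tokenizer + lowercase filter), so this lowercase query matches.
curl -XPOST 'localhost:9200/testindex/test/_search' -d '
{
  "query": {
    "match": { "title": "mobile phones" }
  }
}'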

answered by John Petrone