- Locate us
- success@ashnik.com
- SG: +65 64383504
- IN: 022 25771219
- IN: 022 25792714
- IN: +91 9987536436
Tips & Tricks
Tip of the Month
Elasticsearch Tokenizers - edge_ngram
Tokenizers split a string into a stream of tokens. The edge_ngram tokenizer emits prefixes of each word, so a query can begin matching while the user is still typing a term — useful for search-as-you-type. The example below produces the following terms for the text "Ashnik":
[as, ash, ashn, ashni, ashnik]
This ensures that partial words are available for matching in the index.
PUT index_name
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "tokenizer": "my_tokenizer",
          "filter": [ "lowercase" ]
        }
      },
      "tokenizer": {
        "my_tokenizer": {
          "type": "edge_ngram",
          "min_gram": 2,
          "max_gram": 6,
          "token_chars": [
            "letter",
            "digit"
          ]
        }
      }
    }
  }
}

POST index_name/_analyze
{
  "analyzer": "my_analyzer",
  "text": "Ashnik"
}
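The tokenizer's behaviour can also be sketched without running Elasticsearch. The short Python function below mimics what edge_ngram does to a single term — emitting lowercased prefixes between min_gram and max_gram characters long (the function name and defaults are illustrative, not part of Elasticsearch):

```python
def edge_ngrams(text, min_gram=2, max_gram=6):
    # Mimic the edge_ngram tokenizer on a single term:
    # emit every prefix whose length is between min_gram and max_gram.
    # Lowercasing stands in for the analyzer's lowercase token filter.
    term = text.lower()
    return [term[:n] for n in range(min_gram, min(max_gram, len(term)) + 1)]

print(edge_ngrams("Ashnik"))  # ['as', 'ash', 'ashn', 'ashni', 'ashnik']
```

This matches the terms listed above and makes it easy to see why a partial query such as "ashn" finds the indexed document.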
- Technical Tip by Monika Agrawal | Solutions Consultant, Ashnik