I'm following the advice given here to find partial words with Elasticsearch:
ElasticSearch n-gram tokenfilter not finding partial words
I've created a simple bash script that attempts to run a version of this:
curl -XDELETE 10.160.86.134:9200/products
curl -XPOST 10.160.86.134:9200/products -d '{
  "index": {
    "number_of_shards": 1,
    "analysis": {
      "filter": {
        "mynGram": {"type": "nGram", "min_gram": 2, "max_gram": 10}
      },
      "analyzer": {
        "a1": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "mynGram"]
        }
      }
    }
  }
}'
curl -XPUT 10.160.86.134:9200/products/_mapping -d '{
  "product": {
    "index_analyzer": "a1",
    "search_analyzer": "standard",
    "properties": {
      "product_description": {"type": "string"},
      "product_name": {"type": "string"}
    }
  }
}'
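To confirm that the settings call actually registered the custom analyzer, I've also been inspecting the index settings like this (a sketch using the standard `_settings` endpoint; the host and port are just my setup):

```shell
# Fetch the index settings to verify that the "a1" analyzer and the
# "mynGram" filter were registered on the products index.
curl -XGET '10.160.86.134:9200/products/_settings?pretty'
```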
After running this script, the first two commands (deleting the products index, then creating it with the analysis settings) seem to work, giving me this:
{"ok":true,"acknowledged":true}
{"ok":true,"acknowledged":true}
Then it errors out on the mapping call, giving me this:
{"error":"ActionRequestValidationException[Validation Failed: 1: mapping type is missing;]","status":500}
Can anyone see what I'm doing wrong? Google starts autocompleting "mapping not found elasticsearch" as soon as I begin typing, so it seems to be a very common error.
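In case it's relevant, here's how I've been sanity-checking the analyzer itself, independent of the mapping (a sketch using the standard `_analyze` API; the analyzer name `a1` comes from the settings above, and the host/port are just my setup):

```shell
# Run a sample string through the custom "a1" analyzer and print the
# emitted tokens; if the analyzer was registered correctly, this
# should show lowercase n-grams of the input.
curl -XGET '10.160.86.134:9200/products/_analyze?analyzer=a1&pretty' -d 'Testing'
```

The analyzer check succeeds for me, which is why I suspect the problem is specific to the mapping call.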