
I have a token filter and analyzer as shown below. However, I am unable to preserve the original token. For example, if I run _analyze on the word saint-louis, I only get back saintlouis, whereas I expect both saintlouis and saint-louis, since I have preserve_original set to true. The ES version I am using is 6.3.2 and the Lucene version is 7.3.1.

"analysis": {
  "filter": {
    "hyphenFilter": {
      "pattern": "-",
      "type": "pattern_replace",
      "preserve_original": "true",
      "replacement": ""
    }
  },
  "analyzer": {
    "whitespace_lowercase": {
      "filter": [
        "lowercase",
        "asciifolding",
        "hyphenFilter"
      ],
      "type": "custom",
      "tokenizer": "whitespace"
    }
  }
}
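
The behavior can be reproduced with the _analyze API; a sketch of the request, assuming a local cluster and an index named my_index that holds the settings above (the index name is a placeholder — a custom analyzer must be called through the index that defines it):

```shell
# Ask the custom analyzer to tokenize "saint-louis".
curl -s -X POST "localhost:9200/my_index/_analyze" \
  -H 'Content-Type: application/json' \
  -d '{"analyzer": "whitespace_lowercase", "text": "saint-louis"}'
# With the settings above, only "saintlouis" comes back;
# the original "saint-louis" is lost despite preserve_original.
```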

1 Answer


So it looks like the pattern_replace token filter does not support preserve_original, at least not on the version I am using.

I made a workaround as follows:

Index definition

{
    "settings": {
        "analysis": {
            "analyzer": {
                "my_analyzer": {
                    "tokenizer": "whitespace",
                    "type": "custom",
                    "filter": [
                        "lowercase",
                        "hyphen_filter"
                    ]
                }
            },
            "filter": {
                "hyphen_filter": {
                    "type": "word_delimiter",
                    "preserve_original": "true",
                    "catenate_words": "true"
                }
            }
        }
    }
}

For example, this will tokenize a word like anti-spam into antispam (hyphen removed, parts concatenated) and anti-spam (the original preserved), along with the split parts anti and spam.
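
What the word_delimiter filter emits here can be sketched in plain Python — a minimal illustration of the preserve_original and catenate_words options for a single hyphenated token, not how Lucene actually implements the filter (the function name and the token ordering are my own):

```python
import re

def word_delimiter_tokens(token, preserve_original=True, catenate_words=True):
    """Roughly emulate Elasticsearch's word_delimiter filter for one token:
    split on non-alphanumeric characters, optionally keep the original
    token and the concatenation of the split parts."""
    parts = [p for p in re.split(r"[^0-9A-Za-z]+", token) if p]
    tokens = []
    if preserve_original and parts != [token]:
        tokens.append(token)            # e.g. "anti-spam"
    tokens.extend(parts)                # e.g. "anti", "spam"
    if catenate_words and len(parts) > 1:
        tokens.append("".join(parts))   # e.g. "antispam"
    return tokens

print(word_delimiter_tokens("anti-spam"))
# ['anti-spam', 'anti', 'spam', 'antispam']
```

These are the same four tokens the real analyzer produces below, though Elasticsearch additionally assigns offsets and positions.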

Using the _analyze API to view the generated tokens

POST /my_index/_analyze (a custom analyzer is defined at the index level, so _analyze must target the index that defines it; my_index is a placeholder name)

{"text": "anti-spam", "analyzer": "my_analyzer"}

Output of the _analyze API, i.e. the generated tokens:

{
    "tokens": [
        {
            "token": "anti-spam",
            "start_offset": 0,
            "end_offset": 9,
            "type": "word",
            "position": 0
        },
        {
            "token": "anti",
            "start_offset": 0,
            "end_offset": 4,
            "type": "word",
            "position": 0
        },
        {
            "token": "antispam",
            "start_offset": 0,
            "end_offset": 9,
            "type": "word",
            "position": 0
        },
        {
            "token": "spam",
            "start_offset": 5,
            "end_offset": 9,
            "type": "word",
            "position": 1
        }
    ]
}
Answered 2020-03-02T19:23:43.050