
I have a Spring Boot application with the Spring Data Elasticsearch dependency in my pom.xml. I created a document class that I want to index:

@Document(indexName = "operations", type = "operation")
public class OperationDocument {

@Id
private Long id;

@Field(
    type = FieldType.String, 
    index = FieldIndex.analyzed, 
    searchAnalyzer = "standard", 
    indexAnalyzer = "standard",
    store = true
)
private String operationName;

@Field(
    type = FieldType.Date, 
    index = FieldIndex.not_analyzed, 
    store = true, 
    format = DateFormat.custom, pattern = "dd.MM.yyyy hh:mm"
)
private Date dateUp;

@Field(
    type = FieldType.String, 
    index = FieldIndex.not_analyzed, 
    store = false
) 
private String someTransientData;

@Field(type = FieldType.Nested)
private List<Sector> sectors;

// Getters and setters
}

I also created a repository for this class:

 public interface OperationDocumentRepository 
      extends ElasticsearchRepository<OperationDocument, Long> {
 }
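
For context, saving through this repository is roughly what my test does (a simplified sketch, not my actual test; the Sector constructor shown here is assumed):

    // Somewhere inside a Spring-wired test class or service (simplified sketch;
    // needs java.util.Date, java.util.Arrays and Spring's @Autowired)
    @Autowired
    private OperationDocumentRepository operationDocumentRepository;

    public void indexSampleOperation() {
        OperationDocument doc = new OperationDocument();
        doc.setOperationName("Second Operation Name");
        doc.setDateUp(new Date());
        doc.setSomeTransientData("Do not index or store");
        // Assumes Sector exposes an (id, sectorName) constructor
        doc.setSectors(Arrays.asList(
            new Sector(2L, "Health Care"),
            new Sector(3L, "Construction")));
        operationDocumentRepository.save(doc); // indexes into "operations"
    }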

I wrote a test that indexes three sample objects through the repository. It's long, so I'll post it only if needed. The point is that the mapping actually created on the ES server ignores the configuration set by the @Field annotations:

"mappings": {
  "operation": {
    "properties": {
      "operationName": {
        "type": "string"
      },
      "dateUp": {
        "type": "long"
      },
      "someTransientData": {
        "type": "string"
      },
      "sectors": {
        "properties": {
          "id": {
            "type": "long"
          },
          "sectorName": {
            "type": "string"
          }
        }
      }
    }
  }
}

There is no information about the analyzers, "someTransientData" is stored and indexed, and dateUp is mapped as long instead of date.

A sample document requested directly from the server:

 {
   "_index": "operations",
   "_type": "operation",
   "_id": "AUyUk2cY3nXeOFxdOlQW",
   "_version": 1,
   "_score": 1,
   "_source": {
     "id": null,
     "operationName": "Second Operation Name",
     "dateUp": 1428421827091,
     "someTransientData": "Do not index or store",
     "sectors": [
       {
         "id": 2,
         "sectorName": "Health Care"
       },
       {
         "id": 3,
         "sectorName": "Construction"
       }
     ]
   }
 }

I also noticed that when I run the application a second time, the following error appears at startup. It is only printed when the index already exists:

ERROR 19452 --- [main] .dersAbstractElasticsearchRepository : failed to load elasticsearch nodes: org.elasticsearch.index.mapper.MergeMappingException: Merge failed with failures {[mapper [someTransientData] has different index values, mapper [someTransientData] has different tokenize values, mapper [someTransientData] has different index_analyzer, object mapping [sectors] can't be changed from non-nested to nested, mapper [operationName] has different store values, mapper [operationName] has different index_analyzer, mapper [dateUp] of different type, current_type [long], merged_type [date]]}

Is this a bug in Spring Data Elasticsearch, or am I doing something wrong?

I have tried both the stable version of spring-data-elasticsearch provided by Spring Boot and the latest snapshot. I have also tried the embedded Elasticsearch server provided by the plugin and an external server running the current version. I always get the same result.


3 Answers


I was finally able to reproduce and solve the problem. The thing is that I was using ElasticsearchTemplate to index and search the documents instead of the repository, since my business logic had become more complex (aggregations and so on).

After that, I deleted the unused OperationDocumentRepository. It turns out the repository is what triggers the type mapping to be posted to the ES server at startup. I assumed having the @Document class was enough, but it is not.

So we have two options here:

  • Keep the OperationDocumentRepository
  • Add this line at application startup (a sketch of how to wire this up follows below):

    elasticsearchTemplate.putMapping(OperationDocument.class);
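
A minimal sketch of the second option, assuming a Spring Boot application where ElasticsearchTemplate is auto-configured (the configuration class and bean names are just illustrative):

    import org.springframework.boot.CommandLineRunner;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;

    @Configuration
    public class ElasticsearchMappingConfig {

        @Bean
        public CommandLineRunner putOperationDocumentMapping(ElasticsearchTemplate elasticsearchTemplate) {
            return args -> {
                // Create the index if it is missing, then push the @Field-based mapping,
                // so the annotations are honored even without a repository interface.
                if (!elasticsearchTemplate.indexExists(OperationDocument.class)) {
                    elasticsearchTemplate.createIndex(OperationDocument.class);
                }
                elasticsearchTemplate.putMapping(OperationDocument.class);
            };
        }
    }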
    
Answered 2015-04-24T07:56:06.790

I tried to reproduce the issue using the spring-data-elasticsearch sample application, and with your configuration above I get the desired result.

The complete code is committed to the project here --> link

Check out the TestCase that creates the index and applies the mapping when the Spring context is loaded --> link (a minimal sketch of such a test is shown below)
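
The linked test itself isn't reproduced here; a minimal version of such a context-loading test could look like this (class names are illustrative, and Application is assumed to be the Boot entry point):

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.SpringApplicationConfiguration;
    import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
    import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

    @RunWith(SpringJUnit4ClassRunner.class)
    @SpringApplicationConfiguration(classes = Application.class)
    public class OperationDocumentMappingTest {

        @Autowired
        private ElasticsearchTemplate elasticsearchTemplate;

        @Test
        public void mappingIsAppliedOnContextLoad() {
            // Index creation and mapping happen while the Spring context starts,
            // because the repository interface is picked up; here we only check that the index exists.
            assertTrue(elasticsearchTemplate.indexExists(OperationDocument.class));
        }
    }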

This is the mapping generated by the TestCase:

  {
  "operations" : {
    "aliases" : { },
    "mappings" : {
      "operation" : {
        "properties" : {
          "dateUp" : {
            "type" : "date",
            "store" : true,
            "format" : "dd.MM.yyyy hh:mm"
          },
          "operationName" : {
            "type" : "string",
            "store" : true,
            "analyzer" : "standard"
          },
          "sectors" : {
            "type" : "nested"
          },
          "someTransientData" : {
            "type" : "string",
            "index" : "not_analyzed"
          }
        }
      }
    },
    "settings" : {
      "index" : {
        "refresh_interval" : "1s",
        "number_of_shards" : "5",
        "store" : {
          "type" : "fs"
        },
        "creation_date" : "1428677234773",
        "number_of_replicas" : "1",
        "version" : {
          "created" : "1040499"
        },
        "uuid" : "-djzLu-IQ0CBs-M6R0-R6Q"
      }
    },
    "warmers" : { }
  }
}

Could you create a similar sample with Spring Boot based on https://github.com/spring-projects/spring-boot/tree/master/spring-boot-samples/spring-boot-sample-data-elasticsearch and commit it somewhere public to share?

Answered 2015-04-10T15:20:43.420

I encountered this error as well with spring-data-elasticsearch and found a solution; the code is below, please check the comments. ES prohibits changing a field's type once the index has been created, although you can still change other attributes such as fielddata. My setup:

  • Spring Data Elasticsearch 3.2.x
  • Elasticsearch 6.8.4
  • Spring Boot 2.2.x

package com.xxxx.xx.es.entity;

import lombok.Builder;
import lombok.Data;
import lombok.EqualsAndHashCode;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.data.elasticsearch.annotations.Mapping;

@EqualsAndHashCode(callSuper = true)
@Data
@Builder
@Document(indexName = "gaming-laptop", indexStoreType = "fs")
@Mapping(mappingPath = "/gaming-laptop-mappings.json")   // use a custom JSON file to define the mapping in ES; the default Spring Data mapping is not applied here even with @Field(type = FieldType.Keyword)
public class GamingLaptop extends BaseEntity {

    @Id
    private String id;

    @Field
    private String cpu;

    @Field(type = FieldType.Keyword)  // this only works in ES 7.6.2+, Spring Data Elasticsearch 4.0.X+
    private String brandName;

    @Field
    private Integer cores;

    @Field
    private Integer threads;

    @Field
    private String coreFrequency;

    @Field
    private String ssd;

    @Field
    private String ram;

    @Field
    private String produceDate;

    @Field
    private Integer version;

    @Field
    private Double price;

    @Field(type = FieldType.Keyword)
    private String description;

}

resources/gaming-laptop-mappings.json

{
    "properties":{
        "_class":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "brandName":{
            "type":"keyword"
        },
        "coreFrequency":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "cores":{
            "type":"long"
        },
        "cpu":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "description":{
            "type":"keyword"
        },
        "gmtCreated":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "gmtModified":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "price":{
            "type":"float"
        },
        "produceDate":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "ram":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "ssd":{
            "type":"text",
            "fields":{
                "keyword":{
                    "type":"keyword",
                    "ignore_above":256
                }
            }
        },
        "threads":{
            "type":"long"
        },
        "version":{
            "type":"long"
        }
    }
}

Alternatively, you can adapt Javier Alvarez's approach using the REST high level client, as in the sketch below: elasticsearchTemplate.putMapping(OperationDocument.class);
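
A minimal sketch of that alternative, assuming Spring Data Elasticsearch 3.2.x with the REST high level client auto-configured by Spring Boot (the runner class name is illustrative):

    import org.springframework.boot.ApplicationArguments;
    import org.springframework.boot.ApplicationRunner;
    import org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate;
    import org.springframework.stereotype.Component;

    @Component
    public class GamingLaptopMappingRunner implements ApplicationRunner {

        private final ElasticsearchRestTemplate elasticsearchRestTemplate;

        public GamingLaptopMappingRunner(ElasticsearchRestTemplate elasticsearchRestTemplate) {
            this.elasticsearchRestTemplate = elasticsearchRestTemplate;
        }

        @Override
        public void run(ApplicationArguments args) {
            // Push the mapping derived from the entity metadata (the @Mapping JSON here)
            // at startup; existing field types still cannot be changed.
            elasticsearchRestTemplate.putMapping(GamingLaptop.class);
        }
    }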

Answered 2020-09-25T06:29:32.480