I have been trying to use elastic4s in my Spark application, but every time it tries to send data to my Elasticsearch node I keep getting:

java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
        at org.elasticsearch.threadpool.ThreadPool.<clinit>(ThreadPool.java:190)
        at org.elasticsearch.client.transport.TransportClient$Builder.build(TransportClient.java:131)
        at com.sksamuel.elastic4s.ElasticClient$.transport(ElasticClient.scala:111)
        at com.sksamuel.elastic4s.ElasticClient$.remote(ElasticClient.scala:92)

I have no idea where to even start debugging this error. The code is fairly simple:

    val elasticAddress = getEnvirometalParameter("streaming_pipeline", "elastic_address")(0)._1
    val uri = ElasticsearchClientUri("elasticsearch://" + elasticAddress)
    val client = ElasticClient.remote(uri)

    def elasticInsert(subject: String, predicate: String, obj: String, label: String) = {
      client.execute {
        update id (label + subject + predicate + obj) in "test" / "quad" docAsUpsert (
          "subject" -> subject,
          "predicate" -> predicate,
          "object" -> obj,
          "label" -> label
        )
      }
    }
1 Answer

The problem is that Elasticsearch and Spark conflict on their Netty versions (and other dependencies). The versions are incompatible, and at runtime that surfaces as exceptions like this one: the `NoSuchMethodError` on Guava's `MoreExecutors.directExecutor()` means Spark's older Guava is shadowing the newer one Elasticsearch needs on the classpath.
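If you need to stay on the `TransportClient`, one common workaround (a sketch, assuming an sbt build with the sbt-assembly plugin; the rename prefix is illustrative) is to shade the conflicting Guava classes in your fat jar so they no longer collide with Spark's copy:

```scala
// build.sbt — sketch, assumes sbt-assembly is on the plugin classpath.
// Renames Guava's packages inside the assembled jar so the version your
// application bundles cannot clash with the Guava that Spark provides.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "shaded.guava.@1").inAll
)
```

Running `sbt dependencyTree` (with the sbt-dependency-graph plugin) first can confirm which Guava and Netty versions are actually being pulled in.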

As of elastic4s 5.3, the best option is to use the HttpClient, which does not depend on Netty, Guava, or the like.
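The upsert above can be rewritten against the HTTP client along these lines (a sketch, assuming elastic4s 5.3+ with the `elastic4s-http` artifact on the classpath; `elasticAddress` is taken from the question's code):

```scala
// Sketch: the same upsert via elastic4s's HTTP client, which talks to
// Elasticsearch over REST and avoids the Netty/Guava transport dependencies.
import com.sksamuel.elastic4s.ElasticsearchClientUri
import com.sksamuel.elastic4s.http.HttpClient
import com.sksamuel.elastic4s.http.ElasticDsl._

val client = HttpClient(ElasticsearchClientUri("elasticsearch://" + elasticAddress))

def elasticInsert(subject: String, predicate: String, obj: String, label: String) = {
  client.execute {
    update(label + subject + predicate + obj).in("test" / "quad").docAsUpsert(
      "subject" -> subject,
      "predicate" -> predicate,
      "object" -> obj,
      "label" -> label
    )
  }
}
```

Since no transport-layer classes are loaded, the `ThreadPool`/`MoreExecutors` clash from the stack trace never comes into play.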

Answered 2017-07-14T22:18:04.180