java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
    at org.elasticsearch.spark.serialization.ReflectionUtils$.org$elasticsearch$spark$serialization$ReflectionUtils$$checkCaseClass(ReflectionUtils.scala:42)
    at org.elasticsearch.spark.serialization.ReflectionUtils$$anonfun$checkCaseClassCache$1.apply(ReflectionUtils.scala:84)

It looks like a Scala version incompatibility, but from the documentation I read, Spark 2.1.0 with Scala 2.11.8 should be fine.

Below is my pom.xml. This is just a test that writes to Elasticsearch from Spark using es-hadoop, and I don't know how to resolve this exception.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>cn.jhTian</groupId>
    <artifactId>sparkLink</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>${project.artifactId}</name>
    <description>My wonderful scala app</description>
    <inceptionYear>2015</inceptionYear>
    <licenses>
        <license>
            <name>My License</name>
            <url>http://....</url>
            <distribution>repo</distribution>
        </license>
    </licenses>

    <properties>
        <encoding>UTF-8</encoding>
        <scala.version>2.11.8</scala.version>
        <scala.compat.version>2.11</scala.compat.version>

    </properties>

    <repositories>
        <repository>
            <id>ainemo</id>
            <name>xylink</name>
            <url>http://10.170.209.180:8081/nexus/content/groups/public/</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.4</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <!--<dependency>-->
            <!--<groupId>org.scala-lang</groupId>-->
            <!--<artifactId>scala-compiler</artifactId>-->
            <!--<version>${scala.version}</version>-->
        <!--</dependency>-->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-reflect</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.6.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>3.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch</groupId>
            <artifactId>elasticsearch-hadoop</artifactId>
            <version>5.3.0</version>
        </dependency>

        <!-- Test -->
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.10</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.specs2</groupId>
            <artifactId>specs2-core_${scala.compat.version}</artifactId>
            <version>2.4.16</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest_${scala.compat.version}</artifactId>
            <version>2.2.4</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

Here is my code:

import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

/**
  * Created by jhTian on 2017/4/19.
  */
object EsWrite {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf()
      .set("es.nodes", "1.1.1.1")
      .set("es.port", "9200")
      .set("es.index.auto.create", "true")
      .setAppName("es-spark-demo")
    val sc = new SparkContext(sparkConf)
    val job1 = Job("C开发工程师","http://job.c.com","c公司","10000")
    val job2 = Job("C++开发工程师","http://job.c++.com","c++公司","10000")
    val job3 = Job("C#开发工程师","http://job.c#.com","c#公司","10000")
    val job4 = Job("Java开发工程师","http://job.java.com","java公司","10000")
    val job5 = Job("Scala开发工程师","http://job.scala.com","java公司","10000")
//    val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
//    val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")
//    val rdd=sc.makeRDD(Seq(numbers,airports))
    val rdd = sc.makeRDD(Seq(job1, job2, job3, job4, job5))
    rdd.saveToEs("job/info") // "index/type" target; serializing the case class triggers the reflection check in the stack trace
    sc.stop()
  }

}
case class Job(jobName: String, jobUrl: String, companyName: String, salary: String)

2 Answers


Usually a NoSuchMethodError means that the code was compiled against a different version of a class than the one found on the classpath at runtime (or that you have multiple versions on the CP).

In your case, I'd guess that es-hadoop was built against a different version of Scala. I haven't used Maven in a while, but the command you need should be mvn dependency:tree. Use its output to see which version of Scala es-hadoop was built with, and then configure your project to use that same Scala version.
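
If the tree shows mixed Scala versions, one way to align them is to pin the Scala artifacts in dependencyManagement, which overrides the versions of transitive dependencies. A minimal sketch, assuming the mismatch is in the transitive scala-library/scala-reflect artifacts:

    <dependencyManagement>
        <dependencies>
            <!-- force every transitive scala-library / scala-reflect to one version -->
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-library</artifactId>
                <version>${scala.version}</version>
            </dependency>
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-reflect</artifactId>
                <version>${scala.version}</version>
            </dependency>
        </dependencies>
    </dependencyManagement>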

To get stable/reproducible builds, I'd suggest using something like the maven-enforcer-plugin:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.4.1</version>
    <executions>
        <execution>
            <id>enforce</id>
            <configuration>
                <rules>
                    <dependencyConvergence/>
                </rules>
            </configuration>
            <goals>
                <goal>enforce</goal>
            </goals>
        </execution>
    </executions>
</plugin>

It may be annoying at first, but once you have sorted out all your dependencies, you should not run into problems like this again.

answered 2017-04-20T08:25:00.337

Use the dependency like this:

<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-20_2.11</artifactId>
    <version>5.2.2</version>
</dependency>

This is for Spark 2.0 and Scala 2.11: the elasticsearch-spark-20_* artifacts are cross-built per Scala version (hence the _2.11 suffix), unlike the all-in-one elasticsearch-hadoop artifact, whose bundled Spark support is compiled against one fixed Scala version.
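
Applied to the pom in the question, this means dropping the elasticsearch-hadoop dependency in favor of the Scala-2.11 build. A minimal sketch, assuming a 5.3.0 build of this artifact is available to match the es-hadoop version used above:

    <!-- replaces elasticsearch-hadoop, whose bundled Spark integration
         is compiled against a single fixed Scala version -->
    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch-spark-20_2.11</artifactId>
        <version>5.3.0</version>
    </dependency>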

answered 2017-06-20T08:09:59.647