
I am trying to connect Spark Streaming to Kafka in a simple application. I built the application from an example in the Spark documentation. When I try to run it, I get this exception:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:59)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
    at producer.KafkaProducer$.main(KafkaProducer.scala:36)
    at producer.KafkaProducer.main(KafkaProducer.scala)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)

Here is my code:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaProducer {

  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder()
      .appName("KafkaSparkStreaming")
      .master("local[*]")
      .getOrCreate()

    val ssc = new StreamingContext(spark.sparkContext, Seconds(3))
    val topics = Array("topic1", "topic2")

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "1",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val lines = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](topics, kafkaParams)
    )
    // a streaming job needs at least one output operation, e.g. print()
    lines.map(_.key()).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

I am not sure whether the problem is in the configuration or in the code itself. This is what my build.sbt file looks like:

scalaVersion := "2.11.4"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0"
)

I would appreciate any help, as I cannot figure out what is wrong!


1 Answer


Following the stack trace of the exception you are getting, we can see that the main problem is:

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4

Indeed:

Spark 2.1.0 contains com.fasterxml.jackson.core as a transitive dependency. So, we do not need to include them in libraryDependencies.

The same holds for the Spark 2.3.0 artifacts you are using. In your build, the Jackson 2.9.4 that triggers the error is most likely pulled in by the explicit "org.apache.kafka" %% "kafka" % "1.1.0" dependency, which clashes with the older Jackson that Spark 2.3.0 was built against.
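
As a concrete sketch, assuming the application only needs the Kafka consumer API that spark-streaming-kafka-0-10 already pulls in transitively, the dependency list could be trimmed to:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  // already depends on the Kafka client library it needs
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0"
)

With the explicit kafka 1.1.0 artifact gone, Jackson 2.9.4 should no longer end up on the classpath, and the ExceptionInInitializerError should disappear.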

A similar problem and its solution are described in more detail here.
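
If you do need the explicit Kafka dependency for other reasons, an alternative workaround (a sketch, assuming Spark 2.3.0 is built against Jackson 2.6.7 / jackson-databind 2.6.7.1, as its POM declares) is to pin a single Jackson version via sbt's dependencyOverrides:

// sbt 1.x syntax; with sbt 0.13 use a Set instead of a Seq
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.7",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7.1",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.7.1"
)

Either way, the goal is the same: make sure only one Jackson version is on the classpath, and that it is the one Spark expects.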

answered 2018-04-24T17:50:57.863