Per the producer configuration documentation for Apache Kafka version 0.9.0.0:

http://kafka.apache.org/documentation.html#producerconfigs

I need to use the following property to specify the list of brokers:

props.put("bootstrap.servers", "localhost:9092")

Here is my producer class:

  def main(args: Array[String]) {
    //val conf = new SparkConf().setAppName("VPP metrics producer")
    //val sc = new SparkContext(conf)

    val props: Properties = new Properties()
      props.put("bootstrap.servers", "localhost:9092")
      props.put("key.serializer", "kafka.serializer.StringEncoder")
      props.put("value.serializer", "kafka.serializer.StringEncoder")

    val config = new ProducerConfig(props)
    val producer = new Producer[String, String](config)

    1 to 10000 foreach {
      case i => 
        val jsonStr = getRandomTsDataPoint().toJson.toString()
        println(s"sending message $i to kafka")
        producer.send(new KeyedMessage[String, String]("test_topic", jsonStr))
        println(s"sent message $i to kafka")
    }
  }

Here are my dependencies:

object Dependencies {
  val resolutionRepos = Seq(
    "Spray Repository" at "http://repo.spray.cc/"
  )

  object V {
    val spark     = "1.6.0"
    val kafka     = "0.9.0.0"
    val jodaTime  = "2.7"
    val sprayJson = "1.3.2"
    // Add versions for your additional libraries here...
  }

  object Libraries {
    val sparkCore   = "org.apache.spark"           %% "spark-core"            % V.spark 
    val kafka       = "org.apache.kafka"           %% "kafka"                 % V.kafka
    val jodaTime    = "joda-time"                  % "joda-time"              % V.jodaTime
    val sprayJson   = "io.spray"                   %% "spray-json"            % V.sprayJson
  }
}

As you can see, I am using version 0.9.0.0 of Apache Kafka. When I try to run my producer class, I get the following error:

Joes-MacBook-Pro:spark-kafka-producer joe$ java -cp target/scala-2.11/spark-example-project-0.1.0-SNAPAHOT.jar com.eon.vpp.MetricsProducer
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Missing required property 'metadata.broker.list'
    at scala.Predef$.require(Predef.scala:219)
    at kafka.utils.VerifiableProperties.getString(VerifiableProperties.scala:177)
    at kafka.producer.ProducerConfig.<init>(ProducerConfig.scala:66)
    at kafka.producer.ProducerConfig.<init>(ProducerConfig.scala:56)
    at com.eon.vpp.MetricsProducer$.main(MetricsProducer.scala:45)
    at com.eon.vpp.MetricsProducer.main(MetricsProducer.scala)

Why is this happening? I even verified the contents of my jar file, and it does use version 0.9.0.0 of Apache Kafka! (kafka_2.11-0.9.0.0.jar)

1 Answer

Spark 1.6.0 does not currently support Kafka 0.9; you will have to wait for Spark 2.0.0. Check this issue: https://issues.apache.org/jira/browse/SPARK-12177
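
For reference, here is a minimal sketch (my own illustration, not part of the original answer) of a standalone producer written against the new 0.9 client API, org.apache.kafka.clients.producer.KafkaProducer. That is the API the linked documentation describes and the one that reads bootstrap.servers, whereas the legacy kafka.producer.ProducerConfig in your stack trace still requires metadata.broker.list. The broker address and topic name are taken from the question; the object name is hypothetical.

import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object NewApiProducerSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")
    // The new client API expects these serializer classes instead of kafka.serializer.StringEncoder
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

    val producer = new KafkaProducer[String, String](props)
    try {
      (1 to 10).foreach { i =>
        // send() is asynchronous; get() blocks until the broker acknowledges the record
        producer.send(new ProducerRecord[String, String]("test_topic", s"message $i")).get()
      }
    } finally {
      producer.close()
    }
  }
}

KafkaProducer lives in the kafka-clients artifact, which the kafka_2.11-0.9.0.0 dependency shown in the question should already pull in transitively, so the build would likely not need any changes for this sketch.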

answered 2016-01-25T16:04:31.363