I am trying the example code from the Spark Streaming + Kafka Integration Guide (Kafka broker version 0.10.0 or higher). The code runs without errors, but I never receive any records. If I read the same topic with kafka-console-consumer.sh --from-beginning, the records do come through. Does anyone know why?
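For reference, the console-consumer invocation was more or less the following (reconstructed from memory; the broker address and topic are the same ones used in my code):

bin/kafka-console-consumer.sh --bootstrap-server 221.181.73.44:19092 \
  --topic connect-test --from-beginning

That prints the records fine. My code is as follows: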
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Context setup added here for completeness (app name, master and 5 s batch are assumptions)
val conf = new SparkConf().setAppName("SparkStreamingLoad").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(5))

val broker = "221.181.73.44:19092"
val topics = Array("connect-test")
val groupid = "SparkStreamingLoad3"

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> broker,
  "group.id" -> groupid,
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "auto.offset.reset" -> "earliest", // earliest | latest
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val stream = KafkaUtils.createDirectStream[String, String](
  ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))
stream.print() // prints up to 10 records per batch

ssc.start()
ssc.awaitTermination()
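In case it helps narrow this down, below is a minimal standalone consumer sketch (untested as written; the "-check" suffix on the group id is made up so it does not touch the Spark consumer group) that reads the same topic with a plain KafkaConsumer, bypassing Spark entirely:

import java.util.{Arrays, Properties}
import scala.collection.JavaConverters._
import org.apache.kafka.clients.consumer.KafkaConsumer

val props = new Properties()
props.put("bootstrap.servers", "221.181.73.44:19092")
props.put("group.id", "SparkStreamingLoad3-check") // made-up group id for this check
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
props.put("auto.offset.reset", "earliest")
props.put("enable.auto.commit", "false")

val consumer = new KafkaConsumer[String, String](props)
consumer.subscribe(Arrays.asList("connect-test"))
val records = consumer.poll(10000L) // block up to 10 s for the first fetch
for (r <- records.asScala) println(s"offset=${r.offset} value=${r.value}")
consumer.close()

If this loop sees records while the DStream stays empty, the problem would seem to be on the Spark side rather than with the broker or the deserializers.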
My build.sbt is:
version := "1.0"
scalaVersion := "2.10.6"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming-kafka-0-10_2.10" % "2.1.0",
  "org.apache.spark" % "spark-core_2.10" % "2.1.0",
  "org.apache.spark" % "spark-streaming_2.10" % "2.1.0",
  "org.apache.kafka" % "kafka_2.10" % "0.10.2.1"
)
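For what it's worth, the same dependencies written with sbt's %% operator, which appends the Scala binary-version suffix (here _2.10) automatically, should resolve to the same artifacts as above:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.1.0",
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-streaming" % "2.1.0",
  "org.apache.kafka" %% "kafka" % "0.10.2.1"
)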
Thanks!