
I want to use the Kafka integration for Spark streaming. I am using Spark version 2.0.0.

But I get an unresolved dependency error ("unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found").

How can I get access to this package? Or am I doing something wrong / missing something?

My build.sbt file:

name := "Spark Streaming"
version := "0.1"
scalaVersion := "2.11.11"
val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion,
    "org.apache.spark" %% "spark-sql" % sparkVersion,
    "org.apache.spark" %% "spark-streaming" % sparkVersion,
    "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview"

Thanks for your help.


1 Answer


The spark-sql-kafka-0-10 artifact (for Structured Streaming) is not published for Spark 2.0.0; for that version, use the DStream-based Kafka integration artifact instead:

https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10_2.11

libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.0.0"
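Applied to the build.sbt from the question, the change looks roughly like this (a sketch: it swaps the unavailable spark-sql-kafka-0-10 artifact for the DStream-based spark-streaming-kafka-0-10 one and drops the duplicate 2.0.0-preview spark-streaming line):

```scala
// build.sbt — minimal sketch for Spark 2.0.0 / Scala 2.11
name := "Spark Streaming"
version := "0.1"
scalaVersion := "2.11.11"

val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  // DStream-based Kafka integration; %% appends the Scala binary
  // version suffix (_2.11), so this resolves the same artifact as
  // "spark-streaming-kafka-0-10_2.11" written out explicitly
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

Note that using `%%` keeps the Scala suffix in sync with `scalaVersion`, which avoids mixing `_2.11` and `_2.10` artifacts by accident.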
answered 2017-08-30T12:41:59.937