The sbt file below fails to resolve the spark-xml Databricks package from IntelliJ IDEA, whereas it works fine from the command line.
name := "dataframes"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.4.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "com.databricks" %% "spark-xml" % "0.3.3"
)

resolvers ++= Seq(
  "Apache HBase" at "https://repository.apache.org/content/repositories/releases",
  "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)

resolvers += Resolver.mavenLocal
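One thing that sometimes helps IDE resolution (a hedged suggestion on my part, not something stated above): spark-xml has historically been published to the Spark Packages repository in addition to Maven Central, so adding that resolver explicitly gives IntelliJ one more place to look. The repository name and URL below are the standard Spark Packages coordinates, used here as an assumption:

```scala
// Assumption: adding the Spark Packages repository may help the IDE resolve spark-xml
resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org/"
```

After changing build.sbt, refreshing or re-importing the sbt project in IntelliJ is usually required for a new resolver to take effect.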
I have tried the sbt setting as both "bundled" and "other" pointing to the locally installed sbt, but neither works.
The following package resolves and works perfectly from the command line:
import com.databricks.spark._
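For context, a minimal sketch of how that package is typically used with Spark 1.4's SQLContext (the file path and the "book" rowTag are made-up placeholders, not from my project):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object XmlExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("dataframes").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // spark-xml registers the "com.databricks.spark.xml" data source;
    // "rowTag" selects which XML element becomes a DataFrame row
    // ("book" and "books.xml" are placeholder examples).
    val df = sqlContext.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "book")
      .load("books.xml")

    df.printSchema()
  }
}
```

If code like this compiles and runs from the sbt command line but IntelliJ still flags the import, the problem is the IDE's dependency resolution rather than the build definition itself.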