
I am trying to add GraphFrames to my Scala Spark application. Everything works fine when I add the version built against Scala 2.10, but as soon as I try to build with the GraphFrames artifact built against Scala 2.11, it breaks.

The problem is that conflicting Scala versions (2.10 and 2.11) are being used. I get the following error:

[error] Modules were resolved with conflicting cross-version suffixes in {file:/E:/Documents/School/LSDE/hadoopcryptoledger/examples/scala-spark-graphx-bitcointransaction/}root:
[error]    org.apache.spark:spark-launcher _2.10, _2.11
[error]    org.json4s:json4s-ast _2.10, _2.11
[error]    org.apache.spark:spark-network-shuffle _2.10, _2.11
[error]    com.twitter:chill _2.10, _2.11
[error]    org.json4s:json4s-jackson _2.10, _2.11
[error]    com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error]    org.json4s:json4s-core _2.10, _2.11
[error]    org.apache.spark:spark-unsafe _2.10, _2.11
[error]    org.apache.spark:spark-core _2.10, _2.11
[error]    org.apache.spark:spark-network-common _2.10, _2.11

However, I can't work out what is causing this. Here is my full build.sbt:

import sbt._
import Keys._
import scala._


lazy val root = (project in file("."))
  .settings(
    name := "example-hcl-spark-scala-graphx-bitcointransaction",
    version := "0.1"
  )
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)

scalacOptions += "-target:jvm-1.7"

crossScalaVersions := Seq("2.11.8")

resolvers += Resolver.mavenLocal

fork  := true

jacoco.settings

itJacoco.settings



assemblyJarName in assembly := "example-hcl-spark-scala-graphx-bitcointransaction.jar"

libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-fileformat" % "1.0.7" % "compile"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.5.0" % "provided"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0" % "provided"

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"


libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "2.7.0" % "it"

libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"

libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"

Can anyone pinpoint which dependency is based on Scala 2.10 and is causing the build to fail?


1 Answer


I found out what the problem was. Apparently, if you use:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

it defaults to the Scala 2.10 build: the %% operator appends the project's scalaVersion as a suffix, and since no scalaVersion is set in the build, sbt 0.13 falls back to its default of Scala 2.10. Once I changed the spark-core and spark-graphx dependencies to:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.2.0" % "provided"