
I'm running into a build problem. Here is my sbt file:

name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"

// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"

// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"

scalacOptions ++= Seq("-feature")

Here is the full error message I see:

    [info] Set current project to SparkPi (in build file:/Users/xxx/prog/yyy/)
    [info] Updating {file:/Users/xxx/prog/yyy/}yyy...
    [info] Resolving jline#jline;2.12.1 ...
    [info] Done updating.
    [info] Compiling 2 Scala sources to /Users/xxx/prog/yyy/target/scala-2.11/classes...
    [error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:6: object profile is not a member of package com.amazonaws.auth
    [error] import com.amazonaws.auth.profile._
    [error]                           ^
    [error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:87: not found: type ProfileCredentialsProvider
    [error]     val creds = new ProfileCredentialsProvider(profile).getCredentials()
    [error]                     ^
    [error] two errors found
    [error] (compile:compileIncremental) Compilation failed
    [error] Total time: 14 s, completed Nov 3, 2016 1:43:34 PM

Here are the imports I'm trying to use:

import com.amazonaws.services.s3._
import com.amazonaws.auth.profile._

How do I import com.amazonaws.auth.profile.ProfileCredentialsProvider in Scala?

EDIT

I changed the sbt file so that the spark-core version suffix matches the Scala version. The new contents:

name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"

// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"

// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"

scalacOptions ++= Seq("-feature")

1 Answer


You are using scalaVersion := "2.11.8", but your library dependency carries the _2.10 suffix (spark-core_2.10) — a mismatch between the Scala version of your build and the Scala version the Spark artifact was compiled against:

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
                                                          ^

Change 2.10 to 2.11:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
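An alternative (already hinted at by the commented-out "old" line in the build file) is to let sbt derive the suffix itself with the `%%` operator, which appends the Scala binary version from `scalaVersion` to the artifact name automatically — a sketch:

```scala
// build.sbt — equivalent fix using %%:
// with scalaVersion := "2.11.8", sbt resolves this as spark-core_2.11,
// so the artifact suffix can never drift out of sync with scalaVersion.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
```

With the single-`%` form you must keep the `_2.11` suffix in sync by hand every time `scalaVersion` changes; `%%` removes that failure mode entirely.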


Answered 2016-11-03T19:49:50.290