
I am writing a script to try to get Cassandra and Spark working together, but I can't even compile the program. I am using SBT as the build tool, and I have all the dependencies the program needs declared. The first time I ran sbt run it downloaded the dependencies, but when it started compiling the Scala code shown below it produced these errors:

[info] Compiling 1 Scala source to /home/vagrant/ScalaTest/target/scala-2.10/classes...
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:6: not found: type SparkConf
[error]                 val conf = new SparkConf(true)
[error]                                ^
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:9: not found: type SparkContext
[error]                 val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
[error]                              ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Jun 5, 2015 2:40:09 PM

Here is the SBT build file:

lazy val root = (project in file(".")).
        settings(
                name := "ScalaTest",
                version := "1.0"

        )


libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0-M1"

Here is the actual Scala program:

import com.datastax.spark.connector._

object ScalaTest {
        def main(args: Array[String]) {
                val conf = new SparkConf(true)
                        .set("spark.cassandra.connection.host", "127.0.0.1")

                val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
        }
}

Here is my directory structure:

- ScalaTest
  - build.sbt
  - project
  - src
    - main
      - scala
        - ScalaTest.scala
  - target

1 Answer


I don't know if this is the problem, but you are not importing the SparkConf and SparkContext class definitions. So try adding these to your Scala file:

 import org.apache.spark.SparkConf
 import org.apache.spark.SparkContext
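
For reference, a minimal sketch of the asker's file with those imports applied; the master URL, app name, and Cassandra host are kept exactly as in the question:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import com.datastax.spark.connector._

object ScalaTest {
        def main(args: Array[String]) {
                // Load Spark defaults and point the Cassandra connector at the local node
                val conf = new SparkConf(true)
                        .set("spark.cassandra.connection.host", "127.0.0.1")

                // Connect to the standalone master given in the question
                val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
        }
}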
Answered 2015-06-05T14:53:30.073