I'm new to Spark clusters; I'm really just running the example given on the Spark website.
```scala
/*** SimpleJob.scala ***/
import spark.SparkContext
import SparkContext._

object SimpleJob extends Application {
  val logFile = "//var//log//syslog" // Should be some file on your system
  val jarName:Seq[String] = "target//scala-2.9.2//simple-project_2.9.2-1.0.jar"
  val sc = new SparkContext("local", "Simple Job",
    "/home/subodh/Downloads/spark-0.6.1/bin", jarName)
  val logData = sc.textFile(logFile, 2).cache()
  val numAs = logData.filter(line => line.contains("a")).count()
  val numBs = logData.filter(line => line.contains("b")).count()
  println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
}
```
When I compile it with the `sbt package` command, it gives me the following error:
```
[error] /home/subodh/Downloads/spark-0.6.1/example/src/main/scala/SimpleJob.scala:7: type mismatch;
[error]  found   : java.lang.String("target//scala-2.9.2//simple-project_2.9.2-1.0.jar")
[error]  required: Seq[String]
[error] val jarName:Seq[String] = "target//scala-2.9.2//simple-project_2.9.2-1.0.jar"
[error]                           ^
[error] one error found
[error] {file:/home/subodh/Downloads/spark-0.6.1/example/}default-9e9e7d/compile:compile: Compilation failed
[error] Total time: 3 s, completed Jan 31, 2013 11:31:21 PM
```
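From the message it looks like the problem is line 7: a bare String is assigned where a `Seq[String]` is expected. My guess is that the jar path needs to be wrapped in a `Seq`, something like the sketch below (untested, same paths as in my project), but I'm not sure whether this is the right way to pass jars to `SparkContext`:

```scala
// Untested guess: wrapping the single jar path in Seq(...) gives the
// right-hand side the type Seq[String], matching the declared type.
val jarName: Seq[String] = Seq("target//scala-2.9.2//simple-project_2.9.2-1.0.jar")

// The SparkContext call would then stay the same, with jarName as the
// fourth argument (the list of jars to ship to the workers):
val sc = new SparkContext("local", "Simple Job",
  "/home/subodh/Downloads/spark-0.6.1/bin", jarName)
```

Is that the intended fix, or should the jars be passed some other way?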
Any help on this would be much appreciated.