I have a program written in Scala that uses Spark. It runs fine locally when I launch it with sbt run.
I would like to run it on an Amazon EC2 cluster using sbt. Is this possible? How would I do it?
I looked at http://spark.incubator.apache.org/docs/latest/ec2-scripts.html, but that approach doesn't seem to involve sbt.
My sbt version:
~/git-reps/cs262a $ sbt --version
sbt launcher version 0.12.4
My build.sbt file:
name := "Ensemble Bayes Tree"
version := "1.0"
scalaVersion := "2.9.3"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.8.0-incubating",
  "org.apache.spark" % "spark-mllib_2.9.3" % "0.8.0-incubating",
  "org.slf4j" % "slf4j-api" % "1.6.4",
  "org.slf4j" % "slf4j-log4j12" % "1.7.5",
  "log4j" % "log4j" % "1.2.14",
  "org.eclipse.jetty.orbit" % "javax.servlet" % "2.5.0.v201103041518" artifacts Artifact("javax.servlet", "jar", "jar")
)
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
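For context, here is a minimal sketch of what I believe the in-code change would look like: instead of a local master, the SparkContext would point at the cluster's standalone master URL and ship the application jar to the workers. The master hostname, Spark install path, and jar name below are my assumptions (the jar name is what `sbt package` would produce from the build.sbt above), not something I have working:

```scala
import org.apache.spark.SparkContext

object EnsembleBayesTree {
  def main(args: Array[String]): Unit = {
    // Hypothetical master URL; <master-hostname> would be the public DNS
    // of the EC2 master node that the spark-ec2 launch script reports.
    val master = "spark://<master-hostname>:7077"

    // Spark 0.8-era constructor: master URL, app name, Spark home on the
    // workers, and the jar(s) to distribute to the cluster.
    val sc = new SparkContext(
      master,
      "Ensemble Bayes Tree",
      "/root/spark", // assumed spark-ec2 install path on the nodes
      Seq("target/scala-2.9.3/ensemble-bayes-tree_2.9.3-1.0.jar"))

    // ... the job logic would go here unchanged, e.g. sc.textFile(...) ...

    sc.stop()
  }
}
```

Is pointing the SparkContext at the cluster like this (and copying the jar over) the intended workflow, or is there a way to drive the whole thing from sbt directly?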