build.sbt
lazy val commonSettings = Seq(
  organization := "com.me",
  version := "0.1.0",
  scalaVersion := "2.11.0"
)

lazy val counter = (project in file("counter"))
  .settings(commonSettings: _*)
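(For reference, the fat jar comes from sbt-assembly, run from the project root; the output path below is the sbt-assembly default for Scala 2.11, so treat it as an assumption:)

sbt counter/assembly
# -> counter/target/scala-2.11/counter-assembly-0.1.0.jar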
counter/build.sbt
name := "counter"
mainClass := Some("Counter")
scalaVersion := "2.11.0"
val sparkVersion = "2.1.1";
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided";
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2";
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8" % sparkVersion;
libraryDependencies += "com.github.scopt" %% "scopt" % "3.5.0";
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1";
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test";
mergeStrategy in assembly := {
case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.first
case x => (mergeStrategy in assembly).value(x)
}
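(The mergeStrategy / assembly keys above come from the sbt-assembly plugin, so project/plugins.sbt has to enable it; the version below is only an example of a release that was current at the time:)

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")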
Counter.scala:
import org.apache.spark.streaming.StreamingContext

object Counter extends SignalHandler {
  var ssc: Option[StreamingContext] = None
  def main(args: Array[String]): Unit = {
    // ... streaming setup omitted from the question ...
  }
}
Run:
./spark-submit --class "Counter" --master spark://10.1.204.67:6066 --deploy-mode cluster file://counter-assembly-0.1.0.jar
Error:
17/06/21 19:00:25 INFO Utils: Successfully started service 'Driver' on port 50140.
17/06/21 19:00:25 INFO WorkerWatcher: Connecting to worker spark://Worker@10.1.204.57:52476
Exception in thread "main" java.lang.ClassNotFoundException: Counter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:229)
at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:56)
at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Any ideas? Thanks.
UPDATE
I ran into the problem described in "Failed to submit local jar to spark cluster: java.nio.file.NoSuchFileException". So now I copy the jar into spark-2.1.0-bin-hadoop2.7/bin and then run:
./spark-submit --class "Counter" --master spark://10.1.204.67:6066 --deploy-mode cluster file://Counter-assembly-0.1.0.jar
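(Side note on the URL form, in case it matters: a file:// URI takes an absolute path, so the jar reference would look like the sketch below, with /path/to standing in for the real location:)

./spark-submit --class "Counter" --master spark://10.1.204.67:6066 --deploy-mode cluster \
  file:///path/to/Counter-assembly-0.1.0.jar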
The Spark cluster is running 2.1.0, but the jar was assembled against Spark 2.1.1 and Scala 2.11.0.