I am developing a SparkJob using Spark GraphX on spark-jobserver (v0.6.2, Spark 1.6.1). When I try to start my job on Spark JobServer, I get the following exception:
{
"status": "JOB LOADING FAILED",
"result": {
"errorClass": "java.lang.NoClassDefFoundError",
"cause": "org.apache.spark.graphx.VertexRDD",
"stack": ["java.net.URLClassLoader.findClass(URLClassLoader.java:381)", "java.lang.ClassLoader.loadClass(ClassLoader.java:424)", "java.lang.ClassLoader.loadClass(ClassLoader.java:357)", "java.lang.Class.getDeclaredFields0(Native Method)", "java.lang.Class.privateGetDeclaredFields(Class.java:2583)", "java.lang.Class.getField0(Class.java:2975)", "java.lang.Class.getField(Class.java:1701)", "spark.jobserver.util.JarUtils$.loadObject(JarUtils.scala:61)", "spark.jobserver.util.JarUtils$.loadClassOrObject(JarUtils.scala:37)", "spark.jobserver.JobCache$$anonfun$getSparkJob$1.apply(JobCache.scala:46)", "spark.jobserver.JobCache$$anonfun$getSparkJob$1.apply(JobCache.scala:37)", "spark.jobserver.util.LRUCache.get(LRUCache.scala:35)", "spark.jobserver.JobCache.getSparkJob(JobCache.scala:37)", "spark.jobserver.JobManagerActor$$anonfun$startJobInternal$1.apply$mcV$sp(JobManagerActor.scala:216)", "scala.util.control.Breaks.breakable(Breaks.scala:37)", "spark.jobserver.JobManagerActor.startJobInternal(JobManagerActor.scala:192)", "spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:144)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)", "ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)", 
"ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)", "scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)", "ooyala.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:24)", "akka.actor.Actor$class.aroundReceive(Actor.scala:467)", "ooyala.common.akka.InstrumentedActor.aroundReceive(InstrumentedActor.scala:8)", "akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)", "akka.actor.ActorCell.invoke(ActorCell.scala:487)", "akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)", "akka.dispatch.Mailbox.run(Mailbox.scala:220)", "akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)", "scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)", "scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)", "scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)", "scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)"],
"causingClass": "java.lang.ClassNotFoundException",
"message": "org/apache/spark/graphx/VertexRDD"
}
}
This happens even though I have already included the graphx dependency both in my build.sbt and in Dependencies.scala on the jobserver.
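For reference, this is roughly what my build.sbt looks like (a sketch; the exact organization/module names for the GraphX and job-server artifacts are as I understand them, and the versions match my cluster):

```scala
// build.sbt (sketch) -- Spark artifacts are marked "provided" because the
// jobserver's Spark context is expected to supply them at runtime; if the
// deployed Spark assembly does not actually ship graphx on its classpath,
// that could explain the NoClassDefFoundError.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"     % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-graphx"   % "1.6.1" % "provided",
  "spark.jobserver"  %% "job-server-api" % "0.6.2" % "provided"
)
```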
Any help would be appreciated.