I'm running into an error similar to the one described here - I can run GraphX from the spark shell, but when I try spark-submit on a jar file I get a NoSuchMethodError. This is the line it complains about:
val myGraph: Graph[(String, Long, String), Int] = Graph.apply(userRecords, userConnectionEdges)
which gives me the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.graphx.Graph$.apply$default$4()Lorg/apache/spark/storage/StorageLevel;
    at MyProject$.main(MyProject.scala:53)
    at MyProject.main(MyProject.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The code is built with sbt assembly, so I'm not sure what's going wrong.
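From what I can tell, apply$default$4 is the synthetic method the Scala compiler generates for the default value of the fourth parameter of Graph.apply, and going by the error it returns a StorageLevel. A minimal sketch of that encoding, using a made-up Demo object rather than the actual GraphX source:

// Hypothetical illustration of Scala's default-argument encoding; not GraphX source.
import org.apache.spark.storage.StorageLevel

object Demo {
  // The fourth parameter has a default value...
  def apply(a: Int, b: Int, c: Int,
            level: StorageLevel = StorageLevel.MEMORY_ONLY): String =
    s"$a $b $c $level"
}

object DemoCaller {
  // ...so the compiler also emits Demo.apply$default$4(), returning a StorageLevel.
  // A call that omits the last argument is compiled roughly as
  //   Demo.apply(1, 2, 3, Demo.apply$default$4())
  // If the Demo on the runtime classpath was built without that default parameter,
  // the call fails with java.lang.NoSuchMethodError, like the one above.
  def main(args: Array[String]): Unit =
    println(Demo(1, 2, 3))
}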
Edit: I created a new Scala project that takes the code from here and builds it into a jar file. Here is the Scala file:
/* GraphTest.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

object GraphTest {
  def main(args: Array[String]) {
    // Set up environment
    val conf = new SparkConf()
    val sc = new SparkContext(conf)

    // Set up the vertices
    val vertexArray = Array(
      (1L, ("Alice", 28)),
      (2L, ("Bob", 27)),
      (3L, ("Charlie", 65)),
      (4L, ("David", 42)),
      (5L, ("Ed", 55)),
      (6L, ("Fran", 50))
    )

    // Set up the edges
    val edgeArray = Array(
      Edge(2L, 1L, 7),
      Edge(2L, 4L, 2),
      Edge(3L, 2L, 4),
      Edge(3L, 6L, 3),
      Edge(4L, 1L, 1),
      Edge(5L, 2L, 2),
      Edge(5L, 3L, 8),
      Edge(5L, 6L, 3)
    )

    // Convert arrays to RDDs
    val vertexRDD: RDD[(Long, (String, Int))] = sc.parallelize(vertexArray)
    val edgeRDD: RDD[Edge[Int]] = sc.parallelize(edgeArray)

    // Create graph and print vertex data
    val graph: Graph[(String, Int), Int] = Graph(vertexRDD, edgeRDD)
    graph.vertices.filter { case (id, (name, age)) => age > 30 }.collect.foreach {
      case (id, (name, age)) => println(s"$name is $age")
    }
  }
}
And here is the build configuration:
import AssemblyKeys._
assemblySettings
name := "graphtest"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.10" % "1.2.1" % "provided"
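For completeness, the assembly plugin itself is pulled in through project/plugins.sbt, roughly like this (the plugin version here is from memory and may not be exactly what I have):

// project/plugins.sbt -- sbt-assembly, matching the AssemblyKeys/assemblySettings style above
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")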
I can run sbt assembly on the code, but when I run
..\spark\bin\spark-submit --class GraphTest target\scala-2.10\graphtest-assembly-1.0.jar
I get the NoSuchMethodError.
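In case it helps, this is the kind of throwaway job I would use to double-check which Spark version spark-submit actually runs against (a diagnostic sketch, separate from the jar above):

// Diagnostic sketch: print the Spark version seen by the driver at runtime,
// to compare against the 1.2.1 the jar was compiled against.
import org.apache.spark.{SparkConf, SparkContext}

object VersionCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("version-check"))
    println(s"Runtime Spark version: ${sc.version}")
    sc.stop()
  }
}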