
I get this error when I try to connect to Cassandra with the spark-cassandra-connector:

Exception in thread "main" java.lang.NoClassDefFoundError: com/datastax/driver/core/ProtocolOptions$Compression
    at com.datastax.spark.connector.cql.CassandraConnectorConf$.<init>(CassandraConnectorConf.scala:112)
    at com.datastax.spark.connector.cql.CassandraConnectorConf$.<clinit>(CassandraConnectorConf.scala)
    at com.datastax.spark.connector.cql.CassandraConnector$.apply(CassandraConnector.scala:192)
    at com.datastax.spark.connector.SparkContextFunctions.cassandraTable$default$3(SparkContextFunctions.scala:48)
    at main.scala.TestSpark$.main(TestSpark.scala:19)
    at main.scala.TestSpark.main(TestSpark.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.datastax.driver.core.ProtocolOptions$Compression
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 15 more

I have added the jar spark-cassandra-connector_2.11-1.5.0-M2.jar to the Spark classpath.
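One way to put a jar on the Spark classpath with spark-submit, shown here only as an illustration of what I mean by "added to the classpath", is the --jars option:

$ $SPARK_HOME/bin/spark-submit --jars spark-cassandra-connector_2.11-1.5.0-M2.jar --class "main.scala.TestSpark" target/scala-2.11/simple-project_2.11-1.0.jar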

I added the dependencies in my sbt file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-M2"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.5.0-M2"

This is the Scala program I am trying to run:

package main.scala


import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import com.datastax.spark.connector._

/**
 * Created by Simo on 01.12.15.
 */
object TestSpark {
  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "54.229.218.236")
      .setAppName("Simple Application")
    val sc = new SparkContext("local", "test", conf)
    val rdd = sc.cassandraTable("test", "kv")
    println(rdd.count)
    println(rdd.first)
    println(rdd.map(_.getInt("value")).sum)
  }
}

This is how I run it:

$ sbt package
$ $SPARK_HOME/bin/spark-submit --class "main.scala.TestSpark" target/scala-2.11/simple-project_2.11-1.0.jar

Can you help me understand what I am doing wrong?

Thanks!

EDIT:

I tried adding the Datastax driver to the dependency list and to the Spark classpath:

libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.9"
libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-mapping" % "2.1.9"

The previous error no longer appears, but now I get another one:

Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef;
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
    at com.datastax.spark.connector.cql.CassandraConnector. ...

EDIT 2: Compiling with Scala 2.10.6 (the same Scala version Spark is built against) makes the previous error disappear (the exact build.sbt changes are sketched after the stack trace below), but I now get this new error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/util/concurrent/AsyncFunction
    at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFactory.scala:36)
    at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
    at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
    at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef$lzycompute(CassandraTableScanRDD.scala:59)
    at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.tableDef(CassandraTableRowReaderProvider.scala:51)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:59)
    at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:150)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:59)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:143)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1919)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1121)
    at main.scala.TestSpark$.main(TestSpark.scala:20)
    at main.scala.TestSpark.main(TestSpark.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.util.concurrent.AsyncFunction
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 34 more
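In other words, EDIT 2 amounts to changing these two build.sbt lines (sketched here; the full final file is in the answer below):

scalaVersion := "2.10.6"

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.5.0-M2"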


2 Answers


Finally solved using sbt-assembly, as suggested by @Odomontois.

This is the final build.sbt:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1" % "provided"

libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.9"

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.5.0-M2"



jarName in assembly := "my-project-assembly.jar"

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)


resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("netty", "handler", xs @ _*)   => MergeStrategy.first
    case PathList("netty", "buffer", xs @ _*)    => MergeStrategy.first
    case PathList("netty", "common", xs @ _*)    => MergeStrategy.first
    case PathList("netty", "transport", xs @ _*) => MergeStrategy.first
    case PathList("netty", "codec", xs @ _*)     => MergeStrategy.first

    case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.first
    case x => old(x)
  }
}
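With this setup, Spark itself and the Scala library come from the cluster at runtime (hence the "provided" scope and includeScala = false), while everything else, including the connector and the Datastax driver, is bundled into one fat jar. A sketch of the build and submit steps, assuming sbt-assembly is registered in project/plugins.sbt (the plugin version below is illustrative; pick one whose syntax matches the mergeStrategy style above):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

$ sbt assembly
$ $SPARK_HOME/bin/spark-submit --class "main.scala.TestSpark" target/scala-2.10/my-project-assembly.jar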
answered 2015-12-02T13:14:53.200

You also need to add the dependency for the Datastax Cassandra driver, matching the version of your spark-cassandra-connector, from: https://repo1.maven.org/maven2/com/datastax/cassandra/cassandra-driver-core/
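In sbt form that is a single extra line; the 2.1.9 version below mirrors the question's edit, so check the repository above for the release matching your connector:

libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.9"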

answered 2015-12-02T03:54:02.907