
Setting up Spark in Java to access Cassandra throws NoClassDefFoundError

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(Unknown Source)
    at java.security.SecureClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.defineClass(Unknown Source)
    at java.net.URLClassLoader.access$100(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at Client.main(Client.java:22)
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 13 more

Two jar files were added: spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar and spark-core_2.10-0.9.0-incubating.jar. The spark-cassandra-connector-java-assembly-1.4.0-M1-SNAPSHOT.jar was built against Scala 2.10. Typing scala -version at the command prompt shows Scala code runner version 2.11.6. Accessing Spark from the spark-shell works without problems, and even accessing a Cassandra column family from the spark-shell works fine.

import java.util.*;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.api.java.*;
import com.datastax.spark.connector.*;
import com.datastax.spark.connector.cql.*;
import com.datastax.spark.*;
//import scala.Tuple2;

public class Client {
    public static void main(String[] a)
    {
        SparkConf conf = new SparkConf().setAppName("MTMPNLTesting").setMaster("192.168.1.15");
    }
}

What could be the cause of this error?


1 Answer


Try including the Scala jar on your classpath. If you are not using Maven, download the jar and add it to your project's build path.
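For example, with Maven, a minimal sketch of the dependency to add is shown below (assuming Scala 2.10.x, to match the spark-core_2.10 and connector jars built against 2.10; the exact patch version here is illustrative, not taken from the question):

<!-- Scala standard library; the 2.10.x line matches jars built for Scala 2.10 -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.5</version> <!-- illustrative patch version -->
</dependency>

Without Maven, the equivalent is to place the matching scala-library jar on the application's classpath; the Scala 2.11.6 runner reported by scala -version does not help here, since the running JVM application never sees it and jars built for 2.10 are not binary compatible with 2.11 anyway.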

Answered 2015-06-24T14:24:02.890