
I am trying to install Spark on CentOS. When I build Spark with the command sbt/sbt assembly, the following errors occur.

[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:129: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
[warn]     getOutputCommitter().cleanupJob(getJobContext())
[warn]                          ^
[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:592: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
[warn]     jobCommitter.cleanupJob(jobTaskContext)
[warn]                  ^
[warn] two warnings found
[error] ----------
[error] 1. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 22)
[error]         import io.netty.channel.ChannelFuture;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.ChannelFuture is never used
[error] ----------
[error] 2. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 23)
[error]         import io.netty.channel.ChannelFutureListener;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.ChannelFutureListener is never used
[error] ----------
[error] ----------
[error] 3. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileServer.java (at line 23)
[error]         import io.netty.channel.Channel;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.Channel is never used
[error] ----------
[error] ----------
[error] 4. WARNING in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/JavaSparkContextVarargsWorkaround.java (at line 20)
[error]         import java.util.Arrays;
[error]                ^^^^^^^^^^^^^^^^
[error] The import java.util.Arrays is never used
[error] ----------
[error] ----------
[error] 5. ERROR in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/DoubleFlatMapFunction.java (at line 36)
[error]         public final Iterable<Double> apply(T t) { return call(t); }
[error]                                       ^^^^^^^^^^
[error] The method apply(T) of type DoubleFlatMapFunction<T> must override a superclass method
[error] ----------
[error] 5 problems (1 error, 4 warnings)
[error] (core/compile:compile) javac returned nonzero exit code
[error] Total time: 431 s, completed Oct 24, 2013 7:42:21 AM

The Java version installed on my machine is 1.7.0_45.
Earlier I used JDK 1.6.0_35 and it gave the same set of errors. I also tried Java 1.4, which gave a different kind of error. Which version of Java should I use, or is the problem something else?
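To see which JDK the sbt build is actually picking up on a machine with several installs, a few generic CentOS/Linux checks can help; JAVA_HOME may well be unset, and the symlink targets will differ per machine:

    # Versions of the java/javac that are first on the PATH
    java -version
    javac -version

    # Resolve symlinks to see which installed JDK they actually point to
    readlink -f $(which java)
    readlink -f $(which javac)

    # JAVA_HOME, if set; some launcher scripts prefer it over the PATH
    echo $JAVA_HOME

If java and javac resolve to different JDK installs, the sbt build can end up compiling with an older compiler than the java command reports.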


1 Answer


This looks very much like sbt is picking up an old version of Java. Spark needs at least 1.6, but somehow sbt is pulling in an older one.
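One way to confirm this on CentOS is to ask the alternatives system which JDKs are registered and which one /usr/bin/java currently points to; this assumes the JDKs were installed via RPM and registered with alternatives, and the output will vary per machine:

    # List every java binary registered with the alternatives system
    alternatives --display java

    # Interactively pick which JDK /usr/bin/java should point to
    alternatives --config java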

I followed the link below to install Java 6 on CentOS 5 and remove the other JDK/JRE versions, and it is working now.

http://kurinchilamp.kurinchilion.com/2012/11/how-to-install-java-6-on-centos-5-and-remove-other-jdk-jre-version.html
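After removing the extra JDKs, the remaining JDK can be made the default for the build shell before rebuilding. The directory below is only an example of a typical Oracle RPM install location and has to be adjusted to the actual install path:

    # Point JAVA_HOME and the PATH at the remaining JDK 6 install
    # (the exact directory depends on how and where the JDK was installed)
    export JAVA_HOME=/usr/java/jdk1.6.0_35
    export PATH=$JAVA_HOME/bin:$PATH

    # Rebuild Spark from a clean state with the selected JDK
    sbt/sbt clean assembly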

answered 2013-10-25T07:21:46.023