
Compiling a Maven project fails with the following error:

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 --- 
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile 
[INFO] Using incremental compilation 
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes... 
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found. 
[ERROR] with Logging { 
[ERROR] ^ 
[ERROR] one error found 
[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 7.992s 
[INFO] Finished at: Fri Apr 15 17:44:33 CEST 2016 
[INFO] Final Memory: 25M/350M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR]

I removed the <useZincServer>true</useZincServer> property from pom.xml, but the Logging error persists.

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-streaming-flume-sink_2.10 --- 
[INFO] Using incremental compilation 
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes... 
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found. 
[ERROR] with Logging { 
[ERROR] ^ 
[ERROR] one error found 
[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 5.814s 
[INFO] Finished at: Fri Apr 15 17:41:00 CEST 2016 
[INFO] Final Memory: 25M/335M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.

I checked that PATH and JAVA_HOME are defined in ~/.bashrc as follows:

export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

The only problem I noticed is that echo $JAVA_HOME gives an empty output, even though I did source ~/.bashrc.
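As a quick sanity check (a minimal sketch; the JDK path is the one from the question and may differ on other machines), the export can be re-declared and verified directly in the current shell:

```shell
# Re-declare the variables exactly as written in ~/.bashrc (path from the question).
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH="$PATH:$JAVA_HOME/bin"

# If this prints an empty value in a fresh terminal, the export line is likely
# never reached: e.g. it sits below an early `return` guard in ~/.bashrc, or the
# terminal opens a login shell that reads ~/.profile instead of ~/.bashrc.
echo "JAVA_HOME=$JAVA_HOME"
```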

Any help is greatly appreciated.


3 Answers


The problem may be this line: [INFO] Using incremental compilation

Try removing this line from your pom.xml, then retry:

<recompileMode>incremental</recompileMode>
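For reference, that option lives inside the scala-maven-plugin configuration. A hedged sketch of the relevant pom.xml fragment (the plugin coordinates and version are taken from the build log; the surrounding elements are abbreviated and may differ in the real Spark pom):

```xml
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.2.2</version>
  <configuration>
    <!-- Deleting this line reverts the plugin to its default compile mode. -->
    <recompileMode>incremental</recompileMode>
    <useZincServer>true</useZincServer>
  </configuration>
</plugin>
```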

Answered 2018-05-09T01:03:09.213

I think you are compiling Spark with Scala 2.10. If so, you should do the following:

cd /path/to/Spark
./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package

Hope this helps.

Answered 2016-11-20T19:06:40.473

It is strange that echo $JAVA_HOME gives an empty output. While compiling the Spark source, mvn clean package succeeded for me, but when I imported the project into Eclipse I ran into the same problem. I found the solution here: How to solve "Plugin execution not covered by lifecycle configuration" for Spring Data Maven builds.

Answered 2016-08-27T04:50:49.713