I get the following error when running my Spark program with spark-submit.
My Spark cluster is version 2.0.0. I compile my code with sbt; my sbt dependencies are listed below.
libraryDependencies ++= Seq(
"commons-io" % "commons-io" % "2.4",
"com.google.guava" % "guava" % "19.0",
"jfree" % "jfreechart" % "1.0.13",
("org.deeplearning4j" % "deeplearning4j-core" % "0.5.0").exclude("org.slf4j", "slf4j-log4j12"),
"org.jblas" % "jblas" % "1.2.4",
"org.nd4j" % "canova-nd4j-codec" % "0.0.0.15",
"org.nd4j" % "nd4j-native" % "0.5.0" classifier "" classifier "linux-x86_64",
"org.deeplearning4j" % "dl4j-spark" % "0.4-rc3.6" ,
"org.apache.spark" % "spark-sql_2.10" % "1.3.1",
"org.apache.spark" % "spark-hive_2.10" % "1.3.1",
"org.apache.hive" % "hive-serde" % "0.14.0",
("org.deeplearning4j" % "arbiter-deeplearning4j" % "0.5.0"))
16/11/14 22:57:03 INFO hive.HiveSharedState: Warehouse path is 'file:/home/hduser/spark-warehouse'.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;
at poc.common.utilities.StockData$.fetchStockData(StockData.scala:15)
at poc.analaticsEngine.AnalaticsStockWorkBench.fetchTrainingDataSet(AnalaticsStockWorkBench.scala:69)
at poc.analaticsEngine.AnalaticsStockWorkBench.trainModel(AnalaticsStockWorkBench.scala:79)
at test.poc.analatics.StockPrediction$.testTrainSaveModel(StockPrediction.scala:21)
at test.poc.analatics.StockPrediction$.main(StockPrediction.scala:10)
at test.poc.analatics.StockPrediction.main(StockPrediction.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/14 22:57:03 INFO spark.SparkContext: Invoking stop() from shutdown hook