I wrote a Spark job whose main purpose is to write to Elasticsearch. The problem is that when I submit it to the Spark cluster (on YARN), Spark fails with:
    [ERROR] [org.apache.spark.deploy.yarn.ApplicationMaster] User class threw exception: java.lang.AbstractMethodError: org.elasticsearch.spark.sql.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
    java.lang.AbstractMethodError: org.elasticsearch.spark.sql.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
        at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:472)
        at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:48)
        at org.apache.spark.sql.execution…
However, if I submit the same job with master local[2], it runs fine. Strangely, both runs use the same jars and the same environment. I am using elasticsearch-spark-20_2.11:5.5.0 with Spark 2.2.
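For context, this is roughly the kind of write that triggers the failing code path (a minimal sketch; the index name, node address, and sample data below are placeholders I have assumed, not details from the original job):

```scala
import org.apache.spark.sql.SparkSession

object WriteToEs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("write-to-es")
      // Placeholder Elasticsearch connection settings
      .config("es.nodes", "es-host")
      .config("es.port", "9200")
      .getOrCreate()

    import spark.implicits._
    val df = Seq(("1", "hello"), ("2", "world")).toDF("id", "text")

    // The elasticsearch-spark connector registers the "org.elasticsearch.spark.sql"
    // data source. Saving through it is what ends up calling
    // DefaultSource.createRelation, where the AbstractMethodError above is thrown
    // when the connector was compiled against a different Spark data-source API.
    df.write
      .format("org.elasticsearch.spark.sql")
      .mode("append")
      .save("my-index/doc")

    spark.stop()
  }
}
```

An AbstractMethodError at this boundary usually means the `createRelation` signature the connector implements does not match the one the running Spark version expects, which is why the same jar can behave differently on the cluster than in a local run if the cluster resolves a different Spark or connector jar on its classpath.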