Hi, I'm a newbie to Scala and IntelliJ, and I'm just trying to do this in Scala:

import org.apache.spark
import org.apache.spark.sql.SQLContext
import com.databricks.spark.xml.XmlReader


object SparkSample {
  def main(args: Array[String]): Unit = {
    val conf = new spark.SparkConf()
    conf.setAppName("Datasets Test")
    conf.setMaster("local[2]")
    val sc = new spark.SparkContext(conf)

    val sqlContext = new SQLContext(sc)
    val df = sqlContext.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "shop")
      .load("shops.xml") /* NoSuchMethod error here */

    val selectedData = df.select("author", "_id")
    df.show()
  }
}

Basically, I'm trying to convert XML into a Spark DataFrame, but I get a NoSuchMethod error at '.load("shops.xml")'. Below is my SBT:

version := "0.1"

scalaVersion := "2.11.3"
val sparkVersion = "2.0.0" 
val sparkXMLVersion = "0.3.3"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion exclude("jline", "2.12"),
  "org.apache.spark" %% "spark-sql"  % sparkVersion excludeAll(ExclusionRule(organization = "jline"), ExclusionRule("name", "2.12")),
  "com.databricks"   %% "spark-xml"  % sparkXMLVersion
)

Here is the stack trace:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.types.DecimalType$.Unlimited()Lorg/apache/spark/sql/types/DecimalType;
  at com.databricks.spark.xml.util.InferSchema$.<init>(InferSchema.scala:50)
  at com.databricks.spark.xml.util.InferSchema$.<clinit>(InferSchema.scala)
  at com.databricks.spark.xml.XmlRelation$$anonfun$1.apply(XmlRelation.scala:46)
  at com.databricks.spark.xml.XmlRelation$$anonfun$1.apply(XmlRelation.scala:46)
  at scala.Option.getOrElse(Option.scala:120)
  at com.databricks.spark.xml.XmlRelation.<init>(XmlRelation.scala:45)
  at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:66)
  at com.databricks.spark.xml.DefaultSource.createRelation(DefaultSource.scala:44)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:315)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:132)

Can someone point out the error? It looks like a dependency issue to me. spark-core seems to work fine, but spark-sql doesn't. I previously had Scala 2.12 but changed to 2.11 because spark-core wouldn't resolve.

1 Answer

tl;dr I think it's a Scala version mismatch. Use spark-xml 0.4.1.

Quoting spark-xml's Requirements (highlighting mine):

This library requires Spark 2.0+ for 0.4.x.

For a version that works with Spark 1.x, please check for branch-0.3.

That tells me that spark-xml 0.3.3 works with Spark 1.x (not Spark 2.0.0, which you requested).
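Under that assumption, bumping the spark-xml version in build.sbt should make the versions line up. The following is a sketch based on the build file in the question (the jline excludes are dropped here for brevity; keep them if your build needs them):

```scala
// build.sbt (sketch): spark-xml 0.4.x targets Spark 2.0+,
// so pair it with Spark 2.0.0 on Scala 2.11
scalaVersion := "2.11.8"

val sparkVersion    = "2.0.0"
val sparkXMLVersion = "0.4.1"  // 0.3.x is compiled against Spark 1.x APIs

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion,
  "com.databricks"   %% "spark-xml"  % sparkXMLVersion
)
```

After reloading the sbt project, the same `.format("com.databricks.spark.xml")` read should work, since 0.4.1 no longer references the Spark 1.x `DecimalType.Unlimited` method that triggered the NoSuchMethodError.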

Answered 2017-10-12T05:35:44.860