
I have written a Spark job:

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
    val sc = new SparkContext(conf)
    val ctx = new org.apache.spark.sql.SQLContext(sc)
    import ctx.implicits._

    case class Person(age: Long, city: String, id: String, lname: String, name: String, sex: String)
    case class Person2(name: String, age: Long, city: String)

    val persons = ctx.read.json("/tmp/persons.json").as[Person]
    persons.printSchema()
  }
}

In the IDE, when I run the main function, I get two errors:

Error:(15, 67) Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing sqlContext.implicits._  Support for serializing other types will be added in future releases.
    val persons = ctx.read.json("/tmp/persons.json").as[Person]
                                                                  ^

Error:(15, 67) not enough arguments for method as: (implicit evidence$1: org.apache.spark.sql.Encoder[Person])org.apache.spark.sql.Dataset[Person].
Unspecified value parameter evidence$1.
    val persons = ctx.read.json("/tmp/persons.json").as[Person]
                                                                  ^

But in the Spark shell I can run this job without any errors. What is the problem?


3 Answers


The error message says that no Encoder can be derived for the Person case class.

Error:(15, 67) Unable to find encoder for type stored in a Dataset.  Primitive types (Int, String, etc) and Product types (case classes) are supported by importing sqlContext.implicits._  Support for serializing other types will be added in future releases.

Move the declaration of the case class Person outside the scope of SimpleApp.
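For illustration, here is a minimal sketch of the question's program with only that change applied (assuming the same Spark 1.x SQLContext API as in the question). Hoisting Person to the top level lets the compiler materialize the TypeTag that the implicit Encoder derivation for case classes requires; a case class declared inside a method cannot get one.

import org.apache.spark.{SparkConf, SparkContext}

// Declared at the top level, outside SimpleApp: the compiler can now
// generate a TypeTag for Person, so Encoder[Person] can be derived.
case class Person(age: Long, city: String, id: String, lname: String, name: String, sex: String)

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
    val sc = new SparkContext(conf)
    val ctx = new org.apache.spark.sql.SQLContext(sc)
    import ctx.implicits._

    val persons = ctx.read.json("/tmp/persons.json").as[Person]
    persons.printSchema()
  }
}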

Answered 2016-01-11T07:02:35.853

You get the same error if you add both sqlContext.implicits._ and spark.implicits._ imports inside SimpleApp (the order does not matter).

Removing one or the other is the solution:

val spark = SparkSession
  .builder()
  .getOrCreate()

val sqlContext = spark.sqlContext
import sqlContext.implicits._ // sqlContext OR spark implicits
//import spark.implicits._   // sqlContext OR spark implicits

case class Person(age: Long, city: String)
val persons = spark.read.json("/tmp/persons.json").as[Person]

Tested with Spark 2.1.0.

Interestingly, if you import the same implicits object twice, you will not have any problem.
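To make the two situations concrete, a minimal sketch (Person, spark, and sqlContext are assumed to be defined as above):

// Ambiguous: two *different* implicits objects each provide an implicit
// Encoder[Person], so the compiler presumably cannot resolve a single one
// and compilation fails with the error from the question.
import sqlContext.implicits._
import spark.implicits._
// val persons = spark.read.json("/tmp/persons.json").as[Person] // does not compile

// Harmless: importing the very same object twice adds no second candidate.
import spark.implicits._
import spark.implicits._
val persons = spark.read.json("/tmp/persons.json").as[Person]   // compiles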

Answered 2017-02-28T14:18:57.980

@Milad Khajavi

Define the Person case class outside of the SimpleApp object. Also, add import sqlContext.implicits._ inside the main() function.
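A sketch combining both pieces of advice, assuming the Spark 2.x SparkSession API rather than the question's SQLContext:

import org.apache.spark.sql.SparkSession

// Case class at the top level, outside SimpleApp.
case class Person(age: Long, city: String, id: String, lname: String, name: String, sex: String)

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Simple Application")
      .master("local")
      .getOrCreate()
    import spark.implicits._ // a single implicits import, inside main()

    val persons = spark.read.json("/tmp/persons.json").as[Person]
    persons.printSchema()
  }
}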

Answered 2018-08-30T07:23:10.303