
I am getting this error when trying to load an Avro file (134 KB in size). I create the Avro file from a protobuf message, and that part works fine. My pom dependencies and the full exception are below.
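For reference, the protobuf-to-Avro conversion looks roughly like the sketch below; MyProtoMessage, the message variable, and the output path are placeholders for my actual generated class and data, and I am using the standard avro-protobuf writer classes.

import java.io.File;
import org.apache.avro.Schema;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.protobuf.ProtobufData;
import org.apache.avro.protobuf.ProtobufDatumWriter;

public class ProtoToAvro {
    // MyProtoMessage stands in for the generated protobuf class.
    public static void write(MyProtoMessage message) throws Exception {
        // Derive the Avro schema from the protobuf class.
        Schema schema = ProtobufData.get().getSchema(MyProtoMessage.class);
        ProtobufDatumWriter<MyProtoMessage> datumWriter =
                new ProtobufDatumWriter<>(MyProtoMessage.class);
        // Write the protobuf message into an Avro container file.
        try (DataFileWriter<MyProtoMessage> fileWriter = new DataFileWriter<>(datumWriter)) {
            fileWriter.create(schema, new File("/tmp/message.avro"));
            fileWriter.append(message);
        }
    }
}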

pom dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-protobuf</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>3.0.0</version>
</dependency>
<dependency>
    <groupId>com.databricks</groupId>
    <artifactId>spark-avro_2.11</artifactId>
    <version>3.0.0</version>
</dependency>
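
The load itself is roughly the following sketch, using the standard spark-avro 3.0 read path; the app name and file path are placeholders.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class AvroLoad {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("AvroLoad")
                .master("local[*]")
                .getOrCreate();

        // The exception below is thrown from this load call,
        // while spark-avro converts the Avro schema to a SQL schema.
        Dataset<Row> df = spark.read()
                .format("com.databricks.spark.avro")
                .load("/tmp/message.avro");
        df.show();
    }
}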

Exception:

Exception in thread "main" java.lang.StackOverflowError
    at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:42)
    at scala.collection.Iterator$class.exists(Iterator.scala:919)
    at scala.collection.AbstractIterator.exists(Iterator.scala:1336)
    at scala.collection.IterableLike$class.exists(IterableLike.scala:77)
    at scala.collection.AbstractIterable.exists(Iterable.scala:54)
    at com.databricks.spark.avro.SchemaConverters$.toSqlType(SchemaConverters.scala:75)
    at com.databricks.spark.avro.SchemaConverters$$anonfun$1.apply(SchemaConverters.scala:56)
    at com.databricks.spark.avro.SchemaConverters$$anonfun$1.apply(SchemaConverters.scala:55)
