
I am using Spark SQL 3.0 with scala_2.12. I insert data into an Iceberg table and read it back from the table successfully. But when I try to delete a single bad record from the table via Spark SQL, the log shows an exception. Apache Iceberg's GitHub issue 1444 indicates that a previous version of Iceberg already supports row-level deletes. Why does my delete fail? The main Iceberg version I am using is 0.10.0; the package org.apache.iceberg.iceberg-hive is version 0.9.1. Please help! My Spark SQL code snippet is:

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public static void deleteSingleDataWithoutCatalog3() {
    // Spark SQL configuration
    SparkConf sparkSQLConf = new SparkConf();
    // 'hadoop_prod' is the name of the catalog used to access the table
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog");
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod.type", "hadoop");
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://hadoop01:9000/warehouse_path/");
    sparkSQLConf.set("spark.sql.sources.partitionOverwriteMode", "dynamic");

    SparkSession spark = SparkSession.builder().config(sparkSQLConf).master("local[2]").getOrCreate();
    // String selectDataSQLALL = "select * from hadoop_prod.xgfying.booksSpark3";
    String deleteSingleDataSQL = "DELETE FROM hadoop_prod.xgfying.booksSpark3 WHERE price=33";
    // spark.sql(deleteSingleDataSQL);
    spark.table("hadoop_prod.xgfying.booksSpark3").show();
    spark.sql(deleteSingleDataSQL);
    spark.table("hadoop_prod.xgfying.booksSpark3").show();
}

When the code runs, the exception message is:

......
Exception in thread "main" java.lang.IllegalArgumentException: Failed to cleanly delete data files matching: ref(name="price") == 33
        at org.apache.iceberg.spark.source.SparkTable.deleteWhere(SparkTable.java:168)
......
Caused by: org.apache.iceberg.exceptions.ValidationException: Cannot delete file where some, but not all, rows match filter ref(name="price") == 33: hdfs://hadoop01:9000/warehouse_path/xgfying/booksSpark3/data/title=Gone/00000-1-9070110f-35f8-4ee5-8047-cca2a1caba1f-00001.parquet
......

1 Answer


I know this is a fairly old question, but I recently ran into a similar problem, and I was able to solve it by adding spark.sql.extensions to the Spark config:

--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions 
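As far as I understand, without these extensions Spark 3 executes DELETE FROM on an Iceberg table as a metadata-only delete that can only drop whole data files, which is why a file in which only some rows match price=33 fails with the ValidationException shown above. If you build the SparkSession programmatically, as in the question, the same setting can go on the SparkConf before the session is created. Here is a minimal sketch reusing the question's own catalog settings (buildSessionWithIcebergExtensions is just an illustrative helper name, and with Iceberg 0.10.0 you may additionally need to move to a release whose Spark 3 extensions implement row-level DELETE):

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public static SparkSession buildSessionWithIcebergExtensions() {
    SparkConf conf = new SparkConf();
    // Enable Iceberg's SQL extensions so DELETE FROM can rewrite the
    // affected data files instead of failing on partial matches.
    conf.set("spark.sql.extensions",
             "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions");
    // Same Hadoop catalog configuration as in the question.
    conf.set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog");
    conf.set("spark.sql.catalog.hadoop_prod.type", "hadoop");
    conf.set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://hadoop01:9000/warehouse_path/");
    return SparkSession.builder().config(conf).master("local[2]").getOrCreate();
}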