
I'm trying to run the Hudi DeltaStreamer on AWS EMR, following the steps in this blog post: https://cwiki.apache.org/confluence/pages/viewrecentblogposts.action?key=HUDI

But when I run the spark-submit command below, I get this error:

 Exception in thread "main" org.apache.hudi.com.beust.jcommander.ParameterException: Was passed main parameter '--table-type' but no main parameter was defined in your arg class
            at org.apache.hudi.com.beust.jcommander.JCommander.initMainParameterValue(JCommander.java:936)
            at org.apache.hudi.com.beust.jcommander.JCommander.parseValues(JCommander.java:752)
            at org.apache.hudi.com.beust.jcommander.JCommander.parse(JCommander.java:340)
            at org.apache.hudi.com.beust.jcommander.JCommander.parse(JCommander.java:319)

The command I ran is as follows:

spark-submit --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  --packages org.apache.spark:spark-avro_2.11:2.4.4 \
  --master yarn --deploy-mode client /usr/lib/hudi/hudi-utilities-bundle.jar \
  --table-type COPY_ON_WRITE \
  --source-ordering-field payment_date \
  --source-class org.apache.hudi.utilities.sources.ParquetDFSSource \
  --target-base-path s3://sakila-db/hudi-payment \
  --target-table hudi-payment \
  --transformer-class org.apache.hudi.utilities.transform.AWSDmsTransformer \
  --payload-class org.apache.hudi.payload.AWSDmsAvroPayload \
  --hoodie-conf hoodie.datasource.write.recordkey.field=order_id,hoodie.datasource.write.partitionpath.field=staff_id,hoodie.deltastreamer.source.dfs.root=s3://sakila-db/sakila/payment

Please help.


1 Answer


Based on the EMR version, the Hudi version currently supported is 0.5.0-incubating. The steps you followed apply to 0.5.1, where the parameter `--table-type` is the new name for what the older 0.5.0 release called `--storage-type`.

If you are on 0.5.0, try rerunning with `--storage-type` instead of `--table-type`.
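Concretely, the submission would look like the sketch below. Everything other than the renamed flag is taken verbatim from the command in the question, so the bucket names, field names, and jar path are the asker's own, not verified values:

```shell
# Same spark-submit as in the question, but with --table-type replaced by
# --storage-type, the name this parameter had in Hudi 0.5.0-incubating.
spark-submit --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  --packages org.apache.spark:spark-avro_2.11:2.4.4 \
  --master yarn --deploy-mode client /usr/lib/hudi/hudi-utilities-bundle.jar \
  --storage-type COPY_ON_WRITE \
  --source-ordering-field payment_date \
  --source-class org.apache.hudi.utilities.sources.ParquetDFSSource \
  --target-base-path s3://sakila-db/hudi-payment \
  --target-table hudi-payment \
  --transformer-class org.apache.hudi.utilities.transform.AWSDmsTransformer \
  --payload-class org.apache.hudi.payload.AWSDmsAvroPayload \
  --hoodie-conf hoodie.datasource.write.recordkey.field=order_id,hoodie.datasource.write.partitionpath.field=staff_id,hoodie.deltastreamer.source.dfs.root=s3://sakila-db/sakila/payment
```

The JCommander error in the question ("Was passed main parameter '--table-type' but no main parameter was defined") is exactly what an unrecognized flag produces: JCommander treats the unknown option as a positional "main" parameter, which the DeltaStreamer config class does not declare.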

Answered 2020-02-03T19:38:40.287