I'm trying to run the Hudi DeltaStreamer on AWS EMR, following the steps in this blog post: https://cwiki.apache.org/confluence/pages/viewrecentblogposts.action?key=HUDI
But when I run the spark-submit command below, I get this error:
Exception in thread "main" org.apache.hudi.com.beust.jcommander.ParameterException: Was passed main parameter '--table-type' but no main parameter was defined in your arg class
at org.apache.hudi.com.beust.jcommander.JCommander.initMainParameterValue(JCommander.java:936)
at org.apache.hudi.com.beust.jcommander.JCommander.parseValues(JCommander.java:752)
at org.apache.hudi.com.beust.jcommander.JCommander.parse(JCommander.java:340)
at org.apache.hudi.com.beust.jcommander.JCommander.parse(JCommander.java:319)
The command I'm running is:
spark-submit --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
--packages org.apache.spark:spark-avro_2.11:2.4.4 \
--master yarn --deploy-mode client /usr/lib/hudi/hudi-utilities-bundle.jar \
--table-type COPY_ON_WRITE --source-ordering-field payment_date --source-class org.apache.hudi.utilities.sources.ParquetDFSSource \
--target-base-path s3://sakila-db/hudi-payment \
--target-table hudi-payment \
--transformer-class org.apache.hudi.utilities.transform.AWSDmsTransformer \
--payload-class org.apache.hudi.payload.AWSDmsAvroPayload \
--hoodie-conf hoodie.datasource.write.recordkey.field=order_id,hoodie.datasource.write.partitionpath.field=staff_id,hoodie.deltastreamer.source.dfs.root=s3://sakila-db/sakila/payment
Any help would be appreciated.