
I am installing the Spark Atlas Connector ( https://github.com/hortonworks-spark/spark-atlas-connector ) via my spark-submit script. Due to security restrictions, I cannot place atlas-application.properties in the spark/conf directory.

I used two options in spark-submit:

--driver-class-path  "spark.driver.extraClassPath=hdfs:///directory_to_properties_files" \
--conf "spark.executor.extraClassPath=hdfs:///directory_to_properties_files" \

When I launch spark-submit, I get this problem:

20/07/20 11:32:50 INFO ApplicationProperties: Looking for atlas-application.properties in classpath
20/07/20 11:32:50 INFO ApplicationProperties: Looking for /atlas-application.properties in classpath
20/07/20 11:32:50 INFO ApplicationProperties: Loading atlas-application.properties from null

1 Answer


Please see the CDP Atlas configuration article:

https://community.cloudera.com/t5/Community-Articles/How-to-pass-atlas-application-properties-configuration-file/ta-p/322158

Client mode:

spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client --driver-java-options="-Datlas.conf=/tmp/" /opt/cloudera/parcels/CDH/jars/spark-examples*.jar 10

Cluster mode:

sudo -u spark spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --files /tmp/atlas-application.properties --conf spark.driver.extraJavaOptions="-Datlas.conf=./" /opt/cloudera/parcels/CDH/jars/spark-examples*.jar 10
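The key idea in both commands is that Atlas reads the `atlas.conf` JVM system property to find the directory holding atlas-application.properties: in cluster mode, `--files` ships the file into each container's working directory, and `-Datlas.conf=./` points the driver JVM there. A minimal sketch of that lookup mechanism (illustrative only, not the actual Atlas `ApplicationProperties` source; the class and property names in the demo file are hypothetical):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;

public class AtlasConfLookup {

    // Sketch of the resolution logic: read the "atlas.conf" system property
    // (what -Datlas.conf=... sets) and load atlas-application.properties
    // from that directory.
    static Properties load() throws IOException {
        String confDir = System.getProperty("atlas.conf", ".");
        Path file = Paths.get(confDir, "atlas-application.properties");
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(file)) {
            props.load(in);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        // Simulate the cluster-mode setup: a properties file in a local
        // directory, with atlas.conf pointing at that directory.
        Path dir = Files.createTempDirectory("atlas-demo");
        Files.write(dir.resolve("atlas-application.properties"),
                "atlas.rest.address=http://localhost:21000\n".getBytes());
        System.setProperty("atlas.conf", dir.toString());
        System.out.println(load().getProperty("atlas.rest.address"));
        // prints: http://localhost:21000
    }
}
```

This also shows why the original `--driver-class-path`/`spark.executor.extraClassPath` attempt with an `hdfs:///` path fails: the classpath lookup needs the properties file on the local filesystem of the JVM, which `--files` plus `-Datlas.conf` provides.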
answered 2021-08-23T06:31:09.247