
I saw that we can use spark-submit --files to add files to the job. It's not a problem if I specify an absolute path, like spark-submit --files /etc/somescript.sh.

But which directory will it search if I just pass spark-submit --files somescript.sh? Is it the current working directory, or somewhere on the classpath?


1 Answer


Spark will look for the specified file in the current working directory, i.e. the directory from which you launch spark-submit.

You can follow the code flow in Spark's GitHub repo:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala#L402

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L2069
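
For illustration, here is a minimal Scala sketch of what that resolution does, modeled on the Utils.resolveURI code linked above (the method name and example paths below are mine, not part of Spark's API): an argument without a URI scheme is treated as a local file and resolved against the working directory of the spark-submit process.

import java.io.File
import java.net.{URI, URISyntaxException}

// Simplified sketch of how a --files argument is resolved
// (modeled on Utils.resolveURI; not the exact Spark code).
def resolveFileArg(path: String): URI = {
  try {
    val uri = new URI(path)
    // If the argument already has a scheme (hdfs://, file://, ...), keep it as-is.
    if (uri.getScheme != null) {
      return uri
    }
  } catch {
    case _: URISyntaxException => // fall through and treat it as a local path
  }
  // No scheme: treat it as a local file and resolve it against the
  // current working directory of the JVM running spark-submit.
  new File(path).getAbsoluteFile.toURI
}

// Example (hypothetical paths), if spark-submit is run from /home/me/jobs:
// resolveFileArg("somescript.sh")      => file:/home/me/jobs/somescript.sh
// resolveFileArg("/etc/somescript.sh") => file:/etc/somescript.sh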

Answered on 2021-08-25T17:33:08.780