
I searched for a long time for a way to trigger a dotnet spark job using the native Spark REST API but couldn't find anything. I eventually found the solution by running spark-submit via the CLI with --master spark://spark:6066, then comparing the driver launch command that was executed on the worker node.

In case this helps anyone else, here is an example POST body (sent with Postman) for the native Spark REST API to trigger a dotnet spark application.

Spark REST API endpoint: http://[localhost or DNS name or IP address]:6066/v1/submissions/create

{
  "action": "CreateSubmissionRequest",
  "appArgs": [
    "dotnet", "/path/to/your/compiled_dotnet_app.dll", "app arg 1", "app arg 2", "etc..."
  ],
  "appResource": "file:/opt/bitnami/spark/jars/microsoft-spark-3-2_2.12-2.1.0.jar",
  "clientSparkVersion": "3.2.1",
  "environmentVariables": {
    "SPARK_ENV_LOADED": "1"
  },
  "mainClass": "org.apache.spark.deploy.dotnet.DotnetRunner",
  "sparkProperties": {
    "spark.driver.supervise": "false",
    "spark.app.name": "org.apache.spark.deploy.dotnet.DotnetRunner",
    "spark.submit.deployMode": "cluster",
    "spark.master": "spark://spark:7077",
    "spark.jars": "file:/opt/bitnami/spark/jars/microsoft-spark-3-2_2.12-2.1.0.jar",
    "spark.submit.pyFiles": ""
  }
}
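If you'd rather not use Postman, the same request can be sent from a script. Here is a minimal sketch using only Python's standard library; the jar path, the Spark master host name ("spark"), and the app arguments are placeholders copied from the JSON above and will need to match your own cluster.

```python
import json
import urllib.request

# Submission body mirroring the Postman example above.
# The jar path, master host ("spark"), and appArgs are placeholders.
payload = {
    "action": "CreateSubmissionRequest",
    "appArgs": [
        "dotnet", "/path/to/your/compiled_dotnet_app.dll", "app arg 1", "app arg 2"
    ],
    "appResource": "file:/opt/bitnami/spark/jars/microsoft-spark-3-2_2.12-2.1.0.jar",
    "clientSparkVersion": "3.2.1",
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    "mainClass": "org.apache.spark.deploy.dotnet.DotnetRunner",
    "sparkProperties": {
        "spark.driver.supervise": "false",
        "spark.app.name": "org.apache.spark.deploy.dotnet.DotnetRunner",
        "spark.submit.deployMode": "cluster",
        "spark.master": "spark://spark:7077",
        "spark.jars": "file:/opt/bitnami/spark/jars/microsoft-spark-3-2_2.12-2.1.0.jar",
        "spark.submit.pyFiles": "",
    },
}

def submit(host: str = "localhost", port: int = 6066) -> dict:
    """POST the submission to the REST endpoint and return the parsed response."""
    req = urllib.request.Request(
        f"http://{host}:{port}/v1/submissions/create",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a reachable Spark master with the REST server enabled):
#   result = submit("localhost", 6066)
#   print(result)  # includes a submissionId you can use to query status
```

On success the response contains a `submissionId`; note that the REST submission server must be enabled on the master (`spark.master.rest.enabled=true` on recent Spark versions).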

HTH

