
I am exploring Argo Workflows with Spark for my use case. Is there any example YAML showing how to execute a Spark job on k8s using Argo Workflows?


2 Answers

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-spark-
spec:
  entrypoint: sparkapp
  templates:
  - name: sparkapp
    container:
      image: sparkimage
      command: [sh]
      args: [
        "-c",
        "/opt/spark/bin/spark-submit --class org.apache.spark.examples.SparkPi /opt/spark/examples/jars/spark-examples_2.11-2.4.0.jar"
      ]
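
Assuming the Argo CLI is installed and the manifest above is saved as hello-spark.yaml (a placeholder file name), it can be submitted and followed with:

argo submit --watch hello-spark.yaml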

Hope this helps!

answered 2019-12-07T06:48:14.187

Here is an example that runs Spark's Pi sample; just substitute the correct values for the image, the class, and the URL of the k8s API:

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  name: wf-spark-pi
  namespace: spark
spec:
  entrypoint: sparkapp
  templates:
  - name: sparkapp
    container:
      image: Spark-Image
      imagePullPolicy: Always
      command: [sh]
      args:
      - /opt/spark/bin/spark-submit 
      - --master 
      - k8s://https://<K8S_API_TCP_ADDR>:<K8S_API_TCP_PORT>
      - --deploy-mode
      - cluster
      - --conf 
      - spark.kubernetes.namespace=spark
      - --conf
      - spark.kubernetes.container.image=Spark-Image
      - --conf
      - spark.kubernetes.driver.pod.name=spark
      - --conf 
      - spark.executor.instances=2
      - --class
      - org.apache.spark.examples.SparkPi
      - local:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar
      resources: {}      
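
Note that spark-submit in cluster mode creates the driver and executor pods itself, so the pod running this workflow step needs a service account that is allowed to manage pods in the spark namespace. A minimal sketch (the names spark-sa, spark-role, and spark-role-binding are assumptions, not part of the answer above) could look like:

# Hypothetical RBAC setup for the spark namespace; adjust names and verbs as needed.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark-sa
  namespace: spark
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: spark-role
  namespace: spark
rules:
- apiGroups: [""]
  resources: ["pods", "services", "configmaps"]
  verbs: ["create", "get", "list", "watch", "delete"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-role-binding
  namespace: spark
subjects:
- kind: ServiceAccount
  name: spark-sa
  namespace: spark
roleRef:
  kind: Role
  name: spark-role
  apiGroup: rbac.authorization.k8s.io

The service account can then be passed to spark-submit with an extra --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa in the args above.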
answered 2020-03-02T23:47:36.143