
I'm using Argo Workflows and want to produce 2 separate output artifacts. With the output artifacts defined as below, it tells me path '/tmp' already mounted in inputs.artifacts.txt. How can I mount 2 separate artifacts to a single directory (in this case /tmp)?

outputs:
  artifacts:
  - name: txt
    path: /tmp
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: test.txt.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0
  - name: total-file-count
    path: /tmp
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: total-file-count.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0

1 Answer


path refers to the full path of the artifact to be written to S3 (not just the directory in which the file can be found).

To write both artifacts to S3, use the full path to each source file. Assuming the file names match the key names, this should work:

outputs:
  artifacts:
  - name: txt
    path: /tmp/test.txt.tgz
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: test.txt.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0
  - name: total-file-count
    path: /tmp/total-file-count.tgz
    s3:
      endpoint: s3.amazonaws.com
      bucket: <My Bucket>
      key: total-file-count.tgz
      accessKeySecret:
        name: vault-data
        key: s3_access_key-0
      secretKeySecret:
        name: vault-data
        key: s3_secret_key-0
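
For context, here is a minimal sketch of the producing template, assuming the step itself creates the two .tgz files with a shell script. The template name, image, and script are illustrative assumptions, and the s3 blocks (identical to those above) are omitted for brevity:

- name: produce-artifacts            # hypothetical template name
  container:
    image: alpine:3.18               # assumed image; any image with a shell works
    command: [sh, -c]
    args:
      - |
        # write each file at the exact path declared under outputs.artifacts
        echo "hello" > /tmp/test.txt
        tar -czf /tmp/test.txt.tgz -C /tmp test.txt
        echo "1" > /tmp/total-file-count.txt
        tar -czf /tmp/total-file-count.tgz -C /tmp total-file-count.txt
  outputs:
    artifacts:
    - name: txt
      path: /tmp/test.txt.tgz
      archive:
        none: {}                     # the file is already a .tgz, so skip Argo's default tar+gzip archiving
      # s3 block as in the answer above, omitted for brevity
    - name: total-file-count
      path: /tmp/total-file-count.tgz
      archive:
        none: {}
      # s3 block as in the answer above, omitted for brevity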
answered 2021-01-12T15:50:41.957