I am using GCP Composer to run an algorithm, and at the end of the flow I want to run a task that performs several operations: copying files and folders from a volume to a bucket and then deleting them. I am trying to do these copy and delete operations through a KubernetesPodOperator. I'm having a hard time finding the right way to run multiple commands using `cmds`; I also tried combining `cmds` with `arguments`. Here is my KubernetesPodOperator and the `cmds`/`arguments` combinations I've tried:
post_algo_run = kubernetes_pod_operator.KubernetesPodOperator(
    task_id="multi-coher-post-operations",
    name="multi-coher-post-operations",
    namespace="default",
    image="google/cloud-sdk:alpine",

    ### doesn't work ###
    cmds=["gsutil", "cp", "/data/splitter-output\*.csv", "gs://my_bucket/data", "&", "gsutil", "rm", "-r", "/input"],
    # Error:
    # [2022-01-27 09:31:38,407] {pod_manager.py:197} INFO - CommandException: Destination URL must name a directory, bucket, or bucket
    # [2022-01-27 09:31:38,408] {pod_manager.py:197} INFO - subdirectory for the multiple source form of the cp command.
    ####################

    ### doesn't work ###
    # cmds=["gsutil", "cp", "/data/splitter-output\*.csv", "gs://my_bucket/data ;", "gsutil", "rm", "-r", "/input"],
    # [2022-01-27 09:34:06,865] {pod_manager.py:197} INFO - CommandException: Destination URL must name a directory, bucket, or bucket
    # [2022-01-27 09:34:06,866] {pod_manager.py:197} INFO - subdirectory for the multiple source form of the cp command.
    ####################

    ### only performs the first command - only copying ###
    # cmds=["bash", "-cx"],
    # arguments=["gsutil cp /data/splitter-output\*.csv gs://my_bucket/data", "gsutil rm -r /input"],
    # [2022-01-27 09:36:09,164] {pod_manager.py:197} INFO - + gsutil cp '/data/splitter-output*.csv' gs://my_bucket/data
    # [2022-01-27 09:36:11,200] {pod_manager.py:197} INFO - Copying file:///data/splitter-output\Coherence Results-26-Jan-2022-1025Part1.csv [Content-Type=text/csv]...
    # [2022-01-27 09:36:11,300] {pod_manager.py:197} INFO - / [0 files][  0.0 B/ 93.0 KiB]
    # / [1 files][ 93.0 KiB/ 93.0 KiB]
    # [2022-01-27 09:36:11,302] {pod_manager.py:197} INFO - Operation completed over 1 objects/93.0 KiB.
    # [2022-01-27 09:36:12,317] {kubernetes_pod.py:459} INFO - Deleting pod: multi-coher-post-operations.d66b4c91c9024bd289171c4d3ce35fdd
    ####################

    volumes=[
        Volume(
            name="nfs-pvc",
            configs={
                "persistentVolumeClaim": {"claimName": "nfs-pvc"}
            },
        )
    ],
    volume_mounts=[
        VolumeMount(
            name="nfs-pvc",
            mount_path="/data/",
            sub_path=None,
            read_only=False,
        )
    ],
)
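For context on the third attempt: with `cmds=["bash", "-cx"]` and several strings in `arguments`, bash treats only the first argument as the script to run and the remaining ones as positional parameters (`$0`, `$1`, ...), which would explain why only the copy executed. A common workaround (a sketch, not something from the question itself) is to join all the shell commands into a single string before handing it to `bash -c`:

```python
# Sketch: build one shell string so `bash -c` runs every command.
# `&&` stops the chain at the first failure; use ";" to always run both.
commands = [
    "gsutil cp /data/splitter-output*.csv gs://my_bucket/data",
    "gsutil rm -r /input",
]
script = " && ".join(commands)

# Hypothetical operator config (same fields as above, other args omitted):
# post_algo_run = kubernetes_pod_operator.KubernetesPodOperator(
#     ...
#     cmds=["bash", "-c"],
#     arguments=[script],
# )
print(script)
```

Note that the glob is unescaped here: inside `bash -c`, the shell expands `*.csv` itself, so the backslash from the original attempts should not be needed.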