My BigQuery connectors are all working, but I'd like to schedule some scripts that live in existing Docker containers on Cloud Composer instead of App Engine Flexible.
I have the following script, which seems to follow the examples I've been able to find:
import datetime

from airflow import DAG
from airflow.operators.docker_operator import DockerOperator

yesterday = datetime.datetime.combine(
    datetime.datetime.today() - datetime.timedelta(1),
    datetime.datetime.min.time())

default_args = {
    # Setting start date as yesterday starts the DAG immediately
    'start_date': yesterday,
    # If a task fails, retry it once after waiting at least 5 minutes
    'retries': 1,
    'retry_delay': datetime.timedelta(minutes=5),
}

schedule_interval = '45 09 * * *'

dag = DAG('xxx-merge', default_args=default_args,
          schedule_interval=schedule_interval)

hfan = DockerOperator(
    task_id='hfan',
    image='gcr.io/yyyyy/xxxx',
    dag=dag,  # attach the task to the DAG
)
…but when I try to run it, the web UI tells me:
Broken DAG: [/home/airflow/gcs/dags/xxxx.py] No module named docker
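My first guess is that DockerOperator needs the docker PyPI package available in the Composer environment, and that it isn't installed by default. If that's all it is, I assume something along these lines would pull it in (the environment name, location, and pinned version below are placeholders, not what I actually ran):

gcloud composer environments update my-composer-env \
    --location us-central1 \
    --update-pypi-package docker==2.7.0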
Is Docker not configured to work inside the Kubernetes cluster that Cloud Composer runs on, or am I just missing something in my syntax?
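Or, if DockerOperator simply isn't supported there, is KubernetesPodOperator the intended way to run a container on Composer's cluster? A minimal sketch of what I'd try instead, assuming the contrib import path for my Airflow version (the pod name and namespace are guesses on my part):

from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

# Runs the container as a pod in the GKE cluster backing Composer,
# rather than via a Docker daemon on the Airflow worker.
hfan = KubernetesPodOperator(
    task_id='hfan',
    name='hfan',          # pod name inside the cluster (assumed)
    namespace='default',  # assumed namespace
    image='gcr.io/yyyyy/xxxx',
    dag=dag,
)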