➜ k get pods -n edna
NAME READY STATUS RESTARTS AGE
airflow-79d5f59644-dd4k7 1/1 Running 0 16h
airflow-worker-67bcf7844b-rq7r8 1/1 Running 0 22h
backend-65bcb6546-wvvqj 1/1 Running 0 2d16h
So Airflow running in the airflow-79d5f59644-dd4k7 pod (the webserver) tries to pull task logs from the Airflow worker (Celery/Python, which runs a small Flask-based web server that serves the logs), and it fails because the hostname airflow-worker-67bcf7844b-rq7r8 does not resolve inside airflow-79d5f59644-dd4k7:
*** Log file does not exist: /usr/local/airflow/logs/hello_world/hello_task/2020-07-14T22:05:12.123747+00:00/1.log
*** Fetching from: http://airflow-worker-67bcf7844b-rq7r8:8793/log/hello_world/hello_task/2020-07-14T22:05:12.123747+00:00/1.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='airflow-worker-67bcf7844b-rq7r8', port=8793): Max retries exceeded with url: /log/hello_world/hello_task/2020-07-14T22:05:12.123747+00:00/1.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fd37d6a9790>: Failed to establish a new connection: [Errno -2] Name or service not known'))
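For context on where that URL comes from: the webserver builds it from the hostname the worker recorded for the task instance plus the worker log-server port. In the Airflow 1.10 series the relevant airflow.cfg settings are roughly the following (defaults shown, so take this as a sketch rather than my exact config):

[core]
# Callable used to decide the hostname a task records in the metadata DB;
# the default reports the pod's own (random, non-resolvable) name.
hostname_callable = socket:getfqdn

[celery]
# Port of the small Flask log server each Celery worker runs.
worker_log_server_port = 8793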
How can I make this work?
I know Airflow supports remote logging to S3, but is there a way to route these requests given the pods' random hostnames?
I created a NodePort Service, but Airflow knows nothing about the Service's DNS name and keeps trying to reach the logs via the Airflow worker's hostname (as reported by Celery).
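For reference, the Service looks roughly like this (a reconstructed sketch; I'm not pasting the exact manifest, and the name, port and selector here are assumptions based on the label and port that appear elsewhere in this post):

apiVersion: v1
kind: Service
metadata:
  name: edna-airflow-worker
  namespace: edna
spec:
  type: NodePort
  selector:
    app: edna-airflow-worker
  ports:
    - name: log-server
      port: 8793
      targetPort: 8793

Even with that Service in place, the log fetch never goes through it, because the webserver only knows the hostname Celery reported. Here is the problem reproduced from inside the cluster: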
➜ k get pods -n edna
NAME READY STATUS RESTARTS AGE
airflow-79d5f59644-dd4k7 1/1 Running 0 16h
airflow-worker-67bcf7844b-rq7r8 1/1 Running 0 22h
backend-65bcb6546-wvvqj 1/1 Running 0 2d17h
kubectl get pods -n edna -l app=edna-airflow-worker \
-o go-template='{{range .items}}{{.status.podIP}}{{"\n"}}{{end}}'
10.0.101.120
Exec into the airflow-79d5f59644-dd4k7 pod:
k exec -ti -n edna airflow-79d5f59644-dd4k7 bash
[DEV] airflow-79d5f59644-dd4k7 app # curl -L http://airflow-worker-67bcf7844b-rq7r8:8793/log/hello_world/hello_task/2020-07-14T21:59:01.400678+00:00/1.log
curl: (6) Could not resolve host: airflow-worker-67bcf7844b-rq7r8; Unknown error
[DEV] airflow-79d5f59644-dd4k7 app # curl -L http://10.0.101.120:8793/log/hello_world/hello_task/2020-07-14T21:59:01.400678+00:00/1.log
[2020-07-14 21:59:07,257] {{taskinstance.py:669}} INFO - Dependencies all met for <TaskInstance: hello_world.hello_task 2020-07-14T21:59:01.400678+00:00 [queued]>
[2020-07-14 21:59:07,341] {{taskinstance.py:669}} INFO - Dependencies all met for <TaskInstance: hello_world.hello_task 2020-07-14T21:59:01.400678+00:00 [queued]>
[2020-07-14 21:59:07,342] {{taskinstance.py:879}} INFO -
--------------------------------------------------------------------------------
[2020-07-14 21:59:07,342] {{taskinstance.py:880}} INFO - Starting attempt 1 of 1
[2020-07-14 21:59:07,342] {{taskinstance.py:881}} INFO -
--------------------------------------------------------------------------------
[2020-07-14 21:59:07,348] {{taskinstance.py:900}} INFO - Executing <Task(PythonOperator): hello_task> on 2020-07-14T21:59:01.400678+00:00
[2020-07-14 21:59:07,351] {{standard_task_runner.py:53}} INFO - Started process 5795 to run task
[2020-07-14 21:59:07,912] {{logging_mixin.py:112}} INFO - Running %s on host %s <TaskInstance: hello_world.hello_task 2020-07-14T21:59:01.400678+00:00 [running]> airflow-worker-67bcf7844b-rq7r8
[2020-07-14 21:59:07,989] {{logging_mixin.py:112}} INFO - Hello world! This is really cool!
[2020-07-14 21:59:07,989] {{python_operator.py:114}} INFO - Done. Returned value was: Hello world! This is really cool!
[2020-07-14 21:59:08,161] {{taskinstance.py:1065}} INFO - Marking task as SUCCESS.dag_id=hello_world, task_id=hello_task, execution_date=20200714T215901, start_date=20200714T215907, end_date=20200714T215908
[2020-07-14 21:59:17,070] {{logging_mixin.py:112}} INFO - [2020-07-14 21:59:17,070] {{local_task_job.py:103}} INFO - Task exited with return code 0
[DEV] airflow-79d5f59644-dd4k7 app #
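So the worker's log server is reachable by pod IP but not by the pod name Celery reports. One direction I'm considering (untested sketch; the module name pod_ip is made up, and it assumes the file sits somewhere importable by both the webserver and the workers) is pointing hostname_callable at a callable that reports the pod IP instead of the pod name, so the URL the webserver builds actually connects:

# pod_ip.py -- hypothetical helper module, placed somewhere on Airflow's PYTHONPATH
import socket

def get_pod_ip():
    # Resolve this pod's own IP (the pod's /etc/hosts maps its hostname to its IP)
    return socket.gethostbyname(socket.gethostname())

and in airflow.cfg (1.10 uses the module:callable syntax):

[core]
hostname_callable = pod_ip:get_pod_ip

Is that a reasonable approach, or is there a cleaner way to let the webserver resolve or route to the worker pods?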