
I recently upgraded Airflow to the newer available version, 2.2.1, and many users are now getting strange errors when using Airflow.

After refreshing 2-3 times, I can see the webserver UI page, but it breaks on the first action every time.

I am using the LocalExecutor for this. All configuration is the same as what I have been using in production for the past year.

There are two differences in this installation:

  1. Installed using the Python pip command instead of the standard setup, and run as a service.
  2. A different (upgraded) version: 2.2.1
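For context, the pip-based install and service setup described above would look roughly like the following sketch. The exact user, paths, and Python version are assumptions based on the traceback below (user `hadoop`, Python 3.7), not details stated in the report:

```shell
# Install Airflow 2.2.1 into the user's home with pip, using the official
# constraints file for a reproducible dependency set (path/user assumed)
python3 -m pip install --user "apache-airflow==2.2.1" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.2.1/constraints-3.7.txt"

# Run the webserver as a systemd service rather than in a terminal,
# e.g. with a unit file containing something like:
#   [Service]
#   User=hadoop
#   ExecStart=/home/hadoop/.local/bin/airflow webserver
```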
Something bad has happened.

Airflow is used by many users, and it is very likely that others had similar problems and you can easily find
a solution to your problem.

Consider following these steps:

  * gather the relevant information (detailed logs with errors, reproduction steps, details of your deployment)

  * find similar issues using:
     * GitHub Discussions
     * GitHub Issues
     * Stack Overflow
     * the usual search engine you use on a daily basis

  * if you run Airflow on a Managed Service, consider opening an issue using the service support channels

  * if you tried and have difficulty with diagnosing and fixing the problem yourself, consider creating a bug report.
    Make sure however, to include all relevant details and results of your investigation so far.

Python version: 3.7.10
Airflow version: 2.2.1
Node: ip-1-2-3-4-my-ip-here
-------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/auth.py", line 51, in decorated
    return func(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/decorators.py", line 109, in view_func
    return f(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/decorators.py", line 72, in wrapper
    return f(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/views.py", line 2316, in tree
    data = self._get_tree_data(dag_runs, dag, base_date, session=session)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/views.py", line 2173, in _get_tree_data
    for ti in dag.get_task_instances(start_date=min_date, end_date=base_date, session=session)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 67, in wrapper
    return func(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/models/dag.py", line 1339, in get_task_instances
    .join(TaskInstance.dag_run)
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2396, in join
    from_joinpoint=from_joinpoint,
  File "<string>", line 2, in _join
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/orm/base.py", line 227, in generate
    fn(self, *args[1:], **kw)
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2577, in _join
    "been joined to; skipping" % prop
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 1380, in warn
    warnings.warn(msg, exc.SAWarning, stacklevel=2)
  File "/usr/lib64/python3.7/warnings.py", line 110, in _showwarnmsg
    msg.file, msg.line)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/settings.py", line 116, in custom_show_warning
    write_console.print(msg, soft_wrap=True)
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 1615, in print
    self._buffer.extend(new_segments)
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 825, in __exit__
    self._exit_buffer()
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 784, in _exit_buffer
    self._check_buffer()
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 1872, in _check_buffer
    self.file.write(text)
OSError: [Errno 5] Input/output error
