I am running a papermill command from Airflow (in Docker). The notebook is stored on S3, and I execute it using papermill's Python client. It fails with a completely incomprehensible error:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/ipython_genutils/ipstruct.py", line 132, in __getattr__
result = self[key]
KeyError: 'kernelspec'
I tried looking through the documentation, but to no avail.
The code I am using to run the papermill command is:
import time
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

from mypackage.datastore import db
from mypackage.workflow.transform.jupyter_notebook import run_jupyter_notebook

dag_id = "jupyter-test-dag"

default_args = {
    'owner': "aviral",
    'depends_on_past': False,
    'start_date': "2019-02-28T00:00:00",
    'email': "aviral@some_org.com",
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5),
    'provide_context': True
}

dag = DAG(
    dag_id,
    catchup=False,
    default_args=default_args,
    schedule_interval=None,
    max_active_runs=1
)

def print_context(ds, **kwargs):
    print(kwargs)
    print(ds)
    return 'Whatever you return gets printed in the logs'

def run_python_jupyter(**kwargs):
    run_jupyter_notebook(
        script_location=kwargs["script_location"]
    )

create_job_task = PythonOperator(
    task_id="create_job",
    python_callable=run_python_jupyter,
    dag=dag,
    op_kwargs={
        "script_location": "s3://some_bucket/python3_file_write.ipynb"
    }
)

globals()[dag_id] = dag
The function run_jupyter_notebook is:
import papermill as pm

def run_jupyter_notebook(**kwargs):
    """Runs a Jupyter notebook with papermill."""
    script_location = kwargs.get('script_location', '')
    if not script_location:
        raise ValueError(
            "Script location was not provided."
        )
    pm.execute_notebook(
        script_location,
        script_location.split('.ipynb')[0] + "_output" + ".ipynb"
    )
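For what it's worth, the KeyError suggests papermill cannot find a kernelspec entry in the notebook's metadata (a notebook saved locally through Jupyter usually has one, which may be why the local run works). A minimal sketch to verify this on a downloaded copy of the notebook, using a hypothetical helper has_kernelspec, would be:

```python
import json
import tempfile

def has_kernelspec(notebook_path):
    """Return True if the notebook JSON carries metadata.kernelspec."""
    with open(notebook_path) as f:
        nb = json.load(f)
    return "kernelspec" in nb.get("metadata", {})

# Demo: a notebook file saved without kernel metadata fails the check.
bare_nb = {"cells": [], "metadata": {}, "nbformat": 4, "nbformat_minor": 2}
with tempfile.NamedTemporaryFile("w", suffix=".ipynb", delete=False) as f:
    json.dump(bare_nb, f)
    path = f.name

print(has_kernelspec(path))  # → False
```

If the check comes back False for the S3 copy, re-saving the notebook from Jupyter (or passing an explicit kernel name to papermill) would be the next thing to try.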
I expected the code to run without any errors, since the same notebook runs fine locally (using a local filesystem path instead of the S3 path).