
I am trying to find a solution to move files from an S3 bucket to a Snowflake internal stage (not directly into a table) with Airflow, but it seems that the PUT command is not supported by the current Snowflake operator.

I know there are other options like Snowpipe, but I want to showcase Airflow's capabilities. COPY INTO is also an alternative, but I want to load DDL statements from files rather than run them manually in Snowflake.

This is the closest I could find, but it uses COPY INTO a table:

https://artemiorimando.com/2019/05/01/data-engineering-using-python-airflow/

Also: How to call snowsql client from python

Is there any way to move files from an S3 bucket to a Snowflake internal stage through Airflow + Python + SnowSQL?
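
For reference, the kind of glue I had in mind (following the "call snowsql from python" link above) is shelling out to the SnowSQL CLI from a PythonOperator. A minimal sketch, assuming SnowSQL is installed on the worker and a named connection "my_conn" exists in ~/.snowsql/config; the stage name and path are placeholders:

    import subprocess

    def put_to_internal_stage(local_path: str, stage: str = "@my_internal_stage") -> None:
        # PUT uploads a local file into a Snowflake internal stage.
        query = f"PUT file://{local_path} {stage} AUTO_COMPRESS=TRUE;"
        # -c selects the named connection, -q runs a single query and exits;
        # check=True fails the Airflow task if SnowSQL returns non-zero.
        subprocess.run(["snowsql", "-c", "my_conn", "-q", query], check=True)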

Thanks!


1 Answer


I would suggest executing the COPY INTO command from Airflow to load the files directly from S3. There is no good way to get files from S3 into an internal stage without hopping the files through another machine (such as the Airflow machine): you would GET the files from S3 down to local, then PUT them from local to the internal stage, and the only way to execute a PUT to an internal stage is through SnowSQL.
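
A minimal sketch of that recommendation, assuming Airflow's SnowflakeOperator (provider package), a Snowflake connection id of "snowflake_default", and an external stage already defined over the S3 bucket; all object names here are placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="s3_to_snowflake_copy_into",
        start_date=datetime(2020, 5, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        # COPY INTO pulls the files straight from the S3-backed stage,
        # so nothing has to pass through the Airflow machine.
        copy_into = SnowflakeOperator(
            task_id="copy_into_table",
            snowflake_conn_id="snowflake_default",
            sql="""
                COPY INTO my_db.my_schema.my_table
                FROM @my_db.my_schema.my_s3_stage/path/
                FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            """,
        )

If the files really must land in the internal stage, the two-hop route would be a task that downloads from S3 to the worker (e.g. with boto3) and then runs the same kind of SnowSQL PUT call sketched in the question above.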

answered 2020-05-12T19:02:47.587