I am trying to unload data from Snowflake to GCS, and then load it from GCS into a BigQuery table. This is the code that unloads the data from Snowflake:
```sql
copy into @unload_stage/FF_TBL_UNLOAD20200906.csv.gz from
(
select *
from SF_DB.SF_TBLS.FF_TBL_UNLOAD
)
file_format = (
type=csv compression='gzip'
FIELD_DELIMITER = '|'
field_optionally_enclosed_by='"'
NULL_IF=()
EMPTY_FIELD_AS_NULL = TRUE
)
single = false
max_file_size=5300000000
header = false;
```
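To check whether the unloaded files really have a consistent column count, a quick sanity check on a local copy can help. This is only a sketch: the sample file is illustrative, and a naive `awk` count treats every `|` as a delimiter, so quoted fields containing `|` or embedded newlines will show up as a "wrong" count, which is itself a useful signal for the kind of error BigQuery is reporting.

```shell
# Sketch: report the distinct per-line field counts in a pipe-delimited file.
# A quoted field containing '|' or an embedded newline will produce an
# unexpected count here, hinting at why BigQuery sees too few columns.
printf 'a|b|c\nx|y|z\nonly|two\n' > sample.csv   # illustrative sample data
awk -F'|' '{ print NF }' sample.csv | sort -n | uniq -c
```

For the real files the same `awk` pipeline can be run on `zcat` output of a downloaded `.gz` part file.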
Then I use the following script to copy the data from GCS into BigQuery:
```shell
#!/bin/bash
date=20200906
echo "Removing any existing FF_TBL_UNLOAD list for date $date...."
rm -f /home/varma/FF_TBL_UNLOADlist"$date".txt
echo "Listing filenames for FF_TBL_UNLOAD in GCS bucket...."
gsutil ls gs://syw_hs_inbound_outbound_data/FF_TBL_UNLOAD"$date"*.gz >> /home/varma/FF_TBL_UNLOADlist"$date".txt
echo "Starting Uploading Data into table from FF_TBL_UNLOAD$date list..."
if [ -s /home/varma/FF_TBL_UNLOADlist"$date".txt ]
then
while IFS= read -r line
do
echo "Uploading data for file $line"
bq load --noreplace --field_delimiter="|" hty-ttw-analysis:out.FF_TBL_UNLOAD "$line"
done < "/home/varma/FF_TBL_UNLOADlist${date}.txt"
else
echo "File is Empty"
fi
```
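For context, the load above relies on the `bq load` defaults for the quote character and does not allow newlines inside quoted fields, while the Snowflake unload uses `field_optionally_enclosed_by='"'`. A variant of the load step worth trying (a sketch only; the flag names are from the bq CLI, and the project, dataset, and table are the same as above) would declare those settings explicitly:

```shell
# Sketch: same load, but explicitly declaring the CSV quote character and
# allowing newlines inside quoted fields -- a common cause of
# "contains only N columns" errors when source rows embed '\n'.
bq load --noreplace \
    --source_format=CSV \
    --field_delimiter="|" \
    --quote='"' \
    --allow_quoted_newlines \
    hty-ttw-analysis:out.FF_TBL_UNLOAD "$line"
```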
It works for all tables except this one. The error I get is: Error while reading data, error message: CSV table references column position 174, but line starting at position: 136868 contains only 94 columns.
Can anyone help me resolve this error? Should I change the file format, or should I make changes to the script that loads the data into BigQuery?