I have a Python script that runs a gbq job to load a CSV file f into a table on BigQuery. All of the data ends up written into a single column, but I want it loaded into the individual columns. I tried autodetect, but that didn't help either. My csv:
id,first_name,username,last_name,chat_username,chat_id,forward_date,message_text
231125223~Just~koso~swissborg_bounty~-1001368946079~1517903147~test
481895079~Emerson~EmersonEmory~swissborg_bounty~-1001368946079~1517904387~picture
316560356~Ken Sam~ICOnomix~swissborg_bounty~-1001368946079~1517904515~Today
Here is my code:
import os

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '***.json'
os.environ['GOOGLE_CLOUD_DISABLE_GRPC'] = 'True'

from google.cloud import bigquery

dataset_name = 'test_temporary_dataset'
table_name = 'table_telega'

bigquery_client = bigquery.Client()
dataset = bigquery_client.dataset(dataset_name)
table = dataset.table(table_name)
#table.reload()

job_config = bigquery.LoadJobConfig()
job_config.source_format = 'text/csv'
job_config.skip_leading_rows = 1
job_config.autodetect = True
job_config.allow_jagged_rows = True
job_config.allow_quoted_newlines = True
job_config.fieldDelimiter = '~'

with open('tele2.csv', 'rb') as source_file:
    #job = table.upload_from_file(source_file, source_format='text/csv')
    job = bigquery_client.load_table_from_file(source_file, table, job_config=job_config)
    job.result()
How do I load the CSV correctly, so that each field goes into its own column?
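For reference, here is a minimal sketch of the same load using the snake_case configuration attributes the google-cloud-bigquery Python client exposes (field_delimiter rather than fieldDelimiter, and the SourceFormat.CSV constant rather than a MIME type). One likely cause of the single-column result is that a misspelled attribute never reaches the load job, so BigQuery falls back to splitting on the default comma. Dataset, table, and file names are reused from the snippet above.

from google.cloud import bigquery

bigquery_client = bigquery.Client()
table_ref = bigquery_client.dataset('test_temporary_dataset').table('table_telega')

job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.CSV   # the client expects 'CSV', not a MIME type
job_config.skip_leading_rows = 1
job_config.autodetect = True
job_config.allow_jagged_rows = True
job_config.allow_quoted_newlines = True
job_config.field_delimiter = '~'   # snake_case attribute recognized by LoadJobConfig

with open('tele2.csv', 'rb') as source_file:
    job = bigquery_client.load_table_from_file(source_file, table_ref, job_config=job_config)
    job.result()   # wait for the load job to finish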