Please try the following working example, taken from the datalab sample notebooks repository:
import datalab.bigquery as bq
import datalab.storage as storage
import pandas as pd
from StringIO import StringIO
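# Note: StringIO is the Python 2 module name; on Python 3 use "from io import StringIO" instead.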
# Create the schema, conveniently using a DataFrame example.
%storage read --object gs://cloud-datalab-samples/cars.csv --variable cars
df = pd.read_csv(StringIO(cars))
schema = bq.Schema.from_dataframe(df)
# Create the dataset
bq.Dataset('sample').create()
# Create the table
sample_table = bq.Table('sample.cars').create(schema=schema, overwrite=True)
# Load csv file from GCS to Google BigQuery
sample_table.load('gs://cloud-datalab-samples/cars.csv', mode='append',
                  source_format='csv', csv_options=bq.CSVOptions(skip_leading_rows=1))
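Once the load job finishes, you can read the table back to confirm the rows landed. This is only a minimal sketch and assumes your datalab version exposes Table.to_dataframe(); if it does not, run a query against sample.cars instead.

# Hypothetical verification step: pull the loaded table back into a pandas DataFrame
loaded = sample_table.to_dataframe()
print(loaded.head())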
Note: if you are running an older version of datalab, you may have to use import gcp.bigquery as bq
instead of import datalab.bigquery as bq.
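If the same notebook has to run against both module layouts, one option is a guarded import. This is just a sketch and assumes the legacy gcp.bigquery module exposes the same interface used above:

# Prefer the current module name, fall back to the legacy one on older datalab installs
try:
    import datalab.bigquery as bq
except ImportError:
    import gcp.bigquery as bq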
You may also find additional error message details on the Job History page of the BigQuery console.