
Every 2-3 minutes I write a new log file of pipe-separated values from my web server to a Google Cloud Storage bucket. I now have thousands of ~1 MB files in a single bucket and want to load all of them into a BigQuery table.

The "bq load" command seems to require individual files, and can't take an entire bucket, or bucket with prefix.

What's the best way to load thousands of files from a GCS bucket? Do I really have to get the URI of every single file, rather than just specifying the bucket name, or bucket and prefix, to BigQuery?


1 Answer


You can use glob-style wildcards, e.g. gs://bucket/prefix*.txt
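For instance, a single load job for the pipe-delimited logs described in the question might look like the sketch below. The dataset, table, bucket path, and schema are placeholder assumptions, not from the original post:

    # Load every GCS object matching the wildcard in one bq load job.
    # mydataset.weblogs, the bucket path, and the schema are placeholders.
    # CSV with a '|' delimiter handles pipe-separated values.
    bq load \
      --source_format=CSV \
      --field_delimiter='|' \
      mydataset.weblogs \
      'gs://mybucket/logs/prefix*.txt' \
      'ts:TIMESTAMP,ip:STRING,path:STRING,status:INTEGER'

Quoting the gs:// URI keeps the shell from trying to expand the * itself; BigQuery expands the wildcard and treats all matched files as a single load job, so there's no need to enumerate URIs.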

answered 2013-05-23T01:03:52.927