I am using Kinesis Firehose to stream data into Redshift via S3. I have a very simple CSV file that looks like the samples below. Firehose delivers it to S3, but the Redshift load fails with a "Delimiter not found" error. I have gone through every post related to this error, and I made sure the delimiter is present.
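For context, Firehose does not let me write the COPY statement directly; it builds one from the delivery stream's COPY options and runs it against a manifest it writes to S3. As I understand it, the statement it issues is roughly of this shape (the bucket, manifest key, and role ARN below are placeholders, not my real values):

-- Sketch of the COPY that Firehose issues on my behalf
COPY stockvalue
FROM 's3://my-bucket/manifests/2017/03/17/some-manifest'  -- placeholder manifest key
CREDENTIALS 'aws_iam_role=arn:aws:iam::123456789012:role/firehose-redshift-role'  -- placeholder role
MANIFEST          -- the FROM target is a manifest, not the data file itself
DELIMITER ',';    -- my COPY options specify a comma delimiter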
File:
GOOG,2017-03-16T16:00:01Z,2017-03-17 06:23:56.986397,848.78
GOOG,2017-03-16T16:00:01Z,2017-03-17 06:24:02.061263,848.78
GOOG,2017-03-16T16:00:01Z,2017-03-17 06:24:07.143044,848.78
GOOG,2017-03-16T16:00:01Z,2017-03-17 06:24:12.217930,848.78
or
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:48:59.993260","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:07.034945","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:12.306484","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:18.020833","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:24.203464","852.12"
Redshift table:
CREATE TABLE stockvalue (
    symbol     VARCHAR(4),
    streamdate VARCHAR(20),
    writedate  VARCHAR(26),
    stockprice VARCHAR(6)
);
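To rule Firehose itself out, I would expect a manual COPY against one of the delivered files to reproduce the error. A minimal sketch, assuming a hypothetical object key and role (for the double-quoted variant, the quotes have to be stripped explicitly, e.g. with REMOVEQUOTES or by using CSV format):

-- Manual load of the unquoted file; object key and role ARN are placeholders
COPY stockvalue
FROM 's3://my-bucket/2017/03/17/stockvalue-sample.csv'
CREDENTIALS 'aws_iam_role=arn:aws:iam::123456789012:role/redshift-copy-role'
DELIMITER ','
REGION 'us-east-1';  -- placeholder; needed only if the bucket's region differs from the cluster's

-- For the double-quoted variant, add REMOVEQUOTES,
-- or replace DELIMITER ',' with CSV so COPY handles the quoting itself.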
Can someone point out what might be wrong with the file? There is a comma between every pair of fields. All columns in the target table are VARCHAR, so a data-type mismatch should not be the cause. The column lengths also match exactly between the file and the Redshift table. I have tried the fields both wrapped in double quotes and without them.
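In case it helps, the raw line Redshift rejected can be pulled from the stl_load_errors system table; this is the query I would use to see exactly which line and position trigger the error:

-- Most recent load errors, with the offending raw line and the reason
SELECT starttime, filename, line_number, raw_line, err_reason
FROM stl_load_errors
ORDER BY starttime DESC
LIMIT 10;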