
I am trying to use the ClickHouse Kafka engine to ingest data. The data is in CSV format. During ingestion I sometimes get the following exception:

2018.01.08 08:41:47.016826 [ 3499 ] <Debug> StorageKafka (consumer_queue): Started streaming to 1 attached views
2018.01.08 08:41:47.016906 [ 3499 ] <Trace> StorageKafka (consumer_queue): Creating formatted reader
2018.01.08 08:41:49.680816 [ 3499 ] <Error> void DB::StorageKafka::streamThread(): Code: 117, e.displayText() = DB::Exception: Expected end of line, e.what() = DB::Exception, Stack trace:

0. clickhouse-server(StackTrace::StackTrace()+0x16) [0x3221296]
1. clickhouse-server(DB::Exception::Exception(std::string const&, int)+0x1f) [0x144a02f]
2. clickhouse-server() [0x36e6ce1]
3. clickhouse-server(DB::CSVRowInputStream::read(DB::Block&)+0x1a0) [0x36e6f60]
4. clickhouse-server(DB::BlockInputStreamFromRowInputStream::readImpl()+0x64) [0x36e3454]
5. clickhouse-server(DB::IProfilingBlockInputStream::read()+0x16e) [0x2bcae0e]
6. clickhouse-server(DB::KafkaBlockInputStream::readImpl()+0x6c) [0x32f6e7c]
7. clickhouse-server(DB::IProfilingBlockInputStream::read()+0x16e) [0x2bcae0e]
8. clickhouse-server(DB::copyData(DB::IBlockInputStream&, DB::IBlockOutputStream&, std::atomic<bool>*)+0x55) [0x35b3e25]
9. clickhouse-server(DB::StorageKafka::streamToViews()+0x366) [0x32f54f6]
10. clickhouse-server(DB::StorageKafka::streamThread()+0x143) [0x32f58c3]
11. clickhouse-server() [0x40983df]
12. /lib/x86_64-linux-gnu/libpthread.so.0(+0x76ba) [0x7f4d115d06ba]
13. /lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7f4d10bf13dd]

Here are the tables:

CREATE TABLE test.consumer_queue (ID Int32,  DAY Date) ENGINE = Kafka('broker-ip:port', 'clickhouse-kyt-test','clickhouse-kyt-test-group', 'CSV')

CREATE TABLE test.consumer_request ( ID Int32,  DAY Date) ENGINE = MergeTree PARTITION BY DAY ORDER BY (DAY, ID) SETTINGS index_granularity = 8192

CREATE MATERIALIZED VIEW test.consumer_view TO test.consumer_request (ID Int32, DAY Date) AS SELECT ID, DAY FROM test.consumer_queue

CSV data:

10034,"2018-01-05"
10035,"2018-01-05"
10036,"2018-01-05"
10037,"2018-01-05"
10038,"2018-01-05"
10039,"2018-01-05"

ClickHouse server version: 1.1.54318.


2 Answers


ClickHouse appears to read a batch of messages from Kafka and then try to decode the whole batch as a single CSV document. The messages within that document must be separated by newlines, so every message needs to end with a newline character.

I am not sure whether this is a ClickHouse feature or a bug.

You can try sending just one message to Kafka and checking whether it shows up correctly in ClickHouse.

If you send messages to Kafka with the kafka-console-producer.sh script, note that this script (the ConsoleProducer.scala class) reads lines from a file and sends each line to the Kafka topic without a trailing newline, so such messages cannot be processed correctly.

If you send messages with your own script or application, you can modify it to append a newline to the end of each message; that should fix the problem. Alternatively, you can use another format for the Kafka engine, such as JSONEachRow.
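The newline fix above can be sketched as follows. This is a minimal illustration, not the asker's actual producer: the `frame_csv_row` helper is a name I made up, the broker and topic in the commented-out send are placeholders from the question, and the `KafkaProducer` usage assumes the kafka-python library.

```python
def frame_csv_row(row: str) -> bytes:
    """Append the trailing newline that ClickHouse expects between CSV rows."""
    return (row.rstrip("\n") + "\n").encode("utf-8")

# Sample rows from the question.
rows = ['10034,"2018-01-05"', '10035,"2018-01-05"']
payloads = [frame_csv_row(r) for r in rows]

# Concatenated the way ClickHouse sees a polled batch: now it is valid
# two-line CSV instead of two rows fused on one line.
batch = b"".join(payloads)
assert batch == b'10034,"2018-01-05"\n10035,"2018-01-05"\n'

# Actual send (commented out; requires a running broker and kafka-python):
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="broker-ip:port")
# for p in payloads:
#     producer.send("clickhouse-kyt-test", p)
# producer.flush()
```

Without the framing, the same batch would concatenate to `10034,"2018-01-05"10035,"2018-01-05"`, which is exactly the "Expected end of line" situation the CSV parser complains about.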

Answered 2018-01-11T13:39:31.470

Agreeing with @mikhail's answer. As a guess, try kafka_row_delimiter = '\n' in the Kafka engine SETTINGS.
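For reference, a sketch of the question's queue table rewritten with the named-settings form of the Kafka engine, where the delimiter can be set. Note this is an assumption about version support: the SETTINGS syntax and kafka_row_delimiter were added after 1.1.54318, so this may require upgrading the server.

```sql
-- Sketch only: broker/topic/group values are copied from the question.
CREATE TABLE test.consumer_queue (ID Int32, DAY Date)
ENGINE = Kafka
SETTINGS kafka_broker_list = 'broker-ip:port',
         kafka_topic_list = 'clickhouse-kyt-test',
         kafka_group_name = 'clickhouse-kyt-test-group',
         kafka_format = 'CSV',
         kafka_row_delimiter = '\n'
```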

Answered 2019-07-19T09:11:30.240