We have successfully set up the ELK stack in our production environment, and we can see the log output (the logs are unstructured) on our Kibana server.
Everything works fine for us. Our only concern is that each line written to a given log file becomes its own message in Kibana.
Question:
Is there a way to merge log lines by a maximum line count (in Filebeat, before they are sent to Logstash or Elasticsearch) so that each merged group is treated as 1 event in Elasticsearch/Kibana/Logstash?
Note: the log messages are unstructured (there is no specific regex pattern in them), so that cannot be used. I did try the max_lines approach, but the events in Kibana still showed single-line documents.
Example:
If my log file has entries like this
Sending ... 0 .. 2016-02-17 13:20:13 +0530
Sending ... 1 .. 2016-02-17 13:20:13 +0530
Sending ... 2 .. 2016-02-17 13:20:14 +0530
Sending ... 3 .. 2016-02-17 13:20:14 +0530
Sending ... 4 .. 2016-02-17 13:20:14 +0530
Sending ... 5 .. 2016-02-17 13:20:15 +0530
Sending ... 6 .. 2016-02-17 13:20:15 +0530
Sending ... 7 .. 2016-02-17 13:20:16 +0530
Sending ... 8 .. 2016-02-17 13:20:16 +0530
Sending ... 9 .. 2016-02-17 13:20:16 +0530
Sending ... 10 .. 2016-02-17 13:20:17 +0530
Sending ... 11 .. 2016-02-17 13:20:17 +0530
Sending ... 12 .. 2016-02-17 13:20:18 +0530
Sending ... 13 .. 2016-02-17 13:20:18 +0530
Sending ... 14 .. 2016-02-17 13:20:18 +0530
Sending ... 15 .. 2016-02-17 13:20:19 +0530
Sending ... 16 .. 2016-02-17 13:20:19 +0530
Sending ... 17 .. 2016-02-17 13:20:20 +0530
Sending ... 18 .. 2016-02-17 13:20:20 +0530
Sending ... 19 .. 2016-02-17 13:20:20 +0530
Sending ... 20 .. 2016-02-17 13:20:21 +0530
Sending ... 21 .. 2016-02-17 13:20:21 +0530
Sending ... 22 .. 2016-02-17 13:20:22 +0530
Sending ... 23 .. 2016-02-17 13:20:22 +0530
Sending ... 24 .. 2016-02-17 13:20:22 +0530
Sending ... 25 .. 2016-02-17 13:20:23 +0530
Sending ... 26 .. 2016-02-17 13:20:23 +0530
Sending ... 27 .. 2016-02-17 13:20:24 +0530
Sending ... 28 .. 2016-02-17 13:20:24 +0530
Sending ... 29 .. 2016-02-17 13:20:24 +0530
Sending ... 30 .. 2016-02-17 13:20:25 +0530
Sending ... 31 .. 2016-02-17 13:20:25 +0530
Sending ... 32 .. 2016-02-17 13:20:26 +0530
Sending ... 33 .. 2016-02-17 13:20:26 +0530
Sending ... 34 .. 2016-02-17 13:20:26 +0530
Sending ... 35 .. 2016-02-17 13:20:27 +0530
Sending ... 36 .. 2016-02-17 13:20:27 +0530
Sending ... 37 .. 2016-02-17 13:20:28 +0530
Sending ... 38 .. 2016-02-17 13:20:28 +0530
Sending ... 39 .. 2016-02-17 13:20:29 +0530
Sending ... 40 .. 2016-02-17 13:20:29 +0530
Sending ... 41 .. 2016-02-17 13:20:30 +0530
I would like Filebeat to group them (a better word: merge them)
(example: a configuration in Filebeat that would merge them),
so that the events finally sent to Logstash/Elasticsearch look like this:
Event 1 (with the message ..)
Sending ... 0 .. 2016-02-17 13:20:13 +0530
Sending ... 1 .. 2016-02-17 13:20:13 +0530
Sending ... 2 .. 2016-02-17 13:20:14 +0530
Sending ... 3 .. 2016-02-17 13:20:14 +0530
Sending ... 4 .. 2016-02-17 13:20:14 +0530
Sending ... 5 .. 2016-02-17 13:20:15 +0530
Sending ... 6 .. 2016-02-17 13:20:15 +0530
Sending ... 7 .. 2016-02-17 13:20:16 +0530
Sending ... 8 .. 2016-02-17 13:20:16 +0530
Sending ... 9 .. 2016-02-17 13:20:16 +0530
Sending ... 10 .. 2016-02-17 13:20:17 +0530
Sending ... 11 .. 2016-02-17 13:20:17 +0530
Sending ... 12 .. 2016-02-17 13:20:18 +0530
Sending ... 13 .. 2016-02-17 13:20:18 +0530
Sending ... 14 .. 2016-02-17 13:20:18 +0530
Sending ... 15 .. 2016-02-17 13:20:19 +0530
Event 2 (with the message ..)
Sending ... 16 .. 2016-02-17 13:20:19 +0530
Sending ... 17 .. 2016-02-17 13:20:20 +0530
Sending ... 18 .. 2016-02-17 13:20:20 +0530
Sending ... 19 .. 2016-02-17 13:20:20 +0530
Sending ... 20 .. 2016-02-17 13:20:21 +0530
Sending ... 21 .. 2016-02-17 13:20:21 +0530
Sending ... 22 .. 2016-02-17 13:20:22 +0530
Sending ... 23 .. 2016-02-17 13:20:22 +0530
Sending ... 24 .. 2016-02-17 13:20:22 +0530
Sending ... 25 .. 2016-02-17 13:20:23 +0530
Sending ... 26 .. 2016-02-17 13:20:23 +0530
Sending ... 27 .. 2016-02-17 13:20:24 +0530
Sending ... 28 .. 2016-02-17 13:20:24 +0530
Sending ... 29 .. 2016-02-17 13:20:24 +0530
Sending ... 30 .. 2016-02-17 13:20:25 +0530
Sending ... 31 .. 2016-02-17 13:20:25 +0530
Sending ... 32 .. 2016-02-17 13:20:26 +0530
and so on ...
But unfortunately, it simply creates one event per line. Please see the attached screenshot.
Here is what my Filebeat configuration looks like (again, regex cannot be used, since the logs are generally unstructured; the log shown above is just an example).
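For reference, my multiline attempt looked roughly like the sketch below (the path and values are placeholders, not my actual setup). In Filebeat 1.x, multiline grouping is configured per prospector; since `pattern` is mandatory, I used a pattern that matches any line together with `negate: false` and `match: after`, hoping consecutive lines would be glued together up to `max_lines`:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/app/app.log   # placeholder path, not my real one
      input_type: log
      multiline:
        pattern: '.'             # matches any non-empty line
        negate: false
        match: after             # append matching lines to the previous one
        max_lines: 16            # intended cap: 16 lines per merged event
        timeout: 5s              # flush a pending group after 5s of silence
output:
  logstash:
    hosts: ["localhost:5044"]
```

With this in place I still see one document per line in Kibana, so either my understanding of the multiline options is wrong or this use case is not supported in 1.1.0.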
Any ideas?
Note: Filebeat version 1.1.0