
I am using Logstash to output JSON messages to an API. I am reading the logs from a log file. My configuration works fine and sends all messages to the API. Here is a sample of the log file:

Log file:

2014 Jun 01 18:57:34:158 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300009 BW Plugins: version 5.10.0, build V48, 2012-6-3 
2014 Jun 01 18:57:34:162 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300010 XML Support: TIBCOXML Version 5.51.500.003 
2014 Jun 01 18:57:34:162 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300011 Java version: Java HotSpot(TM) Server VM 20.5-b03 
2014 Jun 01 18:57:34:162 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300012 OS version: i386 Linux 3.11.0-12-generic 
2014 Jun 01 18:57:41:018 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:41:027 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:41:408 GMT +5 BW.Customer_01_001_009-Process_Archive Info [BW-Core] BWENGINE-300013 Tibrv string encoding: ISO8859-1 
2014 Jun 01 18:57:42:408 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:42:408 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:42:555 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:42:555 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:42:557 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100118 
2014 Jun 01 18:57:42:557 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100206 
2014 Jun 01 18:57:42:595 GMT +5 BW.Customer_01_001_009-Process_Archive Warn [BW_Core]  Duplicate message map entry for BW-HTTP-100118 

I am using a grok pattern to parse this log file. Here is my sample configuration file:

Configuration file:

filter {
        if [type] == "bw5applog" {
        grok {
            match => [ "message", "(?<log_timestamp>%{YEAR}\s%{MONTH}\s%{MONTHDAY}\s%{TIME}:\d{3})\s(?<log_Timezone>%{DATA}\s%{DATA})\s(?<log_MessageTitle>%{DATA})(?<MessageType>%{LOGLEVEL})%{SPACE}\[%{DATA:ProcessName}\]%{SPACE}%{GREEDYDATA:Message}" ]
            add_tag => [ "grokked" ]        
        }
        mutate {
          gsub => [
             "TimeStamp", "\s", "T",
             "TimeStamp", ",", "."
           ]
        }
        if !( "_grokparsefailure" in [tags] ) {
            grok{
                    match => [ "message", "%{GREEDYDATA:StackTrace}" ]
                    add_tag => [ "grokked" ]    
                }
            date {
                    match => [ "timestamp", "yyyy MMM dd HH:mm:ss:SSS" ]
                    target => "TimeStamp"
                    timezone => "UTC"
                }
        }
     }
 }

I am able to parse the complete log entry according to my requirements, but I want to format the date.

Problem statement:

Currently, I get the date from the parsed log entry in the following format:

log_timestamp:  2014 May 28 12:07:35:927

But my API expects the date in the following format:

Expected output:

log_timestamp:  2014-05-28T12:07:35:927

How can I achieve this with the filter configuration above? I tried to do something with the configuration shown, but I was not successful.


1 Answer


You are applying the date filter to the wrong field. Instead of timestamp, you must apply it to the log_timestamp field, which contains the date you want to parse:

date {
        match => [ "log_timestamp", "yyyy MMM dd HH:mm:ss:SSS" ]
        target => "log_timestamp"
        timezone => "UTC"
}

Also, the mutate filter is useless, since it is applied to a field (TimeStamp) that does not exist.
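Putting it together, a corrected filter section might look like the sketch below (field names such as log_timestamp come from the grok pattern in the question; the grok match line is abbreviated here to the pattern already shown above). Note that when the date filter parses log_timestamp, Logstash stores it as a timestamp object that is rendered in ISO 8601 form with a dot before the milliseconds, e.g. 2014-05-28T12:07:35.927Z. If the API strictly requires a colon before the milliseconds, as in the expected output, an extra mutate/gsub step on the rendered string would still be needed.

```
filter {
  if [type] == "bw5applog" {
    grok {
      # same pattern as in the question, extracting log_timestamp etc.
      match => [ "message", "(?<log_timestamp>%{YEAR}\s%{MONTH}\s%{MONTHDAY}\s%{TIME}:\d{3})\s(?<log_Timezone>%{DATA}\s%{DATA})\s(?<log_MessageTitle>%{DATA})(?<MessageType>%{LOGLEVEL})%{SPACE}\[%{DATA:ProcessName}\]%{SPACE}%{GREEDYDATA:Message}" ]
      add_tag => [ "grokked" ]
    }
    if !( "_grokparsefailure" in [tags] ) {
      # Parse the field the grok filter actually produced,
      # not the non-existent "timestamp" field.
      date {
        match => [ "log_timestamp", "yyyy MMM dd HH:mm:ss:SSS" ]
        target => "log_timestamp"
        timezone => "UTC"
      }
    }
  }
}
```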

Answered 2016-06-17T14:16:57.453