I am using Logstash to collect logs from my ASA 5505, and I want to extract the source IP, destination IP, source port, and destination port so I can use them in Kibana. What should I write in the filter?
Here are some sample log messages:
<166>Aug 20 2014 05:51:34: %ASA-6-302014: Teardown TCP connection 8440 for inside:192.168.2.209/51483 to outside:104.16.13.8/80 duration 0:00:53 bytes 13984 TCP FINs
<166>Aug 20 2014 06:50:55: %ASA-6-305012: Teardown dynamic TCP translation from inside:192.168.2.209/33388 to outside:192.168.1.101/33388 duration 0:04:00
<167>Aug 20 2014 06:50:55: %ASA-7-609002: Teardown local-host outside:74.125.206.95 duration 0:04:00
<166>Aug 20 2014 06:50:55: %ASA-6-305012: Teardown dynamic TCP translation from inside:192.168.2.209/33390 to outside:192.168.1.101/33390 duration 0:04:00
<166>Aug 20 2014 06:50:54: %ASA-6-302014: Teardown TCP connection 10119 for inside:192.168.2.209/48466 to outside:173.194.66.84/443 duration 0:05:34 bytes 3160 TCP FINs
<167>Aug 20 2014 06:50:53: %ASA-7-710005: UDP request discarded from 192.168.1.199/3205 to outside:255.255.255.255/3206
Here is the filter currently in use:
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
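For the "Teardown TCP connection" lines above, I have been experimenting with a custom grok match like the following (just a sketch; the field names src_interface, src_ip, src_port, dst_interface, dst_ip, and dst_port are my own choice, and I have not verified it against all message variants):

grok {
  # Matches e.g. "%ASA-6-302014: Teardown TCP connection 8440 for
  # inside:192.168.2.209/51483 to outside:104.16.13.8/80 ..."
  match => { "message" => "%{CISCOTAG:ciscotag}: Teardown TCP connection %{INT:connection_id} for %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}" }
}

I have also read that Logstash ships with built-in Cisco firewall patterns (the CISCOFW* patterns in the firewalls pattern file) that are supposed to extract these fields for messages like 302014, but I am not sure how to wire them into my existing filter.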
Thanks.