I'm using PHP with Monolog. I write logs to a JSON file and also send them via GELF to Logstash, which then forwards them to Elasticsearch.
The problem I'm having is that the extra object is missing in Kibana, and the tags field is interpreted as a string instead of a nested object.
Any idea how to convince Logstash/Kibana to parse the inner JSON fields into fields/objects rather than leaving them as JSON strings?
This is what it looks like in Kibana:
{
  "_index": "logstash-2018.08.30",
  "_type": "doc",
  "_id": "TtHbiWUBc7g5w1yM8X6f",
  "_version": 1,
  "_score": null,
  "_source": {
    "ctxt_task": "taskName",
    "@version": "1",
    "http_method": "GET",
    "user_agent": "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:61.0) Gecko/20100101 Firefox/61.0",
    "level": 6,
    "message": "Finished task",
    "tags": "{\"hostname\":\"28571f0dc7e1\",\"region\":\"eu-west-1\",\"environment\":\"local\",\"processUniqueId\":\"5b87a4d843c20\"}",
    "url": "/assets/Logo.jpg",
    "ctxt_controller": "ControllerName",
    "memory_usage": "4 MB",
    "referrer": "https://local.project.net/account/login",
    "facility": "logger",
    "memory_peak_usage": "4 MB",
    "ctxt_timeElapsed": 0.05187487602233887,
    "@timestamp": "2018-08-30T08:03:37.386Z",
    "ip": "172.18.0.1",
    "ctxt_start": 1535616217.33417,
    "type": "gelf",
    "host": "18571f0dc7e9",
    "source_host": "172.18.0.8",
    "server": "local.project.net",
    "ctxt_end": 1535616217.386045,
    "version": "1.0"
  },
  "fields": {
    "@timestamp": [
      "2018-08-30T08:03:37.386Z"
    ]
  },
  "sort": [
    1535616217386
  ]
}
My log entries look like this:
{
  "message": "Finished task",
  "context": {
    "controller": "ControllerName",
    "task": "taskName",
    "timeElapsed": 0.02964186668395996,
    "start": 1535614742.840069,
    "end": 1535614742.869711,
    "content": ""
  },
  "level": 200,
  "level_name": "INFO",
  "channel": "logger",
  "datetime": {
    "date": "2018-08-30 08:39:02.869850",
    "timezone_type": 3,
    "timezone": "Europe/London"
  },
  "extra": {
    "memory_usage": "14 MB",
    "memory_peak_usage": "14 MB",
    "tags": {
      "hostname": "28571f0dc7e1",
      "region": "eu-west-1",
      "environment": "local",
      "processUniqueId": "5b879f16be3f1"
    }
  }
}
My Logstash conf:
input {
  tcp {
    port => 5000
  }
  gelf {
    port => 12201
    type => gelf
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => "172.17.0.1:9201"
  }
}
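Since the GELF handler delivers tags as a serialized JSON string, one way to get it parsed back into a nested object is Logstash's json filter. This is only a sketch under the assumption that the field is named tags, as in the Kibana document above; note that Logstash also uses a field called tags internally (e.g. it appends _jsonparsefailure there on parse errors), which is why the sketch writes to a separate target field:

```
filter {
  if [type] == "gelf" {
    json {
      # parse the JSON string stored in `tags` into a nested object
      source => "tags"
      target => "tags_parsed"
    }
  }
}
```

With target => "tags_parsed" the original string is kept and the parsed object lands in a new field; setting the target to "tags" would overwrite it in place, at the cost of colliding with Logstash's internal tags field.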
My Monolog setup:
$gelfTransport = new \Gelf\Transport\UdpTransport(LOG_GELF_HOST, LOG_GELF_PORT);
$gelfPublisher = new \Gelf\Publisher($gelfTransport);
$gelfHandler = new \Monolog\Handler\GelfHandler($gelfPublisher, static::$logVerbosity);
$gelfHandler->setFormatter(new \Monolog\Formatter\GelfMessageFormatter());
// This is to prevent application from failing if `GelfHandler` fails for some reason
$ignoreErrorHandlers = new \Monolog\Handler\WhatFailureGroupHandler([
    $gelfHandler
]);
$logger->pushHandler($ignoreErrorHandlers);
Edit: so far I have found that this is caused by GelfMessageFormatter converting the arrays to JSON:
$val = is_scalar($val) || null === $val ? $val : $this->toJson($val);
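To illustrate the effect of that line (a hypothetical stand-alone snippet, not Monolog's internal code path), a non-scalar value such as the tags array gets serialized into a JSON string, which is exactly the string that then shows up in Elasticsearch:

```php
<?php
// Hypothetical illustration: scalars and null pass through unchanged,
// while anything else (arrays, objects) is JSON-encoded into a string.
$tags = ['hostname' => '28571f0dc7e1', 'region' => 'eu-west-1'];

$val = is_scalar($tags) || null === $tags ? $tags : json_encode($tags);

// $val is now the string '{"hostname":"28571f0dc7e1","region":"eu-west-1"}',
// so Elasticsearch receives a string rather than a nested object.
var_dump($val);
```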
When netcat is used directly with nested JSON, e.g.:
echo -n '{
"field": 1,
"nestedField1": {"nf1": 1.1, "nf2": 1.2, "2nestedfield":{"2nf1":1.11, "2nf2":1.12}}
}' | gzip -c | nc -u -w1 bomcheck-logstash 12201
then the data looks fine in Kibana.