This is my JSON log file. I am trying to ship it into my Elasticsearch through Logstash.
{"message":"IM: Orchestration","level":"info"}
{"message":"Investment Management","level":"info"}
This is my filebeat.yml:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:/Development_Avecto/test-log/tn-logs/im.log
  json.keys_under_root: true
  json.add_error_key: true

processors:
- decode_json_fields:
    fields: ["message"]

output.logstash:
  hosts: ["localhost:5044"]
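One thing I am unsure about is whether the JSON should be parsed in Filebeat (json.keys_under_root / decode_json_fields) or in Logstash (the json filter below), since both are currently enabled. A minimal filebeat.yml sketch that leaves all JSON parsing to the Logstash json filter would look like this (my assumption, not a confirmed fix):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:/Development_Avecto/test-log/tn-logs/im.log

output.logstash:
  hosts: ["localhost:5044"]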
This is my Logstash pipeline config:

input {
  beats {
    port => "5044"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "data"
  }
}
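To see what actually arrives from Beats, a temporary debugging variant of the output section I could use is the following (it only adds the standard stdout output with the rubydebug codec; the elasticsearch part is unchanged):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "data"
  }
  stdout { codec => rubydebug }
}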
I cannot see the data in Elasticsearch and cannot find any error. Filebeat log:
2019-06-18T11:30:03.448+0530 INFO registrar/registrar.go:134 Loading registrar data from D:\Development_Avecto\filebeat-6.6.2-windows-x86_64\data\registry
2019-06-18T11:30:03.448+0530 INFO registrar/registrar.go:141 States Loaded from registrar: 10
2019-06-18T11:30:03.448+0530 WARN beater/filebeat.go:367 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2019-06-18T11:30:03.448+0530 INFO crawler/crawler.go:72 Loading Inputs: 1
2019-06-18T11:30:03.448+0530 INFO log/input.go:138 Configured paths: [D:\Development_Avecto\test-log\tn-logs\im.log]
2019-06-18T11:30:03.448+0530 INFO input/input.go:114 Starting input of type: log; ID: 16965758110699470044
2019-06-18T11:30:03.449+0530 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2019-06-18T11:30:34.842+0530 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":312,"time":{"ms":312}},"total":{"ticks":390,"time":{"ms":390},"value":390},"user":{"ticks":78,"time":{"ms":78}}},"handles":{"open":213},"info":{"ephemeral_id":"66983518-39e6-461c-886d-a1f99da6631d","uptime":{"ms":30522}},"memstats":{"gc_next":
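For checking on the Elasticsearch side, these are the standard REST calls I would run against localhost:9200 (using the index name "data" from my Logstash output; shown only as a sketch of how I verify, not output I already have):

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/data/_search?pretty"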