
I have a K8s deployment that runs my application's container (an ASP.NET Core app) alongside a sidecar for logging. I use Serilog's console sink to log to standard output, and I have a ConfigMap that holds the fluentd configuration. A very good article on this part is this

What I want to do is forward the application's logs to Elasticsearch using the sidecar. I use the corresponding output plugin for that. But what should go in the source and filter tags to make this work?
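For example, should the source instead be a tail input reading the app's log file from a volume shared with the app container, something like this (the path, pos_file, and tag here are placeholders, not my actual setup)?

      # hypothetical tail-based source: reads the app's log file
      # from an emptyDir volume shared between app and sidecar
      <source>
        @type tail
        path /var/log/app/*.log
        pos_file /var/log/app/fluentd.pos
        tag app.logs
        <parse>
          @type json
        </parse>
      </source>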

The fluentd configuration is as follows:

      <source>
        @type forward
        port 24224
        bind 0.0.0.0
      </source>
        
      <filter *.**>
        @type parser
        key_name log
        format json
        reserve_data true
      </filter>
        
      <match *.**>
        @type copy
        <store>
        @type elasticsearch
        host 192.168.1.41
        port 9200
        logstash_format true
        logstash_prefix fluentd
        logstash_dateformat %Y%m%d
        include_tag_key true
        user elastic
        index_name "ap-*"
        password xxxxxxxxx
        type_name access_log
        tag_key @log_name
        flush_interval 1s
        </store>
        <store>
        @type stdout
        </store>
      </match>

With this configuration fluentd runs, but no messages are forwarded to Elasticsearch.
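As a sanity check, a stdout filter placed before the match can show whether any events arrive on the forward source at all (debug-only, to be removed afterwards):

      # debug-only: print every incoming event to the sidecar's stdout
      # before it reaches the parser filter and the match block
      <filter **>
        @type stdout
      </filter>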

Here is the fluentd log:

2021-04-08 11:37:25 +0000 [info]: parsing config file is succeeded path="/fluentd/etc/fluent.conf"
2021-04-08 11:37:25 +0000 [info]: gem 'fluent-plugin-elasticsearch' version '5.0.2'
2021-04-08 11:37:25 +0000 [info]: gem 'fluentd' version '1.12.2'
2021-04-08 11:37:25 +0000 [info]: 'flush_interval' is configured at out side of <buffer>. 'flush_mode' is set to 'interval' to keep existing behaviour
2021-04-08 11:37:25 +0000 [info]: using configuration file: <ROOT>
  <source>
    @type forward
    port 24224
    bind "0.0.0.0"
  </source>
  <filter *.**>
    @type parser
    key_name "log"
    format json
    reserve_data true
    <parse>
      @type json
    </parse>
  </filter>
  <match *.**>
    @type copy
    <store>
      @type "elasticsearch"
      host "192.168.1.41"
      port 9200
      logstash_format true
      logstash_prefix "fluentd"
      logstash_dateformat "%Y%m%d"
      include_tag_key true
      user "elastic"
      index_name "catalogapi-*"
      password xxxxxx
      type_name "access_log"
      tag_key "@log_name"
      flush_interval 1s
      <buffer>
        flush_interval 1s
      </buffer>
    </store>
    <store>
      @type "stdout"
    </store>
  </match>
</ROOT>
2021-04-08 11:37:25 +0000 [info]: starting fluentd-1.12.2 pid=7 ruby="2.6.6"
2021-04-08 11:37:25 +0000 [info]: spawn command to main:  cmdline=["/usr/local/bin/ruby", "-Eascii-8bit:ascii-8bit", "/usr/local/bundle/bin/fluentd", "-c", "/fluentd/etc/fluent.conf", "-p", "/fluentd/plugins", "--under-supervisor"]
2021-04-08 11:37:26 +0000 [info]: adding filter pattern="*.**" type="parser"
2021-04-08 11:37:26 +0000 [info]: adding match pattern="*.**" type="copy"
2021-04-08 11:37:26 +0000 [info]: #0 'flush_interval' is configured at out side of <buffer>. 'flush_mode' is set to 'interval' to keep existing behaviour
2021-04-08 11:37:26 +0000 [warn]: #0 Detected ES 7.x: `_doc` will be used as the document `_type`.
2021-04-08 11:37:26 +0000 [info]: adding source type="forward"
2021-04-08 11:37:26 +0000 [info]: #0 starting fluentd worker pid=16 ppid=7 worker=0
2021-04-08 11:37:26 +0000 [info]: #0 listening port port=24224 bind="0.0.0.0"

I have tried many different configurations, but nothing has worked.

Thanks in advance for your time.
