0

Is there a better solution for shipping AWS CloudTrail logs to Kibana? I am using the AWS Elasticsearch Service here.


2 Answers

1

Here is the Logstash input I use on 1.4.2. It works well, although I suspect it is fairly noisy (it makes a lot of S3 GET/HEAD/LIST requests).

input {
  s3 {
    bucket => "bucketname"
    delete => false
    interval => 60 # seconds
    prefix => "cloudtrail/"
    type => "cloudtrail"
    codec => "cloudtrail"
    credentials => "/etc/logstash/s3_credentials.ini"
    sincedb_path => "/opt/logstash_cloudtrail/sincedb"
  }
}

filter {
  if [type] == "cloudtrail" {
    mutate {
      gsub => [ "eventSource", "\.amazonaws\.com$", "" ]
      add_field => {
        "document_id" => "%{eventID}"
      }
    }
    if ! [ingest_time] {
      ruby {
        code => "event['ingest_time'] = Time.now.utc.strftime '%FT%TZ'"
      }
    }
    ruby {
      code => "event.cancel if (Time.now.to_f - event['@timestamp'].to_f) > (60 * 60 * 24 * 1)"
    }
    ruby { 
      code => "event['ingest_delay_hours'] = (Time.now.to_f - event['@timestamp'].to_f) / 3600" 
    }

    # drop events more than a day old, we're probably catching up very poorly
    if [ingest_delay_hours] > 24 {
      drop {}
    }

    # example of an event that is noisy and I don't care about
    if [eventSource] == "elasticloadbalancing" and [eventName] == "describeInstanceHealth" and [userIdentity.userName] == "deploy-s3" {
      drop {}
    }
  }
}
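
The answer stops after the filter stage; to get the documents into the AWS Elasticsearch domain that Kibana reads from, an output block is still needed. Below is a minimal sketch (not part of the original answer), assuming a newer Logstash elasticsearch output plugin (2.x or later): the endpoint and index name are placeholders, and document_id reuses the field added by the mutate above so re-reading the same S3 objects overwrites documents rather than duplicating them.

output {
  if [type] == "cloudtrail" {
    elasticsearch {
      # placeholder endpoint for the AWS Elasticsearch domain (assumption)
      hosts => ["https://your-es-domain.us-east-1.es.amazonaws.com:443"]
      # daily indices keep a time-based Kibana index pattern simple
      index => "cloudtrail-%{+YYYY.MM.dd}"
      # use the CloudTrail eventID as the document id for idempotent writes
      document_id => "%{document_id}"
    }
  }
}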

The credentials.ini format is described on the s3 input plugin page; it is simply:

AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=

I also have a search that sends results to our #chatops, but I haven't included it here.

Answered 2015-11-21T20:13:46.653
1

If you haven't tried it yet, you can use CloudTrail together with CloudWatch Logs: deliver the trail to a CloudWatch Logs log group, then create a subscription on that log group to stream the CloudTrail data to Elasticsearch.

Once that is done, you should be able to define a time-based Kibana index pattern beginning with cwl*.
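
Roughly the same wiring can also be scripted. Here is a sketch using boto3 (not from the original answer): every name and ARN is a placeholder, and it assumes the LogsToElasticsearch forwarding Lambda that the CloudWatch Logs console creates for its "Stream to Amazon Elasticsearch Service" action already exists and is allowed to be invoked by CloudWatch Logs.

import boto3

# Placeholders: trail name, log group, and ARNs are illustrative only.
TRAIL_NAME = "my-trail"
LOG_GROUP = "CloudTrail/DefaultLogGroup"
LOG_GROUP_ARN = "arn:aws:logs:us-east-1:123456789012:log-group:CloudTrail/DefaultLogGroup:*"
CWL_ROLE_ARN = "arn:aws:iam::123456789012:role/CloudTrail_CloudWatchLogs_Role"
# Lambda created by the CloudWatch Logs "Stream to Amazon Elasticsearch Service" action
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:LogsToElasticsearch_my-domain"

# 1. Point the trail at a CloudWatch Logs log group.
boto3.client("cloudtrail").update_trail(
    Name=TRAIL_NAME,
    CloudWatchLogsLogGroupArn=LOG_GROUP_ARN,
    CloudWatchLogsRoleArn=CWL_ROLE_ARN,
)

# 2. Subscribe the log group to the forwarding Lambda (empty pattern = all events).
boto3.client("logs").put_subscription_filter(
    logGroupName=LOG_GROUP,
    filterName="cloudtrail-to-es",
    filterPattern="",
    destinationArn=LAMBDA_ARN,
)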

Cheers -

Answered 2015-12-20T03:26:11.893