
I have the following configuration file for Logstash:

input {
    file {
        path => "/home/elk/data/visits.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"   # "NUL" is the Windows null device; on Linux use /dev/null
    }
}

filter {
    csv {
        separator => ","
        columns => ["estado","tiempo_demora","poblacion","id_poblacion","edad_valor","cp","latitude_corregida","longitud_corregida","patologia","Fecha","id_tipo","id_personal","nasistencias","menor","Geopoint_corregido"]
    }
    date {
        match => ["Fecha","dd-MM-YYYY HH:mm"]
        target => "Fecha"
    }
    mutate {
        # Hash syntax replaces the deprecated array syntax; a single convert
        # hash also avoids repeating the option, where the second setting
        # would override the first.
        convert => {
            "nasistencias" => "integer"
            "id_poblacion" => "integer"
            "id_personal" => "integer"
            "id_tipo" => "integer"
            "cp" => "integer"
            "edad_valor" => "integer"
            "longitud_corregida" => "float"
            "latitude_corregida" => "float"
        }
    }
    mutate {
      rename => {
          "longitud_corregida" => "[location][lon]"
          "latitude_corregida" => "[location][lat]"
      }
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "medicalvisits-%{+dd.MM.YYYY}"
    }
    stdout {
        codec => rubydebug   # a plugin accepts only one codec; the second setting would override the first
    }
}

From there, Fecha should already be sent to Elasticsearch as a date, but in Kibana, when I try to set it as the timestamp, it does not appear as an option; instead it shows up as a string.

Any idea what I'm doing wrong here?


1 Answer


The type in the index pattern differs from the type in the index template (the information actually stored).
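To confirm the mismatch, you can ask Elasticsearch for the type it actually stored for the field, using the field mapping API (a sketch; `nombre_de_tu_indice` is a placeholder for your index name):

    GET /nombre_de_tu_indice/_mapping/field/Fecha

If the response shows `"type": "text"` or `"type": "keyword"` instead of `"type": "date"`, the field was indexed as a string and Kibana cannot offer it as a time field.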

My suggestion is that you overwrite the timestamp with the information your Logstash sends. After all, in most cases what matters to you is the timestamp of the event itself, not the timestamp of when the event was sent to your Elasticsearch.

That said, why don't you save "Fecha" directly into "@timestamp" via the "date" filter in Logstash? Like this:

    date {
        match => ["Fecha","dd-MM-YYYY HH:mm"]
        target => "@timestamp"
        tag_on_failure => ["fallo_filtro_fecha"]
    }
If you really need "Fecha" alongside @timestamp (not the best idea), with Fecha typed as "date", another option is to modify the index mapping to change that field's type to date. Like this (adjust as needed):

PUT /nombre_de_tu_indice/_mapping
{
  "properties": {
    "Fecha": {
      "type": "date"
    }
  }
}

Of course, this change only affects newly indexed documents or reindexed indices.
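To apply the new mapping to data that is already stored, the documents have to be copied into an index created with the corrected mapping. A sketch using the Reindex API (both index names are placeholders; create the destination index with the date mapping first):

    POST _reindex
    {
      "source": { "index": "nombre_de_tu_indice" },
      "dest":   { "index": "nombre_de_tu_indice_v2" }
    }

Afterwards you can point your index pattern (or an alias) at the new index.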

Answered 2020-01-30T18:28:54.600