I have a Logstash configuration like this:
input {
  file {
    path => ["/home/csdata/*.data"]
    codec => json {
    }
    start_position => "beginning"
    discover_interval => 5
  }
}
output {
  if [_up] == 1 {
    elasticsearch {
      protocol => "http"
      host => "[myelasticsearchip]"
      cluster => "clustername"
      flush_size => 50
      index => "%{_index}"
      action => "update"
      document_id => "%{_id}"
      index_type => "%{_type}"
    }
  } else if [_id] != "" {
    elasticsearch {
      protocol => "http"
      host => "[myelasticsearchip]"
      cluster => "clustername"
      flush_size => 50
      index => "%{_index}"
      document_id => "%{_id}"
      index_type => "%{_type}"
    }
  } else {
    elasticsearch {
      protocol => "http"
      host => "[myelasticsearchip]"
      cluster => "clustername"
      index => "%{_index}"
      flush_size => 50
      index_type => "%{_type}"
    }
  }
}
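For context, each line in the *.data files is a JSON document that carries its own routing fields. A hypothetical example (the values are made up; only the field names match the config above):

    {"_index": "myindex", "_type": "mytype", "_id": "abc123", "_up": 1, "somefield": "new value"}

When `_up` is 1 the event should be applied as an update to the document with that `_id`; when `_up` is absent but `_id` is set, it should be indexed (created) with that `_id`.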
I'm getting a ton of these errors:
failed action with response of 404, dropping action:
The data should arrive in order within the same file, so a document should always be created before it is updated. This doesn't happen for every item, but it does happen for a lot of them, and I'd like to get rid of these errors entirely.

Could this be caused by the separate flush_size buffers, even though the items are ordered in the original file, which means the INSERT always comes before the UPDATE?
Any ideas would be greatly appreciated!