
I have two Docker containers: one running Logstash, and another running Zookeeper and Kafka. I am trying to send data from Logstash to Kafka, but I can't seem to get any data through to my topic in Kafka.

I can log in to the Kafka Docker container, produce a message to my topic from the terminal, and then consume it as well.

I am using the kafka output plugin:

output {
    kafka {
        topic_id => "MyTopicName"
        broker_list => "kafkaIPAddress:9092"
    }
}

I got the IP address by running docker inspect kafka2.

When I run ./bin/logstash agent --config /etc/logstash/conf.d/01-input.conf I get this error:

Settings: Default pipeline workers: 4
Unknown setting 'broker_list' for kafka {:level=>:error}
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/config/mixin.rb:134:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/outputs/base.rb:63:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}

I checked the configuration file by running the following command, which returned OK.

 ./bin/logstash agent --configtest --config /etc/logstash/conf.d/01-input.conf
Configuration OK

Has anyone run into this before? Could it be that I haven't opened the port on the Kafka container, and if so, how can I do that while keeping Kafka running?


1 Answer


The error is here: broker_list => "kafkaIPAddress:9092"

Try bootstrap_servers => "KafkaIPAddress:9092" instead. If the containers are on different machines, map Kafka to the host port 9092 and use the host address:port; if they are on the same host, use the internal Docker IP:port.
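With that rename applied, the output block from the question would look like this (topic name and broker address are the asker's own placeholders, so I've kept them as-is):

    output {
        kafka {
            topic_id => "MyTopicName"
            bootstrap_servers => "kafkaIPAddress:9092"
        }
    }

broker_list was the setting name used by older versions of the kafka output plugin; the plugin version bundled with Logstash 2.3 expects bootstrap_servers, which is why --configtest alone didn't catch it until the pipeline tried to register the output.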
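On the port question: Docker cannot add a port mapping to an already-running container, so if the port turns out not to be published you will need to recreate the container with the mapping declared. A sketch of what that could look like in a docker-compose file (the service and image names here are assumptions, not from the question):

    # hypothetical docker-compose.yml entry; service and image names are examples
    kafka2:
      image: some/kafka-image    # whatever image the container was started from
      ports:
        - "9092:9092"            # publish the broker port on the host

This implies a brief restart of the Kafka container; if the Logstash container is on the same Docker host and network, connecting to the internal Docker IP:9092 avoids needing the host mapping at all.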

answered 2016-08-19T10:04:27.297