Logstash 5.6.1 Kafka Plugin Configuration

output {
    elasticsearch {
        hosts => ["host1:9200","host2:9200","host3:9200"]
        index => "logstash-%{type}-%{+YYYY.MM.dd}"      # daily index named from the event's type field
        document_type => "%{type}"
        workers => 1
        flush_size => 10                                # flush a bulk request after 10 events
        idle_flush_time => 10                           # or after 10 seconds of inactivity
        template_overwrite => true
    }

    kafka {
        codec => plain {
            format => "%{message}"                      # send only the raw message field, not the full event JSON
        }
        topic_id => "mytopic"
        bootstrap_servers => "kafka1:9092,kafka2:9092"
        batch_size => "16384"                           # producer batch size in bytes
        compression_type => "gzip"
        linger_ms => "1000"                             # wait up to 1 s for a batch to fill
        reconnect_backoff_ms => "10000"
        buffer_memory => "33554432"                     # 32 MB producer buffer
        key_serializer => "org.apache.kafka.common.serialization.IntegerSerializer"
        value_serializer => "org.apache.kafka.common.serialization.StringSerializer"
    }
}

I looked around online, but some of the configurations I found were for very early versions and no longer work with the latest release, so I read the official documentation and put together an up-to-date configuration.

This setup is based on Kafka 0.11, Kafka client 0.10.2, Elasticsearch 5.6.1, and Logstash 5.6.1, and has been running normally.
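
For completeness, here is a minimal consumer-side sketch (not part of the original setup) showing how the same topic could be read back with the kafka input plugin bundled with Logstash 5.6; the group_id and auto_offset_reset values are illustrative assumptions:

input {
    kafka {
        bootstrap_servers => "kafka1:9092,kafka2:9092"
        topics => ["mytopic"]                       # the topic written by the output above
        group_id => "logstash-consumer"             # hypothetical consumer group name
        auto_offset_reset => "earliest"             # assumption: read from the beginning on first run
        codec => plain { }
    }
}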

posted @ 2017-10-27 10:29  yuan.net