Kafka Ecosystem

http://kafka.apache.org/documentation/#ecosystem

https://cwiki.apache.org/confluence/display/KAFKA/Ecosystem


Here is a list of tools we have been told about that integrate with Kafka outside the main distribution. We haven't tried them all, so they may not work!

Clients, of course, are listed separately, on the Clients page of the Kafka wiki.

Kafka Connect

Kafka has a built-in framework called Kafka Connect for writing sources and sinks that either continuously ingest data from external systems into Kafka or continuously export data from Kafka into external systems. Connectors for specific applications and data systems are developed and maintained separately from the main code base. You can find a list of available connectors at the Kafka Connect Hub.
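
To make the source half of the framework concrete, below is a minimal sketch of a Connect source task, assuming the org.apache.kafka connect-api dependency is on the classpath. The class name, the "topic" config key, and the placeholder payload are hypothetical; a real connector also needs a companion SourceConnector class and is run through the Connect runtime (standalone or distributed).

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.source.SourceRecord;
    import org.apache.kafka.connect.source.SourceTask;

    // Hypothetical source task that emits one record per poll from an imaginary external system.
    public class ExampleSourceTask extends SourceTask {

        private String topic;

        @Override
        public String version() {
            return "0.0.1";
        }

        @Override
        public void start(Map<String, String> props) {
            // "topic" is a hypothetical config key passed down by the companion SourceConnector.
            topic = props.get("topic");
        }

        @Override
        public List<SourceRecord> poll() throws InterruptedException {
            Thread.sleep(1000); // back off so the loop does not busy-wait
            String value = "row fetched from the external system"; // placeholder payload
            SourceRecord record = new SourceRecord(
                    Collections.singletonMap("source", "example"), // source partition
                    Collections.singletonMap("position", 0L),      // source offset, used for resuming
                    topic,
                    Schema.STRING_SCHEMA,
                    value);
            return Collections.singletonList(record);
        }

        @Override
        public void stop() {
            // release any connections to the external system here
        }
    }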

Distributions & Packaging

Stream Processing

Hadoop Integration

  • Confluent HDFS Connector - A sink connector for the Kafka Connect framework for writing data from Kafka to Hadoop HDFS (a generic sink-task skeleton follows this list).
  • Camus - LinkedIn's Kafka=>HDFS pipeline. This one is used for all data at LinkedIn, and works great.
  • Kafka Hadoop Loader - A different take on Hadoop loading functionality from what is included in the main distribution.
  • Flume - Contains Kafka source (consumer) and sink (producer)
  • KaBoom - A high-performance HDFS data loader
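
As noted in the Confluent HDFS Connector entry above, sink connectors implement the Connect sink API. Below is a minimal, hypothetical SinkTask skeleton showing that API, assuming the connect-api dependency; it is not the HDFS connector's actual code, which additionally batches records into HDFS files and tracks offsets.

    import java.util.Collection;
    import java.util.Map;

    import org.apache.kafka.connect.sink.SinkRecord;
    import org.apache.kafka.connect.sink.SinkTask;

    // Hypothetical sink task; a real connector such as the HDFS one would write
    // records to files instead of just logging them.
    public class ExampleSinkTask extends SinkTask {

        @Override
        public String version() {
            return "0.0.1";
        }

        @Override
        public void start(Map<String, String> props) {
            // open writers / connections to the external system here
        }

        @Override
        public void put(Collection<SinkRecord> records) {
            for (SinkRecord record : records) {
                // A real task would append the record to HDFS (or another store); this sketch only logs it.
                System.out.printf("would write %s-%d@%d%n",
                        record.topic(), record.kafkaPartition(), record.kafkaOffset());
            }
        }

        @Override
        public void stop() {
            // close writers / connections here
        }
    }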

Database Integration

Search and Query

  • ElasticSearch - Kafka Standalone Consumer, a project that reads messages from Kafka, processes them, and indexes them in Elasticsearch (a sketch of this consume-and-index loop follows this list). There are also several Kafka Connect connectors for Elasticsearch.
  • Presto - The Presto Kafka connector allows you to query Kafka in SQL using Presto.
  • Hive - Hive SerDe that allows querying Kafka (Avro only for now) using Hive SQL
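
The consume-and-index pattern used by such standalone indexers is essentially a Kafka consumer poll loop. Here is a minimal sketch in Java, assuming the kafka-clients library; the broker address, group id, and topic name are placeholders, and the actual Elasticsearch indexing call is left as a comment rather than tied to a specific client API.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class IndexingConsumerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical broker
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "es-indexer");              // hypothetical group id
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("events")); // hypothetical topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        // A real indexer would transform the value and bulk-index it into
                        // Elasticsearch here; this sketch only logs it.
                        System.out.printf("would index offset=%d value=%s%n", record.offset(), record.value());
                    }
                }
            }
        }
    }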

Management Consoles

  • Kafka Manager - A tool for managing Apache Kafka.
  • kafkat - Simplified command-line administration for Kafka brokers.
  • Kafka Web Console - Displays information about your Kafka cluster including which nodes are up and what topics they host data for.
  • Kafka Offset Monitor - Displays the state of all consumers and how far behind the head of the stream they are (a lag-computation sketch follows this list).
  • Capillary - Displays the state and deltas of Kafka-based Apache Storm topologies. Supports Kafka >= 0.8. It also provides an API for fetching this information for monitoring purposes.
  • Doctor Kafka - Service for cluster auto healing and workload balancing.
  • Cruise Control - Fully automates dynamic workload rebalancing and self-healing of a Kafka cluster.
  • Burrow - Monitoring companion that provides consumer lag checking as a service without the need for specifying thresholds.
  • Chaperone - An audit system that monitors the completeness and latency of data streams.
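
Consumer lag, which several of the tools above report, is simply the log-end offset minus the group's last committed offset, per partition. Below is a minimal sketch of that calculation, assuming a kafka-clients version of 2.0 or later and hypothetical broker and group names.

    import java.util.Map;
    import java.util.Properties;
    import java.util.concurrent.ExecutionException;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class ConsumerLagSketch {
        public static void main(String[] args) throws ExecutionException, InterruptedException {
            String bootstrap = "localhost:9092";    // hypothetical broker address
            String groupId = "my-consumer-group";   // hypothetical consumer group

            Properties adminProps = new Properties();
            adminProps.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);

            Properties consumerProps = new Properties();
            consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
            consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
            consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());

            try (AdminClient admin = AdminClient.create(adminProps);
                 KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(consumerProps)) {

                // Offsets the group has committed, per partition.
                Map<TopicPartition, OffsetAndMetadata> committed =
                        admin.listConsumerGroupOffsets(groupId).partitionsToOffsetAndMetadata().get();

                // Log-end offsets ("head of the stream") for the same partitions.
                Map<TopicPartition, Long> endOffsets = consumer.endOffsets(committed.keySet());

                for (Map.Entry<TopicPartition, OffsetAndMetadata> entry : committed.entrySet()) {
                    if (entry.getValue() == null) continue; // partition with no committed offset
                    long lag = endOffsets.get(entry.getKey()) - entry.getValue().offset();
                    System.out.printf("%s lag=%d%n", entry.getKey(), lag);
                }
            }
        }
    }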

AWS Integration

Logging

Flume - Kafka plugins

Metrics

Packaging and Deployment

Kafka Camel Integration

Misc.
