Using Kafka Connect

 

Where to find connectors

https://www.confluent.io/hub

 

Installation and startup

https://docs.confluent.io/current/connect/userguide.html#connect-userguide-distributed-config

 

https://docs.confluent.io/current/connect/managing/install.html#install-connectors
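The pages above cover the details; as a rough sketch (assuming an Apache Kafka distribution with the stock scripts under bin/, and the Confluent Hub client for the connector install — paths here are illustrative):

```shell
# Standalone mode: a single worker, offsets kept in a local file;
# connectors are configured via properties files passed on the command line
bin/connect-standalone.sh config/connect-standalone.properties config/my-connector.properties

# Distributed mode: workers coordinate through Kafka itself;
# connectors are then created via the REST API, not a properties file
bin/connect-distributed.sh config/connect-distributed.properties

# Installing a connector from Confluent Hub (Confluent Platform's CLI)
confluent-hub install confluentinc/kafka-connect-hdfs:latest
```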

Managing Connect via the REST API

Reference: https://docs.confluent.io/current/connect/references/restapi.html#connect-userguide-rest
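The reference above lists every endpoint. A minimal sketch in Python (standard library only; it assumes the worker's REST interface is on the default port 8083, and `my-hdfs-sink` is a hypothetical connector name):

```python
import json
import urllib.request

CONNECT_URL = "http://localhost:8083"  # default Connect REST port (assumption)


def connect_request(method, path, body=None):
    """Build a request against the Connect REST API."""
    data = json.dumps(body).encode("utf-8") if body is not None else None
    return urllib.request.Request(
        CONNECT_URL + path,
        data=data,
        headers={"Content-Type": "application/json"},
        method=method,
    )


# Typical operations (send each with urllib.request.urlopen(req)):
list_req = connect_request("GET", "/connectors")                        # list connectors
status_req = connect_request("GET", "/connectors/my-hdfs-sink/status")  # one connector's status
delete_req = connect_request("DELETE", "/connectors/my-hdfs-sink")      # remove it

# Creating a connector POSTs {"name": ..., "config": {...}}:
create_req = connect_request("POST", "/connectors", body={
    "name": "my-hdfs-sink",
    "config": {
        "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
        "topics": "test_hdfs",
        "tasks.max": "1",
    },
})
```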

Problems encountered

 

1. On startup, the log keeps printing the following:

 Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.

Fix:

The Connect config files set bootstrap.servers=localhost:9092, which did not match what was configured in Kafka's server.properties. Make the two consistent.
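For example, if the broker advertises itself on a non-default host/port, the worker config (connect-distributed.properties or connect-standalone.properties) has to point at that same address (`kafka-host:9092` here is a placeholder):

```
# connect-distributed.properties — must match the broker's
# listeners/advertised.listeners in server.properties
bootstrap.servers=kafka-host:9092
```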

 

2. Couldn't start HdfsSinkConnector due to configuration error.

 

Caused by: org.apache.kafka.common.config.ConfigException: Invalid value  for configuration locale: Locale cannot be empty

 

Fix:

A parameter is missing. Checking the relevant source code and the configurable properties confirms it is locale; for China, set locale=zh_CN. This parameter is used for file partitioning.

 

 

Invalid value  for configuration timezone: Timezone cannot be empty

 

Fix: set timezone=Asia/Shanghai.
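Both errors come from the connector's time-based partitioning settings, which sit alongside the partitioner class in the sink config. A fragment (the TimeBasedPartitioner values here are illustrative, not from the original post):

```
partitioner.class=io.confluent.connect.storage.partitioner.TimeBasedPartitioner
partition.duration.ms=3600000
path.format='year'=YYYY/'month'=MM/'day'=dd/'hour'=HH
locale=zh_CN
timezone=Asia/Shanghai
```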

 

3. JsonConverter with schemas.enable requires "schema" and "payload" fields and may not contain additional fields. If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration.

 

Fix:

The JSON messages must follow a specific envelope format; see https://cwiki.apache.org/confluence/display/KAFKA/KIP-301%3A+Schema+Inferencing+for+JsonConverter and https://github.com/confluentinc/kafka-connect-jdbc/issues/574

An example taken from the source code:

{"schema":{"type":"struct","fields":[{"type":"boolean","optional":true,"field":"booleanField"},{"type":"int32","optional":true,"field":"intField"},{"type":"int64","optional":true,"field":"longField"},{"type":"string","optional":false,"field":"stringField"}]},"payload":{"booleanField":true,"intField":88,"longField":32,"stringField":"str"}}
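With schemas.enable=true, every message must carry that envelope: a "schema" describing the types plus the actual "payload". A small Python sketch that builds the same envelope programmatically (field names mirror the example above):

```python
import json

# Envelope required by JsonConverter when schemas.enable=true
envelope = {
    "schema": {
        "type": "struct",
        "fields": [
            {"type": "boolean", "optional": True, "field": "booleanField"},
            {"type": "int32", "optional": True, "field": "intField"},
            {"type": "int64", "optional": True, "field": "longField"},
            {"type": "string", "optional": False, "field": "stringField"},
        ],
    },
    "payload": {
        "booleanField": True,
        "intField": 88,
        "longField": 32,
        "stringField": "str",
    },
}

# This byte string is what you would produce to the topic.
message = json.dumps(envelope).encode("utf-8")
```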

 

4. io.confluent.connect.storage.errors.HiveMetaStoreException: Hive MetaStore exception

 

The final Caused by is:

 

Caused by: InvalidObjectException(message:default.test_hdfs table not found)

 

Fix:

 

posted on 2019-09-12 19:25 by mylittlecabin
