Sink to ES

Flink runs a stream-analytics job: the job ingests a data stream, applies transformations to analyze, transform, and model the data in motion, and writes the results to an Elasticsearch index. Kibana then connects to that index and queries it for the data to visualize.

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch.RequestIndexer;
// the package below depends on the connector version (elasticsearch6 / elasticsearch7)
import org.apache.flink.streaming.connectors.elasticsearch7.ElasticsearchSink;
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class EsDemo {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Read lines from a socket source
        DataStreamSource<String> source = env.socketTextStream("192.168.21.128", 8888);

        // Drop lines containing "hello"
        DataStream<String> filterSource = source.filter(new FilterFunction<String>() {
            @Override
            public boolean filter(String value) throws Exception {
                return !value.contains("hello");
            }
        });

        // Parse "userId,username" lines into User objects
        DataStream<User> transSource = filterSource.map(value -> {
            String[] fields = value.split(",");
            return new User(fields[0], fields[1]);
        });

        List<HttpHost> hosts = new ArrayList<>();
        hosts.add(new HttpHost("192.168.21.128", 9200, "http"));
        ElasticsearchSink.Builder<User> userBuilder = new ElasticsearchSink.Builder<>(hosts, new ElasticsearchSinkFunction<User>() {
            @Override
            public void process(User user, RuntimeContext runtimeContext, RequestIndexer requestIndexer) {
                // Build the document and queue an index request against the "flink-user" index
                Map<String, String> jsonMap = new HashMap<>();
                jsonMap.put("userId", user.getUserId());
                jsonMap.put("name", user.getUsername());
                IndexRequest indexRequest = Requests.indexRequest();
                indexRequest.index("flink-user");
                indexRequest.source(jsonMap);
                requestIndexer.add(indexRequest);
            }
        });

        // Flush every single record immediately (see the note below)
        userBuilder.setBulkFlushMaxActions(1);
        transSource.addSink(userBuilder.build());

        env.execute("flink-es");
    }
}
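The example assumes a `User` POJO with `getUserId()` and `getUsername()` accessors, which the original post does not show. A minimal sketch (field names are assumptions inferred from the getters used in the sink):

```java
// Hypothetical User POJO matching the accessors used above.
// The no-arg constructor and getters/setters let Flink treat it as a POJO type.
public class User {
    private String userId;
    private String username;

    public User() {}

    public User(String userId, String username) {
        this.userId = userId;
        this.username = username;
    }

    public String getUserId() { return userId; }
    public void setUserId(String userId) { this.userId = userId; }
    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
}
```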

Note:

  userBuilder.setBulkFlushMaxActions(1);

Setting this parameter to 1 means every record is flushed to Elasticsearch as soon as it arrives, instead of being buffered until a batch of events has accumulated.
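Flushing per record is convenient for a demo but costly in production. The same builder exposes other batching knobs; a sketch of common settings (values here are illustrative, and method availability should be checked against your connector version):

```java
// Batching configuration on the ElasticsearchSink.Builder:
userBuilder.setBulkFlushMaxActions(1000);   // flush after 1000 buffered requests
userBuilder.setBulkFlushMaxSizeMb(5);       // ...or after 5 MB of buffered data
userBuilder.setBulkFlushInterval(1000L);    // ...or at least once per second
```

Whichever threshold is hit first triggers the bulk flush, so throughput-oriented jobs usually raise the action count and rely on the interval as a latency bound.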


posted on 2022-05-06 16:16  溪水静幽