Flink (4): Setting Up and Testing a Flink Development Environment in IDEA (1) The IDEA Development Environment

Posted on 2020-08-30 13:30 by MissRong

First, create a new project (we will write Flink programs in both Scala and Java).

Because Scala support needs to be loaded, refer to these posts when creating the project:

https://www.cnblogs.com/liuxinrong/articles/12972537.html

https://www.cnblogs.com/liuxinrong/articles/12969548.html

 

1. pom file setup

<properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.version>2.11.12</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
    <hadoop.version>2.8.4</hadoop.version>
    <flink.version>1.6.1</flink.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-scala_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.10_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.38</version>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>1.2.22</version>
    </dependency>
</dependencies>

<build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.0</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                    <configuration>
                        <args>
                            <!-- <arg>-make:transitive</arg> -->
                            <arg>-dependencyfile</arg>
                            <arg>${project.build.directory}/.scala_dependencies</arg>
                        </args>
                    </configuration>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.18.1</version>
            <configuration>
                <useFile>false</useFile>
                <disableXmlReport>true</disableXmlReport>
                <includes>
                    <include>**/*Test.*</include>
                    <include>**/*Suite.*</include>
                </includes>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.0.0</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <!-- strip signature files so the shaded jar is not rejected at runtime -->
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <transformers>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <!-- Replace this with your own entry class;
                                     org.apache.spark.WordCount looks like a leftover from a Spark template. -->
                                <mainClass>org.apache.spark.WordCount</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
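
With this build section in place, packaging the project should produce a shaded (fat) jar under target/ that can be submitted to a cluster. A typical invocation (plain Maven, nothing project-specific assumed):

mvn clean package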

A small notification will appear in the bottom-right corner: "Maven projects need to be imported". Click the second option, Enable Auto-Import (do this even if nothing is flagged in red).

2. Flink development workflow

Flink has the special classes DataSet and DataStream to represent data in programs.

You can think of them as immutable collections of data that may contain duplicates.

While the data in a DataSet is finite, the number of elements in a DataStream can be unbounded.

These collections differ from regular Java collections in some key ways. First, they are immutable: once created, you cannot add or remove elements. You also cannot simply inspect the elements inside.

A collection is initially created by adding a source to a Flink program, and new collections are derived from it by applying API methods such as map, filter, and so on (a tiny sketch of this idea follows).
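
A minimal sketch (Scala API; the values are illustrative):

import org.apache.flink.streaming.api.scala._

val env = StreamExecutionEnvironment.getExecutionEnvironment
// a collection is created by adding a source...
val source = env.fromElements("1", "2", "3")
// ...and new collections are derived from it; `source` itself is never modified
val derived = source.map(_.toInt).filter(_ > 1)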

Flink programs look like regular programs that transform collections of data.

Each program consists of the same basic parts:

1. Obtain the execution environment:

final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

2. Load/create the initial data:

DataStream<String> text = env.readTextFile("file:///path/to/file");

3. Specify transformations on this data (this snippet is Scala; input stands for any DataStream):

val mapped = input.map { x => x.toInt }

4. Specify where to put the results of the computation:

writeAsText(String path)

print()

5. Trigger program execution (a complete example combining all five steps is sketched after the submission command below).

In local mode, the program is executed with:

execute()

To run on a cluster instead, package the program as a jar and submit it:

./bin/flink run \
  -m node21:8081 \
  ./examples/batch/WordCount.jar \
  --input hdfs:///user/itstar/input/wc.txt \
  --output hdfs:///user/itstar/output2
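
To make the five steps concrete, here is a minimal streaming WordCount sketch in Scala (my own sketch, not code from the original post; the socket host and port are illustrative):

import org.apache.flink.streaming.api.scala._

object SocketWordCount {
  def main(args: Array[String]): Unit = {
    // 1. obtain the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // 2. load/create the initial data (source): lines from a TCP socket
    val text: DataStream[String] = env.socketTextStream("node21", 9999)

    // 3. transform the data
    val counts = text
      .flatMap(_.toLowerCase.split("\\s+"))
      .filter(_.nonEmpty)
      .map((_, 1))
      .keyBy(0)
      .sum(1)

    // 4. specify where to put the results (sink)
    counts.print()

    // 5. trigger execution
    env.execute("Socket WordCount")
  }
}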

------ Notes ------

Every Flink program contains the following steps:

1. Obtain the execution environment
2. Load/create the initial data (source)
3. Transform the data (transformation)
4. Specify where to put the results (sink)
5. Trigger program execution

1. Environment

The execution environment StreamExecutionEnvironment is the foundation of all Flink programs.

There are three ways to create an execution environment (a short sketch follows the list):

StreamExecutionEnvironment.getExecutionEnvironment — decides for us whether to run locally or on a cluster
StreamExecutionEnvironment.createLocalEnvironment — execute locally
StreamExecutionEnvironment.createRemoteEnvironment — execute on a cluster
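
A minimal sketch of the three variants (Scala; the host, port, and jar path are placeholders, not values from this post):

import org.apache.flink.streaming.api.scala._

// picks a local or cluster environment depending on the calling context
val env = StreamExecutionEnvironment.getExecutionEnvironment

// always local, optionally with an explicit parallelism
val localEnv = StreamExecutionEnvironment.createLocalEnvironment(2)

// always remote: JobManager host/port plus the jar(s) to ship
val remoteEnv = StreamExecutionEnvironment.createRemoteEnvironment(
  "node21", 8081, "path/to/your-job.jar")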

2. Loading/creating the initial data

StreamExecutionEnvironment provides a number of data-ingestion interfaces (sources); a few common ones are sketched below.
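
For example (Scala; env is the environment from step 1, and the file path and host/port are placeholders):

// read a text file line by line
val fileInput = env.readTextFile("file:///path/to/file")

// read lines from a TCP socket
val socketInput = env.socketTextStream("node21", 9999)

// build a stream from in-memory elements (handy for testing)
val elements = env.fromElements("a", "b", "c")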

3. Transforming data

Transformations in Flink are implemented by different operations (operators).

Each operation defines its data-processing logic internally by implementing a Function interface; a small sketch follows.
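
A minimal sketch of the two equivalent styles (Scala, assuming the usual import org.apache.flink.streaming.api.scala._ and a DataStream[String] named text):

import org.apache.flink.api.common.functions.MapFunction

// explicit Function implementation
val lengths1 = text.map(new MapFunction[String, Int] {
  override def map(value: String): Int = value.length
})

// the usual shorthand: the lambda is wrapped into a MapFunction for you
val lengths2 = text.map(_.length)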

4. Emitting results

writeAsText, print

Flink also provides many connectors for interacting with external systems; a small sink sketch follows.
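
For example (Scala; counts is the stream from the WordCount sketch above, the output path, broker list, and topic name are illustrative, and the Kafka sink is shown only because flink-connector-kafka-0.10 is already in the pom):

// write each element's toString to a text file
counts.writeAsText("hdfs:///user/itstar/output2")

// print to stdout (on the TaskManagers when running on a cluster)
counts.print()

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010

// push formatted records to a Kafka 0.10 topic
counts.map(_.toString).addSink(
  new FlinkKafkaProducer010[String]("node21:9092", "wordcount", new SimpleStringSchema()))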

5. Triggering execution

Execution is triggered by calling the execute method of the ExecutionEnvironment.

In the DataSet API, sinks such as print() already include the execute call, so no explicit call is needed there; see the sketch below.
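
A minimal sketch of the difference (Scala):

import org.apache.flink.api.scala._          // DataSet (batch) API + TypeInformation implicits
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

// DataStream: nothing runs until execute() is called
val senv = StreamExecutionEnvironment.getExecutionEnvironment
senv.fromElements(1, 2, 3).print()
senv.execute("streaming job")                // required

// DataSet: print() itself triggers execution, no explicit execute needed
val benv = ExecutionEnvironment.getExecutionEnvironment
benv.fromElements(1, 2, 3).print()           // runs immediately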
