Java 17: the TL;DR edition
Remember to set the language level in your new development environment:

The dependency versions in the corresponding pom need to be upgraded in step:
<properties>
  <maven.compiler.release>17</maven.compiler.release>
  <flink.version>1.19.0</flink.version>
  <logback.version>1.4.14</logback.version>
  <junit.version>5.10.2</junit.version>
  <assertj.version>3.25.3</assertj.version>
  <testcontainers.version>1.19.7</testcontainers.version>
  <mockito.version>5.11.0</mockito.version>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<dependencies>
  <!-- Flink core -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
  </dependency>
  <!-- the Kafka connector is versioned separately from Flink core; 3.2.0-1.19 matches Flink 1.19 -->
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>3.2.0-1.19</version>
  </dependency>
  <!-- logging: logback -->
  <dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>${logback.version}</version>
  </dependency>
  <!-- test stack -->
  <dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>${junit.version}</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.assertj</groupId>
    <artifactId>assertj-core</artifactId>
    <version>${assertj.version}</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>kafka</artifactId>
    <version>${testcontainers.version}</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.mockito</groupId>
    <artifactId>mockito-core</artifactId>
    <version>${mockito.version}</version>
    <scope>test</scope>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- compile for 17 + preview features -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.12.0</version>
      <configuration>
        <release>17</release>
        <compilerArgs>
          <arg>--enable-preview</arg>
        </compilerArgs>
      </configuration>
    </plugin>
    <!-- test execution -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>3.2.5</version>
      <configuration>
        <argLine>--enable-preview</argLine>
      </configuration>
    </plugin>
    <!-- package a fat jar -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.5.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <transformers>
              <!-- merge SPI service files -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
              <!-- set the main class in the Manifest -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.yourcompany.JobMain</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
Text Blocks
Especially useful for SQL:
String ddl = """
        CREATE TABLE kafka_source (
            user_id STRING,
            amount DOUBLE,
            ts TIMESTAMP(3),
            WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
        ) WITH (
            'connector' = 'kafka',
            'topic' = 'user_log',
            'properties.bootstrap.servers' = '%s',
            'format' = 'json'
        )
        """.formatted("localhost:9092");
tableEnv.executeSql(ddl);
JSON:
String payload = """
        {
          "jobName": "%s",
          "vertices": [
            {
              "id": "%s",
              "parallelism": %d
            }
          ]
        }
        """.formatted(jobName, vertexId, parallelism);
Records: record
public record UserCategory(String category, int score) {}
Declaring a record automatically generates, taking record Person(String name, int age) as the example:
- private final fields: name and age
- a public canonical constructor: Person(String name, int age)
- accessor methods (getters): name() and age() (note: not getName(); the accessor is named after the component)
- equals() and hashCode(): value-based, comparing all component fields
- toString(): formatted as Person[name=Alice, age=25]
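The generated members can be checked directly; a minimal sketch (the wrapper class name RecordDemo is just for illustration):

```java
public class RecordDemo {
    // the compiler generates the fields, constructor, accessors, equals/hashCode, and toString
    public record Person(String name, int age) {}

    public static void main(String[] args) {
        Person p = new Person("Alice", 25);
        System.out.println(p.name());                           // Alice (accessor, not getName())
        System.out.println(p);                                  // Person[name=Alice, age=25]
        System.out.println(p.equals(new Person("Alice", 25)));  // true: value-based equality
    }
}
```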
Using records in Flink:
// define a record as the data model
public record User(Long id, String name, String email, LocalDateTime createdAt) {}

// use it in a Flink job
DataStream<User> users = env.fromElements(
        new User(1L, "Alice", "alice@example.com", LocalDateTime.now()),
        new User(2L, "Bob", "bob@example.com", LocalDateTime.now())
);

// Flink 1.19+ serializes records with the built-in PojoSerializer; earlier versions fall back to Kryo
users
        .keyBy(User::id)  // key by id, using the record accessor as the key selector
        .window(TumblingProcessingTimeWindows.of(Time.seconds(5)))
        .sum("id")        // field expressions work: Flink resolves the record accessor by reflection
        .print();
switch:
Switch can now be used far more flexibly as an expression, and supports enum, String, int, and more:
public static void main(String[] args) {
    String connector = "kafka";
    int code = switch (connector) {
        case "kafka" -> 1;
        case "jdbc" -> 2;
        default -> 3;
    };
    System.out.println(code); // 1
}
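The same switch expression works over an enum, where the compiler also checks exhaustiveness, so no default branch is needed once every constant is covered (the Connector enum below is made up for illustration):

```java
public class SwitchEnumDemo {
    public enum Connector { KAFKA, JDBC, FILESYSTEM }

    public static int code(Connector c) {
        // exhaustive over the enum: adding a constant without a branch fails compilation
        return switch (c) {
            case KAFKA -> 1;
            case JDBC -> 2;
            case FILESYSTEM -> 3;
        };
    }

    public static void main(String[] args) {
        System.out.println(code(Connector.JDBC)); // 2
    }
}
```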
toList() instead of collect(...): (note: it returns an immutable List)
List<String> a = List.of("a", "b", "c");
List<String> list = a.stream().map(String::toUpperCase).toList();
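A quick sketch confirming that the list returned by toList() is indeed unmodifiable:

```java
import java.util.List;

public class ToListDemo {
    public static void main(String[] args) {
        List<String> list = List.of("a", "b", "c").stream()
                .map(String::toUpperCase)
                .toList();
        System.out.println(list); // [A, B, C]
        try {
            list.add("D"); // mutation is rejected
        } catch (UnsupportedOperationException ex) {
            System.out.println("immutable");
        }
    }
}
```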
Sealed types: sealed class / interface
public sealed interface Event
        permits OrderEvent, PaymentEvent, LoginEvent {
    long timestamp();
}
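Records pair naturally with a sealed interface as its permitted implementations; a minimal sketch (the component names such as userId and payId are chosen to match the pattern-matching examples below):

```java
public class SealedEventsDemo {
    // when all subtypes live in the same compilation unit, permits is optional but documents intent
    public sealed interface Event permits OrderEvent, PaymentEvent, LoginEvent {
        long timestamp();
    }

    // each record's timestamp component accessor satisfies the interface's timestamp() method
    public record OrderEvent(String userId, double amount, long timestamp) implements Event {}
    public record PaymentEvent(String payId, double amt, long timestamp) implements Event {}
    public record LoginEvent(String userId, long timestamp) implements Event {}

    public static void main(String[] args) {
        Event e = new OrderEvent("u1", 9.9, 1700000000000L);
        System.out.println(e.timestamp()); // 1700000000000
    }
}
```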
Pattern matching:
Old style:
@Override
public void processElement(Event e, Context ctx, Collector<String> out) {
    if (e instanceof OrderEvent) {
        OrderEvent oe = (OrderEvent) e;
        out.collect("order:" + oe.userId());
    } else if (e instanceof PaymentEvent) {
        PaymentEvent pe = (PaymentEvent) e;
        out.collect("payment:" + pe.payId());
    }
}
New style (Java 17's pattern matching for instanceof binds the cast in one step):
@Override
public void processElement(Event e, Context ctx, Collector<String> out) {
    if (e instanceof OrderEvent oe) {
        out.collect("order:" + oe.userId());
    } else if (e instanceof PaymentEvent pe) {
        out.collect("payment:" + pe.payId());
    }
}
Record deconstruction patterns such as e instanceof OrderEvent(var userId, var amount) go one step further, but they only previewed in Java 19/20 and were finalized in Java 21, so they are not available on Java 17.
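Combining sealed types with pattern matching for switch gives compiler-checked exhaustiveness: no default branch, and adding a new event type breaks compilation until it is handled. Note this switch form is still a preview feature on Java 17 (hence the --enable-preview flags in the pom) and was finalized in Java 21. A self-contained sketch with made-up event records:

```java
public class SwitchPatternDemo {
    public sealed interface Event permits OrderEvent, PaymentEvent, LoginEvent { long timestamp(); }
    public record OrderEvent(String userId, double amount, long timestamp) implements Event {}
    public record PaymentEvent(String payId, double amt, long timestamp) implements Event {}
    public record LoginEvent(String userId, long timestamp) implements Event {}

    // no default branch: the sealed hierarchy makes the switch exhaustive
    public static String describe(Event e) {
        return switch (e) {
            case OrderEvent oe   -> "order:" + oe.userId();
            case PaymentEvent pe -> "payment:" + pe.payId();
            case LoginEvent le   -> "login:" + le.userId();
        };
    }

    public static void main(String[] args) {
        System.out.println(describe(new OrderEvent("u1", 9.9, 0L))); // order:u1
    }
}
```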
