Flink Basics (139): DataStream and Table Conversion (5) — Handling of (Insert-Only) Streams (4): toDataStream
2021/8/29 23:36:31
This article introduces Flink Basics (139): DataStream and Table Conversion (5) — Handling of (Insert-Only) Streams (4): toDataStream. It should serve as a useful reference for readers working on this part of the Table API.
The following code shows how to use toDataStream for different scenarios.
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Table;
import org.apache.flink.types.Row;

import java.time.Instant;

// POJO with mutable fields
// since no fully assigning constructor is defined, the field order
// is alphabetical [event_time, name, score]
public static class User {
  public String name;
  public Integer score;
  public Instant event_time;
}

tableEnv.executeSql(
  "CREATE TABLE GeneratedTable "
  + "("
  + "  name STRING,"
  + "  score INT,"
  + "  event_time TIMESTAMP_LTZ(3),"
  + "  WATERMARK FOR event_time AS event_time - INTERVAL '10' SECOND"
  + ")"
  + "WITH ('connector'='datagen')");

Table table = tableEnv.from("GeneratedTable");

// === EXAMPLE 1 ===
// use the default conversion to instances of Row
// since `event_time` is a single rowtime attribute, it is inserted into the DataStream
// metadata and watermarks are propagated
DataStream<Row> dataStream = tableEnv.toDataStream(table);

// === EXAMPLE 2 ===
// a data type is extracted from class `User`,
// the planner reorders fields and inserts implicit casts where possible to convert internal
// data structures to the desired structured type
// since `event_time` is a single rowtime attribute, it is inserted into the DataStream
// metadata and watermarks are propagated
DataStream<User> dataStream = tableEnv.toDataStream(table, User.class);

// data types can be extracted reflectively as above or explicitly defined
DataStream<User> dataStream = tableEnv.toDataStream(
    table,
    DataTypes.STRUCTURED(
        User.class,
        DataTypes.FIELD("name", DataTypes.STRING()),
        DataTypes.FIELD("score", DataTypes.INT()),
        DataTypes.FIELD("event_time", DataTypes.TIMESTAMP_LTZ(3))));
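The snippet above assumes a table environment named tableEnv already exists. A minimal sketch of that setup (not part of the original example) creates a StreamTableEnvironment on top of the DataStream API environment, which is the bridge that exposes toDataStream:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

// obtain the DataStream API execution environment
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// wrap it in a StreamTableEnvironment so Table results can be converted back to DataStreams
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);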
Note that only non-updating tables are supported by toDataStream. Usually, time-based operations such as windows, interval joins, or the MATCH_RECOGNIZE clause are a good fit for insert-only pipelines, next to simple operations like projections and filters. An illustrative sketch follows below.
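As a minimal sketch (not from the original article, identifier names are illustrative): a tumbling-window aggregation on the rowtime attribute event_time produces an insert-only result, so it can still be passed to toDataStream, whereas a non-windowed GROUP BY would produce an updating table and be rejected.

// window aggregation on the rowtime attribute stays insert-only
Table windowed = tableEnv.sqlQuery(
    "SELECT name, "
    + "  TUMBLE_END(event_time, INTERVAL '1' MINUTE) AS window_end, "
    + "  SUM(score) AS total_score "
    + "FROM GeneratedTable "
    + "GROUP BY name, TUMBLE(event_time, INTERVAL '1' MINUTE)");

// the windowed result is non-updating, so toDataStream accepts it
DataStream<Row> aggregatedStream = tableEnv.toDataStream(windowed);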
This concludes the article on Flink Basics (139): DataStream and Table Conversion (5) — Handling of (Insert-Only) Streams (4): toDataStream. We hope it is helpful.