NO.Z.00077 — Programming — Hadoop & Real-Time Data Warehouse V03
2022/4/16 17:12:38
[BigDataHadoop: Hadoop & Real-Time Data Warehouse V03] [BigDataHadoop e-commerce real-time data warehouse project] [Chapter 5 | Hadoop | Real-Time Data Warehouse: running the real-time data warehouse program on the Hadoop cluster, V3]
1. Running OrderStatistics
### --- Copy the fully qualified class name
~~~     # Select the class name and use Copy Reference:
dw.dws.OrderStatistics
### --- Run the main class
~~~     # Submit the main class OrderStatistics to the YARN cluster:
[root@hadoop02 ~]# /opt/yanqi/servers/flink-1.11.1/bin/flink run -m yarn-cluster -c dw.dws.OrderStatistics /root/README/myjar/ebProject/EbProject-1.0-SNAPSHOT.jar
~~~     # Output:
[] - Submitting application master application_1638525439341_0003
[] - Submitted application application_1638525439341_0003
[] - Waiting for the cluster to be allocated
[] - Deploying cluster, current state ACCEPTED
[] - YARN application has been deployed successfully.
[] - Found Web Interface hadoop03:42209 of application 'application_1638525439341_000
Job has been submitted with JobID b7ed68c9d77f357b7cb75da37951a2f1
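The internals of `dw.dws.OrderStatistics` are not shown in this walkthrough. As a purely illustrative sketch (the function name, the chosen fields, and the per-status grouping are all assumptions, not taken from the actual job), the kind of rolling aggregation an order-statistics job maintains can be expressed in plain Python:

```python
# Hypothetical sketch: fold a stream of order events into per-status
# statistics (order count and accumulated totalMoney). The real job runs
# this kind of aggregation continuously inside Flink.
from collections import defaultdict

def aggregate_orders(orders):
    """Group order events by status, counting orders and summing revenue."""
    stats = defaultdict(lambda: {"count": 0, "totalMoney": 0.0})
    for order in orders:
        bucket = stats[order["status"]]
        bucket["count"] += 1
        bucket["totalMoney"] += float(order["totalMoney"])
    return dict(stats)

# A sample event shaped like the yanqi_trade_orders row used in this guide.
sample = [{"orderId": "21", "status": "2", "totalMoney": "20792.0"}]
print(aggregate_orders(sample))
# {'2': {'count': 1, 'totalMoney': 20792.0}}
```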
### --- Insert data into MySQL: yanqi_trade_orders
~~~     # Insert a test row into the yanqi_trade_orders table:
mysql> USE dwshow;
mysql> INSERT INTO `yanqi_trade_orders` VALUES ('21', '23a0b124546', '70', '2', '0.12', '20792.00', '1', '0', '370211', '0', '0', '0', '1', '2020-06-28 18:15:02', '2020-06-28 18:15:02', '2020-10-21 22:56:37');

2. Checking Kafka consumption and the HBase sink
### --- The Kafka consumer receives the data
~~~     # Consume the change event from Kafka:
[root@hadoop02 ~]# kafka-console-consumer.sh --zookeeper hadoop02:2181/myKafka --topic canal --from-beginning
~~~     # Wait for the consumed message:
{"data":[{"orderId":"21","orderNo":"23a0b124546","userId":"70","status":"2","productMoney":"0.12","totalMoney":"20792.0","payMethod":"1","isPay":"0","areaId":"370211","tradeSrc":"0","tradeType":"0","isRefund":"0","dataFlag":"1","createTime":"2020-06-28 18:15:02","payTime":"2020-06-28 18:15:02","modifiedTime":"2020-10-21 22:56:37"}],"database":"dwshow","es":1638117602000,"id":38,"isDdl":false,"mysqlType":{"orderId":"bigint(11)","orderNo":"varchar(20)","userId":"bigint(11)","status":"tinyint(4)","productMoney":"decimal(11,2)","totalMoney":"decimal(11,2)","payMethod":"tinyint(4)","isPay":"tinyint(4)","areaId":"int(11)","tradeSrc":"tinyint(4)","tradeType":"int(11)","isRefund":"tinyint(4)","dataFlag":"tinyint(4)","createTime":"varchar(25)","payTime":"varchar(25)","modifiedTime":"timestamp"},"old":null,"pkNames":["orderId"],"sql":"","sqlType":{"orderId":-5,"orderNo":12,"userId":-5,"status":-6,"productMoney":3,"totalMoney":3,"payMethod":-6,"isPay":-6,"areaId":4,"tradeSrc":-6,"tradeType":4,"isRefund":-6,"dataFlag":-6,"createTime":12,"payTime":12,"modifiedTime":93},"table":"yanqi_trade_orders","ts":1638117602172,"type":"INSERT"}
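The HBase sink's source code is not shown here, but the mapping it performs is visible from the Canal message and the scan results: the row key is the primary key (`orderId`) and every field lands under column family `f1`. A minimal Python sketch of that mapping, assuming this shape (`canal_to_hbase_cells` is a hypothetical helper name, and the message below is an abbreviated version of the real one):

```python
import json

# Abbreviated Canal INSERT message, as consumed from the 'canal' topic.
canal_message = '''{"data":[{"orderId":"21","orderNo":"23a0b124546",
"status":"2","totalMoney":"20792.0"}],"database":"dwshow",
"table":"yanqi_trade_orders","pkNames":["orderId"],"type":"INSERT"}'''

def canal_to_hbase_cells(message):
    """Turn one Canal INSERT message into (rowkey, column, value) cells.

    Mirrors the behaviour seen in the HBase scan output:
    rowkey = primary key value, every field stored under family f1.
    """
    event = json.loads(message)
    if event["type"] != "INSERT":
        return []  # updates/deletes would need their own handling
    cells = []
    for row in event["data"]:
        rowkey = row[event["pkNames"][0]]
        for field, value in row.items():
            cells.append((rowkey, "f1:" + field, value))
    return cells

for cell in canal_to_hbase_cells(canal_message):
    print(cell)
```

Each printed tuple corresponds to one `column=f1:...` line in the `scan` output below.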
### --- Check the HBase sink: data landed in yanqi_trade_orders
~~~     # Fetch the data from the HBase table:
[root@hadoop02 ~]# /opt/yanqi/servers/hbase-1.3.1/bin/hbase shell
hbase(main):017:0> scan 'yanqi_trade_orders'
ROW    COLUMN+CELL
 21    column=f1:areaId, timestamp=1638117612278, value=370211
 21    column=f1:createTime, timestamp=1638117612278, value=2020-06-28 18:15:02
 21    column=f1:dataFlag, timestamp=1638117612278, value=1
 21    column=f1:isPay, timestamp=1638117612278, value=0
 21    column=f1:isRefund, timestamp=1638117612278, value=0
 21    column=f1:modifiedTime, timestamp=1638117612278, value=2020-10-21 22:56:37
 21    column=f1:orderId, timestamp=1638117612278, value=21
 21    column=f1:orderNo, timestamp=1638117612278, value=23a0b124546
 21    column=f1:payMethod, timestamp=1638117612278, value=1
 21    column=f1:payTime, timestamp=1638117612278, value=2020-06-28 18:15:02
 21    column=f1:productMoney, timestamp=1638117612278, value=0.12
 21    column=f1:status, timestamp=1638117612278, value=2
 21    column=f1:totalMoney, timestamp=1638117612278, value=20792.0
 21    column=f1:tradeSrc, timestamp=1638117612278, value=0
 21    column=f1:tradeType, timestamp=1638117612278, value=0
 21    column=f1:userId, timestamp=1638117612278, value=70

3. Checking data visualization: observing the data changes, and the Flink runtime metrics in Grafana
Check whether Redis has received the data.
Check whether Prometheus is scraping the metrics.
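For Prometheus (and therefore Grafana) to see Flink's runtime metrics, the cluster's `flink-conf.yaml` must enable a metrics reporter. A typical Flink 1.11 fragment looks like the following; the port range is an assumed value, not taken from this walkthrough, and the `flink-metrics-prometheus` jar must also be copied from Flink's `opt/` into `lib/`:

```yaml
# flink-conf.yaml — enable the Prometheus metrics reporter (Flink 1.11)
metrics.reporter.prom.class: org.apache.flink.metrics.prometheus.PrometheusReporter
# Each JobManager/TaskManager binds a free port from this range (assumed value):
metrics.reporter.prom.port: 9250-9260
```

Prometheus then needs a scrape job whose targets cover those ports on the cluster hosts (hadoop02, hadoop03, ...), and Grafana reads from Prometheus as a data source.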
===============================END===============================
Walter Savage Landor: I strove with none, for none was worth my strife. Nature I loved and, next to Nature, Art: I warm'd both hands before the fire of life. It sinks, and I am ready to depart. ——W.S.Landor