Hive: multi-table insert from a single source table

Overview

Hive's multi-insert syntax scans a source table once and writes the results into several target tables within a single statement. The script below creates a source table, loads data into it, creates three target tables, and then populates all three with one FROM ... INSERT ... INSERT statement.

vim hive-multi_insert.sql

-- print column headers in query output
set hive.cli.print.header=true;
-- create the source table
create external table if not exists multi_insert_total_tb(
stu_id int comment 'the id of a student',
stu_name string,
stu_grade string,
stu_sex string,
stu_score int,
roll_year string)
row format delimited
fields terminated by ',';
load data local inpath '/var/lib/hadoop-hdfs/hive-app-data/multi_insert_total_tb.dat' overwrite into table multi_insert_total_tb;
select * from multi_insert_total_tb;
-- create target table 1
create external table if not exists multi_insert_tb01(
stu_id int comment 'the id of a student',
stu_name string,
stu_sex string,
stu_addr string)
row format delimited
fields terminated by ',';
-- create target table 2
create external table if not exists multi_insert_tb02(
stu_id int comment 'the id of a student',
stu_grade string,
stu_score int)
row format delimited
fields terminated by ',';
-- create target table 3
create external table if not exists multi_insert_tb03(
stu_id int comment 'the id of a student',
roll_year string)
row format delimited
fields terminated by ',';
-- multi-table insert from the single source table
from multi_insert_total_tb
insert into table multi_insert_tb01 select stu_id,stu_name,stu_sex,null
insert into table multi_insert_tb02 select stu_id,stu_grade,stu_score
insert into table multi_insert_tb03 select stu_id,roll_year;
-- inspect the three target tables
select * from multi_insert_tb01;
select * from multi_insert_tb02;
select * from multi_insert_tb03;
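
The data file itself is not shown in the original post, but its five rows can be read back from the first select in the execution results below. A comma-delimited file like the following would reproduce those results (a reconstruction from the query output, not the original file):

92103,xiaoming,B,m,78,2015
92115,xiaohua,A,m,82,2005
92122,xiaohong,D,f,58,2013
92133,xiaodu,A,f,98,2003
92117,xiaoli,B,m,74,2017

Note that multi_insert_tb01 has a stu_addr column that does not exist in the source, so the multi-insert selects a literal null for it.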

Execution results

[hdfs@cdh04 hive-app-data]$ hive -f hive-multi_insert.sql
OK
Time taken: 3.008 seconds
Loading data to table default.multi_insert_total_tb
Table default.multi_insert_total_tb stats: [numFiles=1, totalSize=135]
OK
Time taken: 0.832 seconds
OK
multi_insert_total_tb.stu_id	multi_insert_total_tb.stu_name	multi_insert_total_tb.stu_grade	multi_insert_total_tb.stu_sex	multi_insert_total_tb.stu_score	multi_insert_total_tb.roll_year
92103	xiaoming	B	m	78	2015
92115	xiaohua	A	m	82	2005
92122	xiaohong	D	f	58	2013
92133	xiaodu	A	f	98	2003
92117	xiaoli	B	m	74	2017
Time taken: 0.36 seconds, Fetched: 5 row(s)
OK
Time taken: 0.08 seconds
OK
Time taken: 0.157 seconds
OK
Time taken: 0.156 seconds
Query ID = hdfs_20200705124040_106693fc-a4da-47cc-bc6a-aa238444538e
Total jobs = 7
Launching Job 1 out of 7
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1593909553778_0001, Tracking URL = http://cdh01:8088/proxy/application_1593909553778_0001/
Kill Command = /opt/cloudera/parcels/CDH-5.16.2-1.cdh5.16.2.p0.8/lib/hadoop/bin/hadoop job -kill job_1593909553778_0001
Hadoop job information for Stage-3: number of mappers: 1; number of reducers: 0
2020-07-05 12:40:38,492 Stage-3 map = 0%,  reduce = 0%
2020-07-05 12:40:48,008 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 2.68 sec
MapReduce Total cumulative CPU time: 2 seconds 680 msec
Ended Job = job_1593909553778_0001
Stage-6 is selected by condition resolver.
Stage-5 is filtered out by condition resolver.
Stage-7 is filtered out by condition resolver.
Stage-12 is selected by condition resolver.
Stage-11 is filtered out by condition resolver.
Stage-13 is filtered out by condition resolver.
Stage-18 is selected by condition resolver.
Stage-17 is filtered out by condition resolver.
Stage-19 is filtered out by condition resolver.
Moving data to: hdfs://cdh01:8020/user/hive/warehouse/multi_insert_tb01/.hive-staging_hive_2020-07-05_12-40-27_042_6368299300154222600-1/-ext-10000
Moving data to: hdfs://cdh01:8020/user/hive/warehouse/multi_insert_tb02/.hive-staging_hive_2020-07-05_12-40-27_042_6368299300154222600-1/-ext-10002
Moving data to: hdfs://cdh01:8020/user/hive/warehouse/multi_insert_tb03/.hive-staging_hive_2020-07-05_12-40-27_042_6368299300154222600-1/-ext-10004
Loading data to table default.multi_insert_tb01
Loading data to table default.multi_insert_tb02
Loading data to table default.multi_insert_tb03
Table default.multi_insert_tb01 stats: [numFiles=1, numRows=5, totalSize=95, rawDataSize=90]
Table default.multi_insert_tb02 stats: [numFiles=1, numRows=5, totalSize=55, rawDataSize=50]
Table default.multi_insert_tb03 stats: [numFiles=1, numRows=5, totalSize=55, rawDataSize=50]
MapReduce Jobs Launched:
Stage-Stage-3: Map: 1   Cumulative CPU: 2.68 sec   HDFS Read: 6566 HDFS Write: 448 SUCCESS
Total MapReduce CPU Time Spent: 2 seconds 680 msec
OK
stu_id	roll_year
Time taken: 23.311 seconds
OK
multi_insert_tb01.stu_id	multi_insert_tb01.stu_name	multi_insert_tb01.stu_sex	multi_insert_tb01.stu_addr
92103	xiaoming	m	NULL
92115	xiaohua	m	NULL
92122	xiaohong	f	NULL
92133	xiaodu	f	NULL
92117	xiaoli	m	NULL
Time taken: 0.151 seconds, Fetched: 5 row(s)
OK
multi_insert_tb02.stu_id	multi_insert_tb02.stu_grade	multi_insert_tb02.stu_score
92103	B	78
92115	A	82
92122	D	58
92133	A	98
92117	B	74
Time taken: 0.072 seconds, Fetched: 5 row(s)
OK
multi_insert_tb03.stu_id	multi_insert_tb03.roll_year
92103	2015
92115	2005
92122	2013
92133	2003
92117	2017
Time taken: 0.061 seconds, Fetched: 5 row(s)
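
The job summary above shows the payoff of the multi-insert form: one map-only job (Stage-3, a single mapper and no reducers) scanned the source table once and fed all three target tables. One caveat: because the script uses insert into, re-running it appends duplicate rows to the target tables. If re-runs should replace the data instead, the same FROM block also accepts insert overwrite. A minimal variant (same tables as above, not executed in this post):

from multi_insert_total_tb
insert overwrite table multi_insert_tb01 select stu_id,stu_name,stu_sex,null
insert overwrite table multi_insert_tb02 select stu_id,stu_grade,stu_score
insert overwrite table multi_insert_tb03 select stu_id,roll_year;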

 
