Overview
Business scenario
When doing data analysis, we often run hive -e "sql" > xxx.txt, or use the plain Hive command line, to get query results and then load them into Excel or a similar tool. But when the query returns many columns, the pasted results frequently end up misaligned in Excel, which is quite a headache.
Solution 1: Replace the output delimiter with a Linux pipe
Example:
# The Hive CLI separates output columns with tab characters, so replace \t with a comma
# Method 1: sed
hive -e "select * from pms.pms_algorithm_desc" | sed 's/\t/,/g' > ./aaa.txt
# Method 2: tr
hive -e "select * from pms.pms_tp_config" | tr "\t" ","
Result:
$ cat aaa.txt
id,algorithm_id,algorithm_name,algorithm_desc,is_delete,update_time
1,0,默认,,0,2015-11-02 18:14:25.0
2,1,相关,相关分类或者买了还买,0,2015-11-02 18:14:25.0
3,2,相似,,0,2015-11-02 18:14:25.0
4,3,购物车商品为空时类目热销,,0,2015-11-02 18:14:25.0
5,4,热销补余(销量,GMV),,0,2015-11-02 18:14:25.0
6,5,指定类目选品,APP首页价比JD低补余逻辑中指定CE类目选品,0,2015-11-02 18:14:25.0
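Note that a plain substitution does not quote fields, so a value that itself contains a comma (like the 热销补余(销量,GMV) row above) will shift columns when the file is opened as CSV. Below is a minimal sketch of a stricter variant, assuming awk is available and that the Hive CLI emits tab-separated columns; the output file name aaa.csv is just an example:
# Sketch: wrap every field in double quotes and escape embedded quotes,
# so commas inside values survive when the file is opened as CSV
hive -e "select * from pms.pms_algorithm_desc" \
  | awk 'BEGIN{FS="\t"; OFS=","} {for(i=1;i<=NF;i++){gsub(/"/,"\"\"",$i); $i="\"" $i "\""} print}' \
  > ./aaa.csv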
Solution 2: Use Hive's INSERT OVERWRITE DIRECTORY syntax
Example:
insert overwrite local directory '/home/pms/workspace/ouyangyewei/data/bi_lost'
row format delimited
fields terminated by ','
select xxxx
from xxxx;
The SQL above writes the query result into the /home/pms/workspace/ouyangyewei/data/bi_lost directory, with fields separated by commas.
Result:
$ ls ~/workspace/ouyangyewei/data/bi_lost
000000_0
$ cat ~/workspace/ouyangyewei/data/bi_lost/000000_0
125171836,11565,6225443584836
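If the query runs with more than one task writing output, Hive may leave several files (000000_0, 000001_0, and so on) in that directory. A minimal sketch of merging them into a single file, using the directory from the example above:
# Concatenate all result files from the local output directory into one file
cat ~/workspace/ouyangyewei/data/bi_lost/0* > ./bi_lost.csv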
The Apache Hive documentation describes this syntax as follows:
Standard syntax:
INSERT OVERWRITE [LOCAL] DIRECTORY directory1
[ROW FORMAT row_format] [STORED AS file_format] (Note: Only available starting with Hive 0.11.0)
SELECT ... FROM ...
Hive extension (multiple inserts):
FROM from_statement
INSERT OVERWRITE [LOCAL] DIRECTORY directory1 select_statement1
[INSERT OVERWRITE [LOCAL] DIRECTORY directory2 select_statement2] ...
row_format
: DELIMITED [FIELDS TERMINATED BY char [ESCAPED BY char]] [COLLECTION ITEMS TERMINATED BY char]
[MAP KEYS TERMINATED BY char] [LINES TERMINATED BY char]
[NULL DEFINED AS char] (Note: Only available starting with Hive 0.13)
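As a minimal sketch of the multiple-inserts extension above, one pass over a source table can feed several output directories. The table and columns come from the earlier example, while the /tmp output paths are placeholders; with no ROW FORMAT clause the fields fall back to Hive's default delimiter:
-- Hypothetical example: one scan of the source table feeds two local directories
FROM pms.pms_algorithm_desc t
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/algo_all'
SELECT t.id, t.algorithm_name
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/algo_active'
SELECT t.id, t.algorithm_name WHERE t.is_delete = 0;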