Spark performance tuning

1. Spark aggregation jobs fail because Hive's default Parquet chunk size and dynamic-partition limit are too small. Raise both on the HiveContext:
hiveContext.setConf("parquet.memory.min.chunk.size", "100000")
hiveContext.setConf("hive.exec.max.dynamic.partitions", "100000")
2. Loading Hive data into HBase throws an error ...

2023-12-07
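The two setConf calls above can be sketched in context as below. This is a minimal illustration only: it assumes a Spark 1.x-style HiveContext as used in the post (on Spark 2+ the same keys can be set through SparkSession.conf or spark.sql("SET ...")), and the table names are hypothetical.

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

object DynamicPartitionTuning {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext()
    val hiveContext = new HiveContext(sc)

    // Minimum memory reserved per Parquet column chunk by the Parquet memory manager
    hiveContext.setConf("parquet.memory.min.chunk.size", "100000")
    // Raise the dynamic-partition cap (Hive's default is only 1000 per job)
    hiveContext.setConf("hive.exec.max.dynamic.partitions", "100000")
    // Needed when every partition column is dynamic
    hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

    // Hypothetical tables, for illustration: write source_tbl into
    // target_tbl, partitioned dynamically by the dt column
    hiveContext.sql(
      """INSERT OVERWRITE TABLE target_tbl PARTITION (dt)
        |SELECT * FROM source_tbl""".stripMargin)
  }
}
```

Note the property name: the post's `hive.exec.max.dynamic.partions` is missing the "ti" and would be silently ignored by Hive; the correct key is `hive.exec.max.dynamic.partitions`.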