I'm 羞涩黄豆, a blogger at 靠谱客. This post covers the spark-sql query error "Invalid method name: 'get_table_req'", shared here in the hope that it can serve as a reference.

spark-sql> select * from zps_d001 limit 1;
Error in query: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table zps_xxx. Invalid method name: 'get_table_req'
org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table zps_xxx. Invalid method name: 'get_table_req'
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:112)
    ........................ (some log lines omitted) ..................
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table zps_d001. Invalid method name: 'get_table_req'
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1282)
    at org.apache.spark.sql.hive.client.HiveClientImpl.getRawTableOption(HiveClientImpl.scala:392)
    at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$tableExists$1(HiveClientImpl.scala:406)
    at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
    at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:291)
    at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:224)
    at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:223)
    at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:273)
    at org.apache.spark.sql.hive.client.HiveClientImpl.tableExists(HiveClientImpl.scala:406)
    at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$tableExists$1(HiveExternalCatalog.scala:854)
    at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
    ... 111 more

The root cause is a version incompatibility between Spark and Hive: the Spark distribution I installed bundles Hive 2.3.7 as its built-in metastore client, while the Hive running on my cluster is version 2.1.1.
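To confirm the mismatch, one quick check (a sketch, assuming a standard Spark binary distribution, where the bundled Hive client version appears in the jar file names) is to list the Hive jars shipped with Spark and compare them against the cluster's Hive version:

# Hive client jars bundled with the Spark distribution; the version is part of the jar names
ls $SPARK_HOME/jars | grep hive

# Hive version actually installed on the cluster
hive --version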

Solution:

bin/spark-sql --conf spark.sql.hive.metastore.version=2.1.1 --conf spark.sql.hive.metastore.jars=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hive/lib/*

Alternatively, put these two settings into the spark/conf/spark-defaults.conf file.
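As a sketch, the spark-defaults.conf entries would look roughly like this (the jar path below is the CDH 6.3.2 layout from the command above; adjust it to your own installation):

spark.sql.hive.metastore.version   2.1.1
spark.sql.hive.metastore.jars      /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hive/lib/*

With these in place, spark-sql loads the cluster's Hive 2.1.1 client jars instead of the built-in 2.3.7 ones, so the client no longer issues RPCs (such as get_table_req) that the 2.1.1 metastore does not understand.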

Finally

That is everything 羞涩黄豆 has collected on the spark-sql query error "Invalid method name: 'get_table_req'". For more spark-sql content, search the other articles on 靠谱客.
