Spark reading from HBase fails with: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

Overview

Original article: http://mangocool.com/1437009997261.html

I packaged the Spark application with sbt without bundling all of its dependencies into the jar. When the application was run on the cluster, the following exception was thrown:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
        at SparkHbase$.main(SparkHbase.scala:34)
        at SparkHbase.main(SparkHbase.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 11 more
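
HBaseConfiguration is typically the first HBase class a read job touches, which is why the failure surfaces at SparkHbase.scala:34. Below is a minimal sketch of the kind of code involved; the table name and job details are assumptions for illustration, not the original source:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object SparkHbase {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SparkHbase"))

    // If the HBase jars are not on the driver classpath, this call is where
    // NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration is thrown.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "test_table") // hypothetical table name

    // Read the table as an RDD of (rowkey, Result) pairs.
    val rdd = sc.newAPIHadoopRDD(
      hbaseConf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    println(s"row count: ${rdd.count()}")
    sc.stop()
  }
}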

The exception occurs because the HBase classes are missing from the Spark application's runtime classpath. My fix was to add the following line to spark/conf/spark-env.sh on the cluster:

export SPARK_CLASSPATH=/home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar
Remember that the jar paths must be separated by colons! Then run:
source spark-env.sh

Restart the Spark service and the job runs fine.
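
Since the root cause is that the sbt-built jar does not bundle the HBase classes, the classpath trick is essentially standing in for these dependencies. For reference, the declarations look roughly like this (versions are an assumption based on the jar names above); bundling them into the application jar, for example with sbt-assembly, would be another way to avoid touching the cluster classpath at all:

// build.sbt (sketch) — HBase dependencies the job needs at runtime;
// versions assumed from the cluster jars listed above.
libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client"   % "0.98.12-hadoop2",
  "org.apache.hbase" % "hbase-common"   % "0.98.12-hadoop2",
  "org.apache.hbase" % "hbase-server"   % "0.98.12-hadoop2",
  "org.apache.hbase" % "hbase-protocol" % "0.98.12-hadoop2"
)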

There is actually another way: pass the --driver-class-path option when submitting the application to set the driver's classpath:

./spark-submit --driver-class-path /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar  --class com.dtxy.data.SqlTest ../lib/bigdata-1.0-SNAPSHOT.jar

Note: do not set SPARK_CLASSPATH in spark/conf/spark-env.sh and also pass --driver-class-path when submitting the job. Doing both raises the following exception:

15/08/14 09:22:23 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
        at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
        at com.dtxy.data.SqlTest.main(SqlTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
        at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
        at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
        at com.dtxy.data.SqlTest.main(SqlTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO Utils: Shutdown hook called
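
The property named in the error, spark.driver.extraClassPath, is exactly what --driver-class-path sets. If you prefer, the same jar list can be configured once in spark/conf/spark-defaults.conf instead of on every submit; a sketch, reusing the jar locations above (again, do not combine it with SPARK_CLASSPATH):

# spark/conf/spark-defaults.conf — equivalent of --driver-class-path
spark.driver.extraClassPath /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar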

With that, the problem is solved!

