Overview
The official Hadoop site only provides a 32-bit build of hadoop-2.2.0.tar.gz. To deploy hadoop-2.2.0 on 64-bit Ubuntu, you need to recompile the source package to produce a 64-bit distribution.
It is recommended to run the following steps as root, to avoid permission problems.
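Before starting, it is worth confirming that the machine really is 64-bit. A minimal check (the `file` line assumes a prebuilt Hadoop tree is already unpacked, so the path is illustrative only):

```shell
# Print the kernel architecture; "x86_64" means a 64-bit system,
# which is the case this walkthrough addresses.
uname -m

# If a prebuilt 32-bit Hadoop is already unpacked, `file` on its
# native library shows the mismatch (hypothetical path):
# file hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
```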
Install the JDK
See the article "Installing the JDK on Ubuntu".
Install Maven
See the article "Installing Maven on Ubuntu".
Download the Hadoop source
```shell
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
```
Extract
```shell
tar -xzf hadoop-2.2.0-src.tar.gz
```
Compile the source
```shell
cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar
```
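Before kicking off a long build, it can save time to verify the basic toolchain is on the PATH. A small sketch (the tool list reflects the dependencies this walkthrough runs into; it is not an exhaustive prerequisite list):

```shell
# Report which build prerequisites are present; each missing entry
# predicts one of the build failures documented below.
for tool in java mvn gcc cmake protoc; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found:   $tool"
    else
        echo "missing: $tool"
    fi
done
```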
Build attempt 1: failed (bug in the Hadoop pom.xml)
Error message:
```
[ERROR] Failed to execute goal on project hadoop-auth: Could not resolve dependencies for project org.apache.hadoop:hadoop-auth:jar:2.2.0: Could not transfer artifact org.mortbay.jetty:jetty:jar:6.1.26 from/to central (https://repo.maven.apache.org/maven2): GET request of: org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar from central failed: SSL peer shut down incorrectly -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-auth
```
Solution:
This is a known Hadoop bug; applying the patch below to the pom.xml fixes it (see https://issues.apache.org/jira/browse/HADOOP-10110 for details).
Edit the `hadoop-common-project/hadoop-auth/pom.xml` file:
```shell
vi hadoop-common-project/hadoop-auth/pom.xml
```
Insert the following inside the `<dependencies>` element:
```xml
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
```
Build attempt 2: failed (protoc not installed)
Error message:
```
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.2.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common
```
Solution:
The error message shows that protoc is not installed, so build and install Protocol Buffers from source:
```shell
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar -xzf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make check
make install
```
Running the `./configure` step may fail with the following error:
```
checking whether to enable maintainer-specific portions of Makefiles... yes
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... no
checking for gcc... no
checking for cc... no
checking for cl.exe... no
configure: error: in `/home/hadoop/protobuf-2.5.0':
configure: error: no acceptable C compiler found in $PATH
See `config.log' for more details
```
The output says no C compiler was found, so a C compiler must be installed as well.
Ubuntu packages gcc and the related build tools as `build-essential`, which can be installed with a single command:
```shell
apt-get install build-essential
```
If packages cannot be found during installation, update the package index first:
```shell
apt-get update
```
After installing, verifying protobuf may fail with the following error:
```
$ protoc --version
protoc: error while loading shared libraries: libprotoc.so.8: cannot open shared object file: No such file or directory
```
Fix it as follows:
```shell
$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
$ protoc --version
libprotoc 2.5.0
```
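An `export` only lasts for the current shell session. To make the fix persistent, one option is to register /usr/local/lib with the dynamic linker; this is a sketch, and the conf file name `libprotobuf.conf` is an arbitrary choice, not a standard name:

```shell
# Option 1: tell the dynamic linker about /usr/local/lib (run as root).
echo "/usr/local/lib" > /etc/ld.so.conf.d/libprotobuf.conf
ldconfig

# Option 2: append the export to the shell profile instead.
# echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib' >> ~/.bashrc
```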
Build attempt 3: failed (cmake not installed)
Error message:
```
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hadoop/hadoop-2.2.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common
```
Solution:
```shell
apt-get install cmake
```
Build attempt 4: failed (libglib2.0-dev not installed)
Error message:
```
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-common
```
Solution:
```shell
apt-get install libglib2.0-dev
```
Build attempt 5: failed (libssl-dev not installed)
Error message:
```
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-pipes
```
Solution:
```shell
apt-get install libssl-dev
```
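In hindsight, the trial-and-error above can be avoided by installing the native-build dependencies up front. A sketch, using only the package names this walkthrough actually ran into (other setups may need more):

```shell
# Packages that the failed build attempts above turned out to need.
pkgs="build-essential cmake libglib2.0-dev libssl-dev"

# Refresh the package index, then install everything in one go.
apt-get update
apt-get install -y $pkgs
```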
Build attempt 6: success
```
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 13.578 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  5.183 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  9.527 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  1.268 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  4.717 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  9.966 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  7.368 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  3.971 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:27 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 14.996 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.078 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [02:32 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 30.260 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 19.083 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  8.313 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.071 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.542 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:07 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 48.948 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.314 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 18.413 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 23.891 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  5.687 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 24.345 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  0.721 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  8.261 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.168 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  6.632 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.261 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 40.147 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  3.497 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.164 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  6.054 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 29.892 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  5.450 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 18.558 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  9.045 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  7.740 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  2.819 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 12.523 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  5.321 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  8.999 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 13.044 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  3.739 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 11.307 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  8.223 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  6.296 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  6.341 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 14.662 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  2.694 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.063 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 44.996 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 16.908 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  5.014 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:23 min
[INFO] Finished at: 2014-10-04T14:54:28+08:00
[INFO] Final Memory: 69M/215M
[INFO] ------------------------------------------------------------------------
```
Build output
The build artifacts are in the `~/hadoop-2.2.0-src/hadoop-dist/target` directory.
```shell
$ ls ~/hadoop-2.2.0-src/hadoop-dist/target
antrun                    hadoop-2.2.0          hadoop-dist-2.2.0-javadoc.jar  test-dir
dist-layout-stitching.sh  hadoop-2.2.0.tar.gz   javadoc-bundle-options
dist-tar-stitching.sh     hadoop-dist-2.2.0.jar maven-archiver
```
Here, hadoop-2.2.0 is the compiled directory tree, and hadoop-2.2.0.tar.gz is the packaged distribution.
Verify
```shell
$ cd ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/
$ file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=fb43b4ebd092ae8b4a427719b8907e6fdb223ed9, not stripped
```
As the output shows, libhadoop.so.1.0.0 is now 64-bit.
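libhadoop.so.1.0.0 is not the only native library, so the same check can be looped over everything under lib/native. A sketch, assuming the build tree from this walkthrough:

```shell
# Run `file` on every shared object in the native directory and
# flag anything that is not a 64-bit ELF binary.
for lib in ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/lib/native/*.so*; do
    if file "$lib" | grep -q "ELF 64-bit"; then
        echo "64-bit:     $lib"
    else
        echo "NOT 64-bit: $lib"
    fi
done
```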
Copy
Copy the compiled 64-bit hadoop-2.2.0.tar.gz distribution package to the current user's home directory.
```shell
cp ~/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0.tar.gz ~
```