
Installing HBase 0.98, Part One: Compiling Hadoop 2.2.0 for x64

1. Install the JDK

Download and install JDK 7u55.
Both the JDK and the JRE are required; the JDK ships tools.jar, which the build needs.
Install it under /java.
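A minimal sketch of the JDK install, assuming the Oracle tarball is named jdk-7u55-linux-x64.tar.gz (the file name is an assumption; adjust to whatever you downloaded) and the /java target directory from above:

mkdir -p /java
# extract directly into /java so that /java/bin/java exists, matching JAVA_HOME=/java in step 5
tar -zxvf jdk-7u55-linux-x64.tar.gz -C /java --strip-components=1
# confirm the JVM runs
/java/bin/java -version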

2. Download the Hadoop 2.2.0 source code

wget http://apache.dataguru.cn/hadoop/common/stable/hadoop-2.2.0-src.tar.gz

Extract it:

tar -zxvf hadoop-2.2.0-src.tar.gz -C /tmp

3、安装依赖包

yum -y install lzo-devel zlib-devel gcc autoconf automake libtool gcc-c++
openssl-devel openssl-devel

4. Install the build tools

Compile and install Protobuf:

tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protobuf
make
make install
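Because Protobuf was installed under a non-default prefix, protoc is not on the PATH yet (that happens in step 5). A quick optional sanity check, assuming the prefix above:

/usr/local/protobuf/bin/protoc --version
# expected output: libprotoc 2.5.0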

Install Ant:

tar -zxvf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /usr/local/ant

Install Maven:

tar -zxvf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /usr/local/maven

Install Findbugs:

tar -zxvf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /usr/local/findbugs

Compile and install cmake:

tar -zxvf cmake-2.8.8.tar.gz
cd cmake-2.8.8
./bootstrap
make
make install
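cmake installs into /usr/local by default, which is normally already on the PATH, so a quick check should work right away:

cmake --version
# expected output: cmake version 2.8.8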

5. Configure environment variables

vim /etc/profile

#java
export JAVA_HOME=/java
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin

#maven
export MAVEN_HOME=/usr/local/maven
export MAVEN_OPTS="-Xms256m -Xmx512m"
export CLASSPATH=.:$CLASSPATH:$MAVEN_HOME/lib
export PATH=$PATH:$MAVEN_HOME/bin

#protobuf
export PROTOBUF_HOME=/usr/local/protobuf
export CLASSPATH=.:$CLASSPATH:$PROTOBUF_HOME/lib
export PATH=$PATH:$PROTOBUF_HOME/bin

#ant
export ANT_HOME=/usr/local/ant
export CLASSPATH=.:$CLASSPATH:$ANT_HOME/lib
export PATH=$PATH:$ANT_HOME/bin

#findbugs
export FINDBUGS_HOME=/usr/local/findbugs
export CLASSPATH=.:$CLASSPATH:$FINDBUGS_HOME/lib
export PATH=$PATH:$FINDBUGS_HOME/bin

Reload the profile so the changes take effect immediately:

source /etc/profile
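After sourcing the profile, it is worth confirming that each tool resolves from the PATH. An optional verification (the versions shown are the ones used in this guide):

java -version       # java version "1.7.0_55"
mvn -version        # Apache Maven 3.0.5
ant -version        # Apache Ant(TM) version 1.9.2
protoc --version    # libprotoc 2.5.0
cmake --version     # cmake version 2.8.8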

6. Patch a dependency issue

Without the test-scoped Jetty dependencies below, the hadoop-auth module fails to compile its tests, so edit its POM first:

vim hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/pom.xml

In the dependencies section, add:

<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>

7. Compile

cd hadoop-2.2.0-src

mvn clean package -Pdist,native -DskipTests -Dtar

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [10.796s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [8.171s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [18.306s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [1.704s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [8.222s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [17.120s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [15.952s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [12.085s]
[INFO] Apache Hadoop Common .............................. SUCCESS [4:57.617s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [25.393s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.231s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [5:51.635s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:27.220s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [59.011s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [11.979s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.195s]
[INFO] hadoop-yarn ....................................... SUCCESS [1:41.292s]
[INFO] hadoop-yarn-api ................................... SUCCESS [1:53.028s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:47.889s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.712s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [38.517s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [53.352s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [13.733s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [49.935s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [3.230s]
[INFO] hadoop-yarn-client ................................ SUCCESS [23.036s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.690s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [7.623s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.581s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:26.644s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [8.783s]
[INFO] hadoop-yarn-site .................................. SUCCESS [1.217s]
[INFO] hadoop-yarn-project ............................... SUCCESS [30.587s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:19.185s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [17.693s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [41.929s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [18.209s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [24.663s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [7.631s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [22.663s]
[INFO] hadoop-mapreduce .................................. SUCCESS [10.093s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [19.489s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [51.046s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [7.621s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [20.543s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [15.156s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.968s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [9.504s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [15.708s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [5.261s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.268s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:15.418s]
[INFO] Apache Hadoop Client .............................. SUCCESS [29.025s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.735s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:15.365s
[INFO] Finished at: Fri May 16 16:15:37 CST 2014
[INFO] Final Memory: 101M/385M
[INFO] ------------------------------------------------------------------------

When the build finishes, a tarball is produced in:

hadoop-2.2.0-src/hadoop-dist/target/

named:

hadoop-2.2.0.tar.gz

This is the final, deployable Hadoop package.
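Optionally, you can confirm that the package really contains 64-bit native libraries before deploying it. A sketch, assuming the output path above (the exact library file names may differ slightly):

cd hadoop-2.2.0-src/hadoop-dist/target
tar -zxvf hadoop-2.2.0.tar.gz
# the native libraries should be reported as 64-bit ELF objects
file hadoop-2.2.0/lib/native/*
# the built Hadoop should report its version
hadoop-2.2.0/bin/hadoop version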

The build may fail partway through for a number of reasons, most often because a dependency could not be downloaded from the remote repositories.

If that happens, simply rerun:

mvn clean package -Pdist,native -DskipTests -Dtar

a few times until it succeeds.
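To avoid rebuilding the modules that already succeeded, Maven can also resume from the module that failed. For example, if hadoop-hdfs were the failing module (the module name here is just an illustration; use the one reported in your reactor summary):

mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-hdfs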
