Hadoop source code compilation

Posted by henryblake1979 on Tue, 30 Jun 2020 10:51:58 +0200

No one is obliged to help you; in the end you have to do everything yourself.

Hadoop source code compilation

preparation

(1) CentOS networking

Configure CentOS so it can reach the Internet, and verify that the Linux virtual machine can ping external hosts.
 Note: compile as the root user to avoid folder-permission problems.

(2) Package preparation (Hadoop source code, JDK 8, Maven, Ant, protobuf)

(1) hadoop-2.7.2-src.tar.gz
(2) jdk-8u144-linux-x64.tar.gz
(3) apache-ant-1.9.9-bin.tar.gz (build tool, used for packaging)
(4) apache-maven-3.0.5-bin.tar.gz
(5) protobuf-2.5.0.tar.gz (serialization framework)

Compiler installation

(1) Install JDK

[root@hadoop101 software]# tar -zxf jdk-8u144-linux-x64.tar.gz -C /opt/module/

[root@hadoop101 software]# vi /etc/profile

#JAVA_HOME

export JAVA_HOME=/opt/module/jdk1.8.0_144

export PATH=$PATH:$JAVA_HOME/bin

[root@hadoop101 software]# source /etc/profile

Verification command: java -version
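Editing /etc/profile by hand works, but the same two export lines can also be appended non-interactively. A minimal sketch (writing to a demo file here rather than the real /etc/profile, so it is safe to try anywhere):

```shell
# Append the JAVA_HOME exports without opening an editor.
# For illustration this targets ./profile.demo; on the real
# machine you would target /etc/profile and then `source` it.
profile=./profile.demo
cat >> "$profile" <<'EOF'
#JAVA_HOME
export JAVA_HOME=/opt/module/jdk1.8.0_144
export PATH=$PATH:$JAVA_HOME/bin
EOF
cat "$profile"
```

The quoted heredoc delimiter (`'EOF'`) keeps `$PATH` and `$JAVA_HOME` from being expanded at write time, so the file receives the literal export lines.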

(2) Maven decompress and configure MAVEN_HOME and PATH

[root@hadoop101 software]# tar -zxvf apache-maven-3.0.5-bin.tar.gz -C /opt/module/

[root@hadoop101 apache-maven-3.0.5]# vi conf/settings.xml

<mirrors>
  <!-- mirror
   | Specifies a repository mirror site to use instead of a given repository. The repository that
   | this mirror serves has an ID that matches the mirrorOf element of this mirror. IDs are used
   | for inheritance and direct lookup purposes, and must be unique across the set of mirrors.
   |
  <mirror>
    <id>mirrorId</id>
    <mirrorOf>repositoryId</mirrorOf>
    <name>Human Readable Name for this Mirror.</name>
    <url>http://my.repository.com/repo/path</url>
  </mirror>
   -->
  <mirror>
    <id>nexus-aliyun</id>
    <mirrorOf>central</mirrorOf>
    <name>Nexus aliyun</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public</url>
  </mirror>
</mirrors>


[root@hadoop101 apache-maven-3.0.5]# vi /etc/profile

#MAVEN_HOME

export MAVEN_HOME=/opt/module/apache-maven-3.0.5

export PATH=$PATH:$MAVEN_HOME/bin


[root@hadoop101 software]# source /etc/profile

Verification command: mvn -version

(3) Configure ANT

[root@hadoop101 software]# tar -zxvf apache-ant-1.9.9-bin.tar.gz -C /opt/module/
[root@hadoop101 apache-ant-1.9.9]# vi /etc/profile

#ANT_HOME

export ANT_HOME=/opt/module/apache-ant-1.9.9

export PATH=$PATH:$ANT_HOME/bin


[root@hadoop101 software]# source /etc/profile
Verification command: ant -version

(4) Installing glibc headers and g++

[root@hadoop101 apache-ant-1.9.9]# yum install glibc-headers

[root@hadoop101 apache-ant-1.9.9]# yum install gcc-c++

(5) Install make and cmake

[root@hadoop101 apache-ant-1.9.9]# yum install make

[root@hadoop101 apache-ant-1.9.9]# yum install cmake

(6) Install protobuf

[root@hadoop101 software]# tar -zxvf protobuf-2.5.0.tar.gz -C /opt/module/

[root@hadoop101 opt]# cd /opt/module/protobuf-2.5.0/

[root@hadoop101 protobuf-2.5.0]# ./configure

[root@hadoop101 protobuf-2.5.0]# make 

[root@hadoop101 protobuf-2.5.0]# make check 

[root@hadoop101 protobuf-2.5.0]# make install 

[root@hadoop101 protobuf-2.5.0]# ldconfig 

[root@hadoop101 hadoop-dist]# vi /etc/profile

#LD_LIBRARY_PATH

export LD_LIBRARY_PATH=/opt/module/protobuf-2.5.0

export PATH=$PATH:$LD_LIBRARY_PATH

 
[root@hadoop101 software]# source /etc/profile

Verification command: protoc --version

(7) Install openssl Library

[root@hadoop101 software]# yum install openssl-devel

(8) Install the ncurses devel Library

[root@hadoop101 software]# yum install ncurses-devel
At this point, the build-environment setup is essentially complete.
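Before moving on to the actual build, it can be handy to confirm in one pass that every tool installed above resolves on the PATH. A small sketch:

```shell
# Report which build prerequisites are resolvable on PATH.
checked=0
for tool in java mvn ant gcc g++ make cmake protoc; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: MISSING"
  fi
  checked=$((checked + 1))
done
echo "checked $checked tools"
```

Any line showing MISSING points at a step above that needs to be redone (or a `source /etc/profile` that was skipped).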

Compile source code

(1) Unzip the source code to the /opt/ directory

[root@hadoop101 software]# tar -zxvf hadoop-2.7.2-src.tar.gz -C /opt/

(2) Enter the hadoop source code main directory

[root@hadoop101 hadoop-2.7.2-src]# pwd

/opt/hadoop-2.7.2-src

(3) Executing the compile command through maven

[root@hadoop101 hadoop-2.7.2-src]# mvn package -Pdist,native -DskipTests -Dtar

Compilation takes roughly 30 minutes; when it finishes, every module in the build summary should report SUCCESS.
# The resulting 64-bit Hadoop package is under /opt/hadoop-2.7.2-src/hadoop-dist/target
[root@hadoop101 target]# pwd
/opt/hadoop-2.7.2-src/hadoop-dist/target

(4) Common problems and solutions in the process of compiling source code

(1) JVM memory overflow during mvn install

Solution: increase the heap size of the JVM that Maven runs in via MAVEN_OPTS, either in the environment configuration file (e.g. /etc/profile) or in the Maven startup script. (For details, see Maven JVM-tuning write-ups such as: http://outofmemory.cn/code-snippet/12652/maven-outofmemoryerror-method )
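For example, the heap can be raised through the MAVEN_OPTS environment variable. The 512 MB / 1024 MB values below are just a plausible starting point, not a recommendation from the original build docs; tune them to your machine:

```shell
# Give Maven a larger JVM heap before re-running the build.
# -Xms: initial heap size; -Xmx: maximum heap size.
export MAVEN_OPTS="-Xms512m -Xmx1024m"
echo "MAVEN_OPTS=$MAVEN_OPTS"
```

Putting the export in /etc/profile makes it survive new shell sessions; setting it inline only affects the current one.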

(2) Maven reports errors during compilation. Network congestion can leave dependency downloads incomplete; re-run the command several times (it rarely succeeds on the first attempt):
[root@hadoop101 hadoop-2.7.2-src]# mvn package -Pdist,native -DskipTests -Dtar
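Re-running by hand works, but the retries can also be scripted. A sketch of a generic retry helper, demonstrated here on a toy stand-in command so it is runnable anywhere (in practice you would pass it the mvn command above):

```shell
# retry MAX CMD...: run CMD until it succeeds, at most MAX attempts.
retry() {
  max=$1; shift
  n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "failed after $n attempts" >&2
      return 1
    fi
    echo "attempt $n failed, retrying..."
    n=$((n + 1))
  done
}

# Toy stand-in for the Maven build: fails twice, then succeeds.
tries=0
flaky() {
  tries=$((tries + 1))
  [ "$tries" -ge 3 ]
}
retry 5 flaky && echo "succeeded after $tries attempts"

# Real usage would be:
#   retry 5 mvn package -Pdist,native -DskipTests -Dtar
```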

(3) Errors mentioning ant, protobuf, and the like usually mean a plugin download was incomplete or there is a plugin version mismatch. For the trickier cases, the issue summary post for version 2.7.0 is recommended: http://www.tuicool.com/articles/IBn63qf

Related information

The companion materials for this article are on GitHub: https://github.com/zhutiansama/FocusBigData

Companion official account: FocusBigData

Reply there to get big-data interview experiences and a big-data learning roadmap.

Topics: Java Hadoop Maven Apache yum