
A Detailed Walkthrough of Installing Hadoop

Download the Hadoop 1.2.1 tarball:

wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
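Most of the paths later in this article start with /opt/hadoop-1.2.1, so the walkthrough assumes the archive is unpacked under /opt. A minimal sketch of that step; the target directory is an assumption taken from those later paths:

tar -xzf hadoop-1.2.1.tar.gz     # unpack the downloaded archive
mv hadoop-1.2.1 /opt/            # move it to /opt so the later paths line up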


Install the JDK

http://www.linuxidc.com/Linux/2014-08/105906.htm 
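Once the JDK is installed, it is worth confirming that java is on the PATH and matches the version the logs later in this article show (1.7.0_60). A quick check, not part of the original write-up:

java -version       # should report a 1.7.x JDK
echo $JAVA_HOME     # should print the JDK install directory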

Install Hadoop

Go into the conf directory of your Hadoop install, for example:

/root/zby/hadoop/hadoop-1.2.1/conf

(later commands in this article use an install under /opt/hadoop-1.2.1; use whichever path you actually unpacked to)

Configuring Hadoop mainly means editing three files: core-site.xml, hdfs-site.xml and mapred-site.xml.

Including hadoop-env.sh, four files need to be edited.

For the first file, just set the JDK installation path:

hadoop-env.sh

export HADOOP_HEAPSIZE=256  # adjust the amount of memory Hadoop's JVMs use

#export JAVA_HOME=/usr/lib/jvm/jdk7   # uncomment this line and point it at your JDK

If you do not know the path, you can look it up with the following command:

[root@iZ28c21psoeZ conf]# echo $JAVA_HOME

/usr/lib/jvm/jdk7
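echo $JAVA_HOME only helps if the variable is already set; if it prints nothing, you can locate the JDK from the java binary itself. A small sketch (the jdk7 path is just this article's layout):

readlink -f $(which java)   # resolves symlinks, e.g. /usr/lib/jvm/jdk7/jre/bin/java
ls /usr/lib/jvm/            # list installed JDKs and pick the right one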

The second file: open it and replace its contents with the configuration below.

cd /opt/hadoop-1.2.1/conf 

vim core-site.xml

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/hadoop</value>
    </property>
    <property>
        <name>dfs.name.dir</name>
        <value>hadoop/name</value>
    </property>
</configuration>

The third file: same approach, drop in the block below.

vim hdfs-site.xml

<configuration>
    <property>
        <name>dfs.data.dir</name>
        <value>/hadoop/data</value>
    </property>
</configuration>
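dfs.data.dir points the DataNode at /hadoop/data. The daemons create it on first start, but if you create it by hand, Hadoop 1.x expects the directory permissions to be 755 or the DataNode will refuse to use it. An optional sketch:

mkdir -p /hadoop/data
chmod 755 /hadoop/data   # the DataNode checks dfs.data.dir permissions (expected rwxr-xr-x)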

The fourth file: again, drop in the block below.

vim mapred-site.xml

 

<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>ldy:9001</value>
    </property>
</configuration>
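ldy is the hostname this article uses for the JobTracker (the shell prompt later shows [root@ldy ...]). It has to resolve to the machine's own address, otherwise the daemons cannot reach it. If it does not resolve yet, a hosts entry is enough; the IP below is a placeholder for this machine's real address:

echo "192.168.1.10   ldy" >> /etc/hosts   # placeholder IP; use the machine's actual address
ping -c1 ldy                              # should now answer from this machine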

Next, /etc/profile also needs a small edit: vim /etc/profile

Append the following lines at the end; if the JDK-related lines were already added when you installed the JDK, you do not need to add them again.

export JAVA_HOME=/usr/lib/jvm/jdk7

export JRE_HOME=${JAVA_HOME}/jre

export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib

export PATH=${JAVA_HOME}/bin:$PATH

export HADOOP_HOME=/opt/hadoop-1.2.1

export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$PATH
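The edits to /etc/profile only take effect in new login shells; to pick them up in the current session and confirm the hadoop command is now on the PATH, something like this (a quick check, not part of the original write-up):

source /etc/profile     # reload the profile in the current shell
echo $HADOOP_HOME       # should print /opt/hadoop-1.2.1
hadoop version          # should report Hadoop 1.2.1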

Next, change into this directory:

/opt/hadoop-1.2.1/bin

and format the NameNode:

hadoop -namenode -format

If you run into an error like the following:

Warning: $HADOOP_HOME is deprecated.

/opt/hadoop-1.2.1/bin/hadoop: line 350: /usr/lib/jdk7/bin/java: No such file or directory

/opt/hadoop-1.2.1/bin/hadoop: line 434: /usr/lib/jdk7/bin/java: No such file or directory

/opt/hadoop-1.2.1/bin/hadoop: line 434: exec: /usr/lib/jdk7/bin/java: cannot execute: No such file or directory

The path in the error, /usr/lib/jdk7/bin/java, is missing the jvm directory, so check that the first file (hadoop-env.sh) really points JAVA_HOME at the right place:

[root@iZ28c21psoeZ conf]# echo $JAVA_HOME

/usr/lib/jvm/jdk7

Fix that and run it again; it fails once more:

[root@iZ28c21psoeZ bin]# hadoop -namenode -format

Warning: $HADOOP_HOME is deprecated.

Unrecognized option: -namenode

Error: Could not create the Java Virtual Machine.

Error: A fatal exception has occurred. Program will exit.

[root@iZ28c21psoeZ bin]#

The direct cause of "Unrecognized option: -namenode" is the command itself: the sub-command is namenode, with no leading dash, so the extra dash makes the script hand -namenode to the JVM as an option and java refuses to start. Beyond fixing the command, two places in the install are worth looking at:

The first (minor): /opt/hadoop/conf/hadoop-env.sh

Adjust the parameter: export HADOOP_HEAPSIZE=256   # heap size, in MB, for each Hadoop JVM; the default is 1000

The second (main one): this block in the /opt/hadoop/bin/hadoop script. When Hadoop is run as root it adds "-jvm server" to the JVM options; if java ever complains about an unrecognized -jvm option, changing that line to a plain "-server" is the usual workaround:
       ####################################################################
       if [[ $EUID -eq 0 ]]; then
           HADOOP_OPTS="$HADOOP_OPTS -jvm server $HADOOP_DATANODE_OPTS"
       else
           HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"
       fi

       ####################################################################


Clean things up and run the format again, this time with the correct syntax (namenode, no leading dash). It still fails, but differently:

[root@iZ28c21psoeZ bin]# ./hadoop namenode -format

Warning: $HADOOP_HOME is deprecated.


16/07/04 18:49:04 INFO namenode.NameNode: STARTUP_MSG:

/************************************************************

STARTUP_MSG: Starting NameNode

STARTUP_MSG:   host = iZ28c21psoeZ/10.251.57.77

STARTUP_MSG:   args = [-format]

STARTUP_MSG:   version = 1.2.1

STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013

STARTUP_MSG:   java = 1.7.0_60

************************************************************/

[Fatal Error] core-site.xml:11:3: The element type "property" must be terminated by the matching end-tag "</property>".

16/07/04 18:49:04 FATAL conf.Configuration: error parsing conf file: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".

16/07/04 18:49:04 ERROR namenode.NameNode: java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".

        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1249)

        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)

        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)

        at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)

        at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)

        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)

        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)

Caused by: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".

        at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)

        at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:347)

        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:177)

        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1156)

        ... 6 more


16/07/04 18:49:04 INFO namenode.NameNode: SHUTDOWN_MSG:

/************************************************************

SHUTDOWN_MSG: Shutting down NameNode at iZ28c21psoeZ/10.251.57.77

************************************************************/

[root@iZ28c21psoeZ bin]#

 

The log points to an error in one of the three main config files, at core-site.xml line 11. Sure enough, one of the <property> end tags in core-site.xml had been mistyped, so the element was never closed. Fix the tag and run the format once more:

[root@iZ28c21psoeZ bin]# ./hadoop namenode -format

Warning: $HADOOP_HOME is deprecated.

16/07/04 18:55:26 INFO namenode.NameNode: STARTUP_MSG:

/************************************************************

STARTUP_MSG: Starting NameNode

STARTUP_MSG:   host = iZ28c21psoeZ/10.251.57.77

STARTUP_MSG:   args = [-format]

STARTUP_MSG:   version = 1.2.1

STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013

STARTUP_MSG:   java = 1.7.0_60

************************************************************/

16/07/04 18:55:27 INFO util.GSet: Computing capacity for map BlocksMap

16/07/04 18:55:27 INFO util.GSet: VM type       = 64-bit

16/07/04 18:55:27 INFO util.GSet: 2.0% max memory = 259522560

16/07/04 18:55:27 INFO util.GSet: capacity      = 2^19 = 524288 entries

16/07/04 18:55:27 INFO util.GSet: recommended=524288, actual=524288

16/07/04 18:55:32 INFO namenode.FSNamesystem: fsOwner=root

16/07/04 18:55:33 INFO namenode.FSNamesystem: supergroup=supergroup

16/07/04 18:55:33 INFO namenode.FSNamesystem: isPermissionEnabled=true

16/07/04 18:55:42 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100

16/07/04 18:55:42 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)

16/07/04 18:55:42 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = 0

16/07/04 18:55:42 INFO namenode.NameNode: Caching file names occuring more than 10 times

16/07/04 18:55:45 INFO common.Storage: Image file /hadoop/dfs/name/current/fsimage of size 110 bytes saved in 0 seconds.

16/07/04 18:55:47 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/hadoop/dfs/name/current/edits

16/07/04 18:55:47 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/dfs/name/current/edits

16/07/04 18:55:48 INFO common.Storage: Storage directory /hadoop/dfs/name has been successfully formatted.

16/07/04 18:55:48 INFO namenode.NameNode: SHUTDOWN_MSG:

/************************************************************

SHUTDOWN_MSG: Shutting down NameNode at iZ28c21psoeZ/10.251.57.77

************************************************************/
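If you want to double-check the format before starting anything, the freshly created name directory from the log above should now hold the image and edit log files:

ls /hadoop/dfs/name/current
# expected to contain: fsimage  edits  fstime  VERSION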

Perfect. Next step:

cd /opt/hadoop-1.2.1/bin

[root@iZ28c21psoeZ bin]# start-all.sh

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-namenode-iZ28c21psoeZ.out

localhost: socket: Address family not supported by protocol

localhost: ssh: connect to host localhost port 22: Address family not supported by protocol

localhost: socket: Address family not supported by protocol

localhost: ssh: connect to host localhost port 22: Address family not supported by protocol

starting jobtracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-jobtracker-iZ28c21psoeZ.out

localhost: socket: Address family not supported by protocol

localhost: ssh: connect to host localhost port 22: Address family not supported by protocol

[root@iZ28c21psoeZ bin]#

To read the output: the NameNode and JobTracker are started locally and log to the .out files shown, but every attempt by start-all.sh to ssh into localhost on port 22 (which it needs in order to start the DataNode, SecondaryNameNode and TaskTracker) fails with "socket: Address family not supported by protocol".

One more change is needed. As the log shows, the ssh port is wrong: this server's sshd does not listen on the default port 22, so Hadoop's ssh calls have to use the server's actual ssh port.

Add one line to conf/hadoop-env.sh:  export HADOOP_SSH_OPTS="-p 1234"
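start-all.sh drives every daemon over ssh, so besides the port, root also needs passwordless ssh to the local machine or each start/stop will prompt for a password. A small sketch of both checks; the port 1234 is just this article's example:

grep -i '^Port' /etc/ssh/sshd_config        # if nothing prints, sshd is on the default port 22
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa    # generate a key pair (skip if one already exists)
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
ssh -p 1234 localhost 'hostname'            # should log in without asking for a password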

Run it once more:

[root@ldy bin]# sh start-all.sh

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-namenode-ldy.out

localhost: starting datanode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-datanode-ldy.out

localhost: starting secondarynamenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-secondarynamenode-ldy.out

starting jobtracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-jobtracker-ldy.out

localhost: starting tasktracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-tasktracker-ldy.out

[root@ldy bin]# jps

27054 DataNode

26946 NameNode

27374 TaskTracker

27430 Jps

27250 JobTracker

27165 SecondaryNameNode

OK, all five daemons are up (NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker, plus jps itself in the listing). The single-node install is working.
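A couple of quick ways to confirm the cluster really works beyond jps; the directory name below is just an example:

hadoop fs -mkdir /test      # create a directory in HDFS
hadoop fs -ls /             # list the HDFS root; /test should appear
# the NameNode web UI listens on port 50070 and the JobTracker UI on 50030:
curl -s http://localhost:50070/ | head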

