Steps to set up a Hadoop cluster

1. Install the JDK
2. Change the IP address
3. Disable the firewall
4. Change the hostname
5. Set up passwordless SSH login
6. Install Hadoop
-----------------------------------------------------------------------
1.1 Install the JDK

 
Upload jdk-6u24-linux-i586.bin to /home/:
#cd /home/
#chmod u+x jdk-6u24-linux-i586.bin
#./jdk-6u24-linux-i586.bin
#mv jdk1.6.0_24 jdk
#vi /etc/profile
export JAVA_HOME=/home/jdk 
export PATH=$JAVA_HOME/bin:$PATH
Save and exit.
#source /etc/profile
#java -version
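A quick sanity check that the new environment is in effect (a minimal sketch, assuming the /home/jdk path above):
#echo $JAVA_HOME
#which java
#java -version
The last command should report a 1.6.0_24 build.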
 

1.2 Change the IP address

    Edit ifcfg-eth0; a reference configuration follows:
    vim /etc/sysconfig/network-scripts/ifcfg-eth0

DEVICE="eth0"
BOOTPROTO="static"
ONBOOT="yes"
TYPE="Ethernet"
IPADDR=192.168.8.100
PREFIX=24
GATEWAY=192.168.8.1
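After saving the file, restart the network service so the new address takes effect, then verify it (a minimal sketch; eth0 matches the interface configured above):
#service network restart
#ifconfig eth0
ifconfig should now show 192.168.8.100 on eth0.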
1.3 Disable the firewall and change the hostname
 
 
Set the hostname for the current session:
#hostname <new-hostname>
Make the change permanent:
#vi /etc/sysconfig/network
HOSTNAME=<new-hostname>
Save and exit. Then add the hostnames and IP addresses of all cluster nodes to /etc/hosts (the IP configuration under /etc/sysconfig/network-scripts/ was already handled in 1.2).
Disable the firewall:
#service iptables stop
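service iptables stop only lasts until the next reboot; to disable the firewall permanently and confirm it is off (standard CentOS service tools):
#chkconfig iptables off
#service iptables status
For the /etc/hosts entries, a hypothetical layout for this cluster is shown below; only 192.168.8.100 comes from section 1.2, the addresses for hadoop01-hadoop03 are assumptions to adapt to your network:
192.168.8.100   hadoop00
192.168.8.101   hadoop01
192.168.8.102   hadoop02
192.168.8.103   hadoop03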


1.5 Set up passwordless SSH login
  While the cluster is running, Hadoop needs to manage daemons on remote machines. After Hadoop starts, the NameNode uses SSH (Secure Shell) to log in to each DataNode without a password in order to start and stop the daemons there; by the same principle, a DataNode can also use passwordless SSH to log in to the NameNode.

 
Configure passwordless login from the namenode to each datanode. On the namenode, generate an RSA key pair:
#ssh-keygen -t rsa
This creates ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub.
Still on the namenode, copy the public key to the datanode:
#cd ~/.ssh/
#scp id_rsa.pub root@<datanodeIP>:/home
Then, on the datanode, append it to the authorized keys:
#cd /home/
#cat id_rsa.pub >> /root/.ssh/authorized_keys
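To verify, the namenode should now be able to run a command on the datanode without a password prompt (hadoop01 is used here as an example datanode hostname):
#ssh hadoop01 hostname
If a password is still requested, check the permissions on the datanode, since sshd ignores key files that are too open:
#chmod 700 /root/.ssh
#chmod 600 /root/.ssh/authorized_keys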
 

1.6 Install Hadoop
    1.6.1 Install Hadoop on the namenode

 
Hadoop will be installed to /home/hadoop, which becomes HADOOP_HOME.
1. Upload hadoop-1.0.4.tar.gz and copy it to /home:
#cp hadoop-1.0.4.tar.gz /home
2. Extract the archive and rename the directory:
#cd /home
#tar -zxvf hadoop-1.0.4.tar.gz
#mv hadoop-1.0.4 hadoop
3. Edit /etc/profile:
#vi /etc/profile
export JAVA_HOME=/home/jdk
export HADOOP_HOME=/home/hadoop
export PATH=$JAVA_HOME/bin:$PATH:$HADOOP_HOME/bin
Save and exit.
#source /etc/profile
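A quick check that the Hadoop commands are now on the PATH (a minimal sketch, assuming the layout above):
#which hadoop
#hadoop version
The second command should report release 1.0.4.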
 1.6.3 Edit the Hadoop configuration files
 
1. conf/hadoop-env.sh
export JAVA_HOME=/home/jdk
export HADOOP_HEAPSIZE=1024
export HADOOP_PID_DIR=/home/hadoop/pids
Save and exit.
2. conf/core-site.xml (the properties below go inside the file's <configuration> element; a complete example is sketched after step 7):
<property>
  <name>fs.default.name</name>
  <value>hdfs://hadoop00:9000</value>
</property>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hadoop/tmp</value>
</property>
3. conf/hdfs-site.xml
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
4. conf/mapred-site.xml
<property>
  <name>mapred.job.tracker</name>
  <value>hadoop00:9001</value>
</property>
5. Configure the masters file
conf/masters
hadoop00
6. Configure the slaves file
conf/slaves
hadoop01
hadoop02
hadoop03
7. Copy Hadoop to the datanodes
Copy the hadoop and jdk directories, /etc/hosts, and /etc/profile from hadoop00 to hadoop01, hadoop02, and hadoop03. For the hadoop directory:
#cd $HADOOP_HOME/..
#scp -r hadoop hadoop01:/home
#scp -r hadoop hadoop02:/home
#scp -r hadoop hadoop03:/home
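For reference, this is what a complete core-site.xml looks like once the snippet from step 2 is in place; the property values are exactly those given above, only the XML declaration and the <configuration> root element, which every Hadoop configuration file needs, are added:
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoop00:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/tmp</value>
  </property>
</configuration>
hdfs-site.xml and mapred-site.xml are wrapped the same way.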
 

1.6.4 Start and stop the Hadoop cluster

 
Once the cluster is running, its status can be checked on hadoop00 with:
#hadoop dfsadmin -report
Before starting Hadoop for the first time, format the namenode:
#cd $HADOOP_HOME/bin
#hadoop namenode -format
Start Hadoop:
#cd $HADOOP_HOME/bin
#./start-all.sh
If HDFS stays in safe mode and operations throw a SafeModeException, leave safe mode manually:
#hadoop dfsadmin -safemode leave
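A quick way to confirm that all daemons came up is jps, which ships with the JDK and lists the running Java processes (the full path is used below so it also works over non-interactive ssh; jdk path as in section 1.1):
On hadoop00 you should see NameNode, SecondaryNameNode and JobTracker:
#jps
On each slave you should see DataNode and TaskTracker, for example:
#ssh hadoop01 /home/jdk/bin/jps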
To stop Hadoop:
#cd $HADOOP_HOME/bin
#./stop-all.sh
 