Let's install Hadoop and its components, such as the NameNode, DataNode, MapReduce, Secondary NameNode, and so on, using the yum
command available in RHEL distributions:
sudo yum clean all
sudo yum install hadoop-hdfs-namenode
sudo yum install hadoop-hdfs-secondarynamenode
sudo yum install hadoop-0.20-mapreduce-tasktracker hadoop-hdfs-datanode
sudo yum install hbase
You can verify the installed packages using the following command:
rpm -qa <hadoop/hbase>
You can start and stop the processes using the following commands:
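For instance, one common way to list only the Hadoop- and HBase-related packages is to filter the full rpm query through grep (a sketch; the package names shown depend on the distribution's repository):

```shell
# List every installed package, keeping only lines that mention
# hadoop or hbase. Requires an RPM-based system with the packages installed.
rpm -qa | grep -e hadoop -e hbase
```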
/usr/lib/hadoop/bin/hadoop-daemon.sh <start/stop> <daemon name>
/usr/lib/hbase/bin/hbase-daemon.sh <start/stop> <daemon name>
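For example, substituting a concrete daemon name, you could start the NameNode and later stop it like this (a sketch; exact daemon names can vary by Hadoop release):

```shell
# Start the HDFS NameNode daemon on this node.
/usr/lib/hadoop/bin/hadoop-daemon.sh start namenode

# Stop it again when needed.
/usr/lib/hadoop/bin/hadoop-daemon.sh stop namenode
```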
If we use start-all.sh, start-dfs.sh, start-mapred.sh, or start-yarn.sh to start the Hadoop/HBase cluster, the script takes care of the sequence in which the processes are started and stopped. However, if we use hadoop-daemon.sh or hbase-daemon.sh to start the Hadoop/HBase processes individually, we should follow the correct sequence ourselves: