 

Change ssh default port in hadoop multi cluster [closed]

Tags: ssh, hadoop, hbase

My Hadoop multi-node cluster has 3 nodes: one namenode and two datanodes. I am using HBase for storing data. For some reasons I want to change the default SSH port number, which I know how to do, but if I change it, what configuration changes will I have to make in Hadoop and HBase?

I saw this link, but it only explains the configuration change for Hadoop. I think the configuration of HBase, ZooKeeper and YARN also needs to be changed. Am I right? If yes, what changes do I need to make in Hadoop and HBase?

Hadoop version 2.7.1

HBase version 1.0.1.1

Help Appreciated :)

Asked by Prashant Puri

1 Answer

SSH isn't a Hadoop-managed configuration, and therefore has nothing to do with Spark, HBase, ZooKeeper or YARN other than adding new nodes to the cluster and inter-process communication.

You'll have to edit /etc/ssh/sshd_config on every node to change any SSH-related settings, then restart sshd as well as all the Hadoop services.

The relevant line is

Port 22

Change the port number, then do

sudo service sshd restart
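For example, here is a minimal sketch assuming the new port is 2222 (a placeholder) and the default "Port 22" line in sshd_config. Test the new port from a second terminal before closing your current session, so a typo doesn't lock you out of the node:

sudo sed -i 's/^#\?Port 22$/Port 2222/' /etc/ssh/sshd_config   # set the new port (2222 is only an example)
sudo service sshd restart
ssh -p 2222 localhost hostname   # verify sshd is listening on the new port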

In hadoop-env.sh there is the HADOOP_SSH_OPTS environment variable. I'm not really sure what it does, but you are welcome to try setting a port like so:

export HADOOP_SSH_OPTS="-p <num>"

Also not sure about this one, but in hbase-env.sh

export HBASE_SSH_OPTS="-p <num>"
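As a concrete (hypothetical) example, if sshd were moved to port 2222 on every node, both variables would carry the same option:

# $HADOOP_HOME/etc/hadoop/hadoop-env.sh
export HADOOP_SSH_OPTS="-p 2222"

# $HBASE_HOME/conf/hbase-env.sh
export HBASE_SSH_OPTS="-p 2222"

As far as I can tell, these options are only consumed by the cluster start/stop helper scripts, which ssh into the worker nodes; the daemons themselves talk over their own RPC ports, not SSH.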

Once you're done setting all the configs, restart the Hadoop services:

stop-all.sh
start-all.sh
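Since HBase runs on top of HDFS, a hedged restart order (a sketch, assuming the stock scripts in $HBASE_HOME/bin and $HADOOP_HOME/sbin) might look like:

stop-hbase.sh      # stop HBase first, it depends on HDFS
stop-all.sh        # or stop-yarn.sh && stop-dfs.sh on Hadoop 2.x
# ... change Port in /etc/ssh/sshd_config and the *_SSH_OPTS variables on every node ...
start-all.sh       # or start-dfs.sh && start-yarn.sh
start-hbase.sh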
Answered by OneCricketeer