Hadoop: requires root's password after entering "start-all.sh"

I have installed Hadoop and SSH on my laptop. "ssh localhost" works fine. After formatting HDFS, I tried to start Hadoop.

munichong@GrindPad:~$ sudo /usr/sbin/start-all.sh
starting namenode, logging to /var/log/hadoop/root/hadoop-root-namenode-GrindPad.out
root@localhost's password: 
root@localhost's password: localhost: Permission denied, please try again.

localhost: Permission denied (publickey,password).

It asks for a password. My username is "munichong", but munichong's password does not work here; the prompt has changed to "root". I do not know whether I missed something.

Can anyone help me?

Thanks!

asked Mar 04 '13 by Munichong

3 Answers

Solution:

1) Generate an SSH key with an empty passphrase

$ ssh-keygen -t rsa -P ""

2) Copy id_rsa.pub to authorized_keys

$  cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys

3) Start ssh localhost

$ ssh localhost

4) Now go to the Hadoop sbin directory and start Hadoop

$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-namenode-amtex-desktop.out
localhost: starting datanode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-datanode-amtex-desktop.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-secondarynamenode-amtex-desktop.out
starting yarn daemons
starting resourcemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-resourcemanager-amtex-desktop.out
localhost: starting nodemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-nodemanager-amtex-desktop.out

5) No password is asked now; verify the daemons with jps

$ jps 
12373 Jps
11823 SecondaryNameNode
11643 DataNode
12278 NodeManager
11974 ResourceManager
11499 NameNode
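The key-copy in step 2 only works if sshd also accepts the file, and sshd silently rejects keys when `~/.ssh` or `authorized_keys` is too permissive. This is a sketch of what steps 1-2 set up, shown on a scratch directory so it is safe to run anywhere; on a real machine the directory is `$HOME/.ssh`, and the echoed key line is a placeholder for the real output of `ssh-keygen`:

```shell
DIR=$(mktemp -d)   # stand-in for $HOME/.ssh
# placeholder for the real public key produced by ssh-keygen:
echo "ssh-rsa AAAAB3Nza...example munichong@GrindPad" >> "$DIR/authorized_keys"
chmod 700 "$DIR"                     # sshd rejects group/world-writable dirs
chmod 600 "$DIR/authorized_keys"     # the key file must be private as well
stat -c '%a' "$DIR/authorized_keys"  # prints 600
rm -rf "$DIR"
```

If `ssh localhost` still prompts for a password after step 2, these permissions are the first thing to check.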
answered Sep 30 '22 by KARTHIKEYAN.A


In the case above, munichong is the user (munichong@GrindPad).

  1. In my case: log in as hduser

  2. First, remove the directory: sudo rm -rf ~/.ssh

  3. Re-generate the ~/.ssh directory with default settings:

    [hduser@localhost ~]$ ssh-keygen
    
  4. Copy the content of id_rsa.pub into the authorized_keys file created by the command above:

    [hduser@localhost ~]$ sudo cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    
  5. [hduser@localhost ~]$ chmod -R 750 ~/.ssh/authorized_keys

  6. [hduser@localhost ~]$ ssh localhost

    The authenticity of host 'localhost (127.0.0.1)' can't be established. RSA key fingerprint is 04:e8:80:64:dc:71:b5:2f:c0:d9:28:86:1f:61:60:8a. Are you sure you want to continue connecting (yes/no)? yes

    Warning: Permanently added 'localhost' (RSA) to the list of known hosts. Last login: Mon Jan 4 14:31:05 2016 from localhost.localdomain

  7. [hduser@localhost ~]$ jps
    18531 Jps

  8. [hduser@localhost ~]$ start-all.sh

  9. All daemons start

Note: Sometimes other problems occur because of stale log files; in that case, remove only the .out files from /usr/local/hadoop/logs/.
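The note above can be sketched as follows, using a scratch directory as a stand-in for /usr/local/hadoop/logs so it is safe to run anywhere (the file names are illustrative):

```shell
LOGDIR=$(mktemp -d)   # stand-in for /usr/local/hadoop/logs
touch "$LOGDIR/hadoop-hduser-namenode.out" "$LOGDIR/hadoop-hduser-namenode.log"
# delete only the .out files, keeping the .log history:
find "$LOGDIR" -maxdepth 1 -name '*.out' -type f -delete
ls "$LOGDIR"   # only hadoop-hduser-namenode.log remains
rm -rf "$LOGDIR"
```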

answered Sep 30 '22 by Nishant Shrivastava


I ran into the same problem. As Amar said, if you run it with sudo, Hadoop will ask for the root password. If you do not have a root password, you can set one with

 sudo passwd

The URL below gives more detail about user management:

https://help.ubuntu.com/12.04/serverguide/user-management.html
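Alternatively, you can avoid the root prompt entirely: the start scripts ssh to localhost as whoever invokes them, so running them via sudo is what makes them try root@localhost. A sketch of that approach, assuming a hypothetical install under /usr/local/hadoop owned by root (adjust paths and the username to your setup):

```shell
# Hand the install back to the normal user so sudo is unnecessary:
sudo chown -R munichong:munichong /usr/local/hadoop
# Started without sudo, the scripts ssh as munichong@localhost,
# which the passwordless key from the accepted answer covers:
/usr/local/hadoop/sbin/start-dfs.sh
/usr/local/hadoop/sbin/start-yarn.sh
```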

answered Sep 30 '22 by javamak