I have installed Hadoop and SSH on my laptop. "ssh localhost" works fine. After formatting HDFS, I tried to start Hadoop.
munichong@GrindPad:~$ sudo /usr/sbin/start-all.sh
starting namenode, logging to /var/log/hadoop/root/hadoop-root-namenode-GrindPad.out
root@localhost's password:
root@localhost's password: localhost: Permission denied, please try again.
localhost: Permission denied (publickey,password).
It asks for a password. My user is "munichong", but munichong's password does not work here, because my role has changed to "root". I do not know whether I missed something.
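Assuming a standard sudo setup, this happens because sudo runs the script as root, so the start scripts try to ssh as root@localhost. A quick check illustrates the role change:
$ whoami
munichong
$ sudo whoami
root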
Can anyone help me?
Thanks!
Solution:
1) Generate ssh key without password
$ ssh-keygen -t rsa -P ""
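If ssh-keygen still prompts for a file location, the default key path can be given explicitly with the standard -f flag (the path below assumes the OpenSSH default):
$ ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa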
2) Copy id_rsa.pub to authorized_keys
$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
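If ssh localhost still asks for a password after this, sshd's StrictModes check may be rejecting loose permissions. A minimal fix, assuming a default OpenSSH setup:
$ chmod 700 ~/.ssh
$ chmod 600 ~/.ssh/authorized_keys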
3) Start ssh localhost
$ ssh localhost
4) now go to the hadoop sbin directory and start hadoop
$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-namenode-amtex-desktop.out
localhost: starting datanode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-datanode-amtex-desktop.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-secondarynamenode-amtex-desktop.out
starting yarn daemons
starting resourcemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-resourcemanager-amtex-desktop.out
localhost: starting nodemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-nodemanager-amtex-desktop.out
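Since start-all.sh is deprecated (as the output above says), the equivalent explicit sequence from the same sbin directory would be:
$ ./start-dfs.sh
$ ./start-yarn.sh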
5) No password is requested now; verify that the daemons are running:
$ jps
12373 Jps
11823 SecondaryNameNode
11643 DataNode
12278 NodeManager
11974 ResourceManager
11499 NameNode
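To shut everything down again, the matching stop scripts in the same sbin directory should work (a minimal sketch):
$ ./stop-yarn.sh
$ ./stop-dfs.sh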
As in the above case, munichong is the user (munichong@GrindPad).
In my case, I log in as hduser.
First, remove the ~/.ssh directory:
[hduser@localhost ~]$ sudo rm -rf ~/.ssh
Then re-generate the ~/.ssh directory with the default settings:
[hduser@localhost ~]$ ssh-keygen
Now append the content of id_rsa.pub to the authorized_keys file created by the command above:
[hduser@localhost ~]$ sudo cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
[hduser@localhost ~]$ chmod -R 750 ~/.ssh/authorized_keys
[hduser@localhost ~]$ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
RSA key fingerprint is 04:e8:80:64:dc:71:b5:2f:c0:d9:28:86:1f:61:60:8a.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
Last login: Mon Jan 4 14:31:05 2016 from localhost.localdomain
[hduser@localhost ~]$ jps
18531 Jps
[hduser@localhost ~]$ start-all.sh
All daemons start
Note: Sometimes other problems occur because of the log files; in that case, remove only the .out files from /usr/local/hadoop/logs/.
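A minimal sketch of that cleanup, assuming the logs directory from the note above:
[hduser@localhost ~]$ rm /usr/local/hadoop/logs/*.out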
I ran into the same problem. As Amar said, if you run with sudo, Hadoop will ask for the root password. If you don't have a root password, you can set one using:
sudo passwd
The URL below gives more detail about user management:
https://help.ubuntu.com/12.04/serverguide/user-management.html
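Alternatively, instead of setting a root password, you can stay as your own user and give that user passwordless SSH to localhost (as in the answers above), then run the scripts without sudo. A sketch, assuming OpenSSH's ssh-copy-id is installed and your user can execute the scripts:
$ ssh-copy-id munichong@localhost
$ /usr/sbin/start-all.sh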