 

Hadoop: start-dfs.sh permission denied

Tags:

hadoop

I am installing Hadoop on my laptop. SSH works fine, but I cannot start Hadoop.

munichong@GrindPad:~$ ssh localhost
Welcome to Ubuntu 12.10 (GNU/Linux 3.5.0-25-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

0 packages can be updated.
0 updates are security updates.

Last login: Mon Mar  4 00:01:36 2013 from localhost

munichong@GrindPad:~$ /usr/sbin/start-dfs.sh
chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
starting namenode, logging to /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-namenode.pid: Permission denied
usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out: Permission denied
head: cannot open `/var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting datanode, logging to /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-datanode.pid: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting secondarynamenode, logging to /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-secondarynamenode.pid: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out' for reading: No such file or directory

munichong@GrindPad:~$ sudo /usr/sbin/start-dfs.sh
[sudo] password for munichong: 
starting namenode, logging to /var/log/hadoop/root/hadoop-root-namenode-GrindPad.out
localhost: Permission denied (publickey,password).
localhost: Permission denied (publickey,password).

I used "sudo", but permission is still denied.

Can anyone help me?

Thanks in advance!

Munichong asked Mar 04 '13


5 Answers

I was stuck on the same issue for the last couple of hours but finally solved it. The Hadoop installation was extracted by the same user I use to run Hadoop, so user privileges were not the issue.
My configuration is: an Ubuntu Linux machine on Google Cloud.

The Hadoop installation is under /home/, the Hadoop data directory is /var/lib/hadoop, and the directory access bits are 777, so anybody can access them. I SSHed into the remote machine, made changes to the config files, and executed start-dfs.sh; it then gave me "Permission denied (publickey)". So here is the solution (the same steps are consolidated into a command sketch after the list). In the same SSH terminal:

  1. ssh-keygen

  2. It will ask for the folder location where it will copy the keys; I entered /home/hadoop/.ssh/id_rsa.

  3. It will ask for a passphrase; keep it empty for simplicity.

  4. cat /home/hadoop/.ssh/id_rsa.pub >> .ssh/authorized_keys (to copy the newly generated public key to the authorized_keys file in your user's home/.ssh directory)

  5. ssh localhost

  6. start-dfs.sh (Now it should work!)
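
A minimal consolidated sketch of the steps above, assuming the user's home directory is /home/hadoop (adjust the paths to your own home directory):

# generate an RSA key pair with an empty passphrase, non-interactively
ssh-keygen -t rsa -N "" -f /home/hadoop/.ssh/id_rsa
# append the new public key to authorized_keys and restrict its permissions
cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys
chmod 600 /home/hadoop/.ssh/authorized_keys
# verify passwordless login, then start HDFS
ssh localhost
start-dfs.sh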

Nealesh answered Nov 13 '22


I faced the same problem. When I tried to connect over SSH I got a message like "not found", so I went to the .ssh directory to debug with the following steps:

cd ~/.ssh

ssh-keygen -t rsa -P ""

cat id_rsa.pub >> authorized_keys

... then it worked ...
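
If passwordless login still fails after this, overly loose permissions are another common cause: sshd ignores authorized_keys when ~/.ssh or the key file is writable by others. A quick fix worth trying (not part of the original answer):

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys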

arunava maiti answered Nov 13 '22


Try changing the ownership of the folder /var/log/hadoop/root to the user munichong. As on all systems, the logs directory needs to be writable by the user running Hadoop, so that user must have permission to modify the log folder and its contents.

sudo will not work in this case, because the user needs permission to change the folder contents even after the script finishes its work, i.e. while the Hadoop services keep running in the background.
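
A minimal sketch of that ownership change, assuming the log and PID directories from the error output above and a group named after the user (adjust the user, group, and paths to your setup):

sudo chown -R munichong:munichong /var/log/hadoop /var/run/hadoop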

Milind Jindal answered Nov 13 '22


You are trying to SSH to your own machine (localhost) but are missing the authorized_keys file, which allows login.

This file in SSH specifies the SSH keys that can be used for logging into the user account for which the file is configured.

Follow the two steps below to configure it correctly.

Generate a new key pair with the command below in a terminal:

ssh-keygen

Press Enter at the prompts to accept the default file name (id_rsa) and an empty passphrase.

Now register the generated key file:

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
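
As a quick check (not spelled out in the original answer), SSH to localhost should now succeed without a password prompt, after which start-dfs.sh can be retried:

ssh localhost    # should log in without asking for a password
exit
start-dfs.sh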

Muthukrishnan answered Nov 13 '22


Well, I was facing this problem too, and before I found this question I used the method below.

  1. sudo -s -H

Use this to log in as the root user.

  2. ssh localhost

Log in via SSH (if you are just trying to use single-node mode).

  3. ./sbin/start-dfs.sh

./sbin/start-yarn.sh

"cd" to your Hadoop installation root, then run these scripts to start HDFS and YARN; you won't face the permission problem again.

I guess the cause of this problem:

I used the root user to initialize the Hadoop environment, so several folders were created by root. Now, when I use my own account like 'Jake', I don't have permission to start the services (during startup the system needs to write to the logs).
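
A quick way to check whether root owns the directories involved (a diagnostic sketch, not from the original answer; the paths are taken from the error output above):

ls -ld /var/log/hadoop /var/log/hadoop/root /var/run/hadoop
# if these are owned by root, either run the daemons as root (as above)
# or chown them to your own user, as suggested in an earlier answer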


koalagreener answered Nov 13 '22