Why does "hadoop fs -mkdir" fail with Permission Denied?

I am using Cloudera on a VM that I am playing around with. Unfortunately I am having trouble copying data to HDFS; I get the following error:

[cloudera@localhost ~]$ hadoop fs -mkdir input
mkdir: Permission denied: user=cloudera, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

I am not too concerned about security on this VM. Is there any way I can relax the permission checks on HDFS?

asked Mar 27 '14 by AAA


People also ask

How do I change ownership of HDFS folder?

Changing the owner of files in HDFS: first switch from the login user (ec2-user in this example) to root using the "sudo -i" command, then, acting as the hdfs user, create a directory in HDFS, for example "test-dir", with the mkdir command and change its ownership. The commands are listed below.
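
A minimal sketch of those commands (test-dir and ec2-user are just the example names from above; adjust them to your own directory and user):

sudo -i                                            # switch from ec2-user to root
sudo -u hdfs hadoop fs -mkdir /test-dir            # create the directory as the hdfs superuser
sudo -u hdfs hadoop fs -chown ec2-user /test-dir   # hand ownership of it to ec2-user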

What is ACL in Hadoop?

An ACL provides a way to set different permissions for specific named users or named groups, not only the file's owner and the file's group. By default, support for ACLs is enabled and the NameNode allows creation of ACLs. To disable support for ACLs, set dfs.namenode.acls.enabled to false in the NameNode configuration.
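
For example, ACL entries are managed with the setfacl and getfacl subcommands (a sketch; the user name alice and the path /shared are placeholders):

hdfs dfs -setfacl -m user:alice:rwx /shared   # add an ACL entry granting alice rwx on /shared
hdfs dfs -getfacl /shared                     # show the ACL entries attached to /shared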


3 Answers

Creating a directory with hadoop fs -mkdir requires write permission on the parent directory in HDFS. From your error you can see that /user is owned by the hdfs user (mode drwxr-xr-x), so hdfs is the only user allowed to create folders there. So if you run:

sudo -u hdfs hadoop fs -mkdir /import

then the import folder will be created. If you want to change the owner of this folder run:

sudo -u hdfs hadoop fs -chown new_user /import

Now new_user can manipulate files inside the /import folder.
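
For example, logged in as new_user (a sketch; data.txt is just a placeholder local file):

hadoop fs -put data.txt /import/    # upload a local file into the folder new_user now owns
hadoop fs -ls /import               # confirm the file is there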

answered Nov 16 '22 by xYan


When you execute hadoop fs -mkdir input, the path is relative, so it resolves to /user/cloudera/input. If your HDFS home directory (/user/cloudera) does not exist yet, that directory has to be created first and only then is input created under it; that first step is what fails here, because the cloudera user has no write permission on /user.

To give the cloudera user permission to use its own home directory, you have to create that directory and grant the permission yourself. The hdfs user is the admin (superuser) in HDFS, so switch to hdfs and then execute the following commands:

[hdfs@localhost ~]$ hadoop fs -mkdir /user/cloudera ; hadoop fs -chmod 777 /user/cloudera
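
After that, the original command should succeed for the cloudera user, since the relative path input now resolves to the writable /user/cloudera (a sketch):

[cloudera@localhost ~]$ hadoop fs -mkdir input          # creates /user/cloudera/input
[cloudera@localhost ~]$ hadoop fs -ls /user/cloudera    # should now list the input directory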

Or

if you are not too concerned about HDFS security, you can disable permission checking altogether by setting the property below to false in hdfs-site.xml:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>

After setting this property to false, HDFS needs to be restarted for the change to take effect.
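
On a package-based install (an assumption here; exact service names vary, and with Cloudera Manager you would restart HDFS from the UI instead), the restart and a quick check might look roughly like this:

sudo service hadoop-hdfs-namenode restart    # reload hdfs-site.xml on the NameNode
sudo service hadoop-hdfs-datanode restart    # and on the DataNode
hadoop fs -mkdir input                       # should now succeed for the cloudera user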

answered Nov 16 '22 by SachinJ


In Cloudera Manager you can change the same setting from the UI: go to HDFS -> Configuration -> View and Edit, uncheck Check HDFS Permissions (dfs.permissions), and restart HDFS.

answered Nov 16 '22 by shakir544