
Get hadoop configuration in Java util

Tags:

hadoop

hdfs

I'm writing a Java utility that needs to access the DFS, so I need a Configuration object. When I create one simply by using

Configuration conf = new Configuration();

it doesn't seem to find the DFS, and just uses the local file system; printing

fs.getHomeDirectory()

gives my local home directory. I've tried adding core-site.xml, mapred-site.xml, yarn-site.xml, and hdfs-site.xml to the Configuration as resources, but it doesn't change anything. What do I need to do to get it to pick up the HDFS settings?
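For context, the attempt described above presumably looked roughly like this (a sketch, assuming the XML files were added by name; Hadoop resolves such names against the classpath, and a name that isn't found there is typically ignored quietly, leaving the local-file-system defaults in place):

```java
import org.apache.hadoop.conf.Configuration;

Configuration conf = new Configuration();
// Resolved against the classpath; if these files are not on it,
// the defaults (local file system) remain in effect.
conf.addResource("core-site.xml");
conf.addResource("hdfs-site.xml");
conf.addResource("mapred-site.xml");
conf.addResource("yarn-site.xml");
```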

Thanks for reading

asked Jul 22 '15 by user1111284

1 Answer

The reason it's pointing to your local file system is that core-site.xml and hdfs-site.xml are not being added properly. The code snippet below should help:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration conf = new Configuration();
conf.addResource(new Path("file:///etc/hadoop/conf/core-site.xml")); // Replace with the actual path
conf.addResource(new Path("file:///etc/hadoop/conf/hdfs-site.xml")); // Replace with the actual path

Path pt = new Path("."); // HDFS path
FileSystem fs = pt.getFileSystem(conf);

System.out.println("Home directory: " + fs.getHomeDirectory());

Update:

The option above should have worked; it seems there is some issue with the configuration files or their paths. There is another option: instead of adding the configuration files with the addResource method, use the set method. Open your core-site.xml file, find the value of fs.defaultFS, and set it directly:

conf.set("fs.defaultFS", "hdfs://<Namenode-Host>:<Port>"); // Refer to your core-site.xml and replace <Namenode-Host> and <Port> with your cluster's namenode host and port (the default port is 8020).
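Putting the update together, a minimal self-contained sketch (the namenode host and port below are placeholders you would replace with the values from your own core-site.xml):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsHomeDir {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder host/port: take the real value from fs.defaultFS in core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        // FileSystem.get(conf) returns an instance for the configured default
        // file system, i.e. HDFS here rather than the local file system.
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Home directory: " + fs.getHomeDirectory());
        // With HDFS configured, this prints a path of the form
        // hdfs://<host>:<port>/user/<your-username> instead of a local directory.
    }
}
```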
answered Oct 06 '22 by SachinJ