
Accessing hadoop from remote machine

Tags: java, hadoop

I have hadoop set up (pseudo distributed) on a server VM and I'm trying to use the Java API to access the HDFS.

The fs.default.name on my server is hdfs://0.0.0.0:9000 (as with localhost:9000 it wouldn't accept requests from remote sites).
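For context on why 0.0.0.0 behaves differently from a regular host name: it is the IPv4 wildcard address, which a server can bind to in order to listen on all interfaces, but which is not a meaningful target for a client to connect to. A minimal standard-library check (class name is illustrative):

```java
import java.net.InetAddress;

public class WildcardCheck {
    public static void main(String[] args) throws Exception {
        // 0.0.0.0 is the "any" (wildcard) address: servers bind to it to
        // accept connections on every interface, but it is not a routable
        // address for a client to connect to.
        InetAddress addr = InetAddress.getByName("0.0.0.0");
        System.out.println(addr.isAnyLocalAddress()); // prints: true
    }
}
```

This is why the server accepts remote connections with this setting, while clients still need a resolvable host name such as srv-lab.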

I can connect to the server on port 9000

$ telnet srv-lab 9000
Trying 1*0.*.30.95...
Connected to srv-lab
Escape character is '^]'.
^C

which indicates to me that connection should work fine. The Java code I'm using is:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// ...

try {
    Path pt = new Path("hdfs://srv-lab:9000/test.txt");
    Configuration conf = new Configuration();
    conf.set("fs.default.name", "hdfs://srv-lab:9000");
    FileSystem fs = FileSystem.get(conf);
    BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(pt)));
    String line = br.readLine();
    while (line != null) {
        System.out.println(line);
        line = br.readLine();
    }
} catch (Exception e) {
    e.printStackTrace();
}

but what I get is:

java.net.ConnectException: Call From clt-lab/1*0.*.2*2.205 to srv-lab:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Thus, any hints on why the connection is refused even though connecting through telnet works fine?

asked Sep 26 '22 13:09 by mroman

1 Answer

Your HDFS entry is wrong: fs.default.name has to be set to hdfs://srv-lab:9000. Set this and restart your cluster; that will fix the issue.
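A sketch of the corresponding core-site.xml entry, assuming a Hadoop version where fs.default.name is still the property name (later releases rename it to fs.defaultFS):

```xml
<configuration>
  <property>
    <!-- Use a host name that remote clients can resolve,
         not 0.0.0.0 or localhost -->
    <name>fs.default.name</name>
    <value>hdfs://srv-lab:9000</value>
  </property>
</configuration>
```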

answered Sep 28 '22 06:09 by Thanga