Incorrect configuration: namenode address dfs.namenode.rpc-address is not configured

I am getting this error when I try to boot up a DataNode. From what I have read, the RPC parameters are only used for an HA configuration, which I am not setting up (I think).

2014-05-18 18:05:00,589 INFO  [main] impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(572)) - DataNode metrics system shutdown complete.
2014-05-18 18:05:00,589 INFO  [main] datanode.DataNode (DataNode.java:shutdown(1313)) - Shutdown complete.
2014-05-18 18:05:00,614 FATAL [main] datanode.DataNode (DataNode.java:secureMain(1989)) - Exception in secureMain
java.io.IOException: Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
        at org.apache.hadoop.hdfs.DFSUtil.getNNServiceRpcAddresses(DFSUtil.java:840)
        at org.apache.hadoop.hdfs.server.datanode.BlockPoolManager.refreshNamenodes(BlockPoolManager.java:151)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:745)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:278)

My config files look like this:

[root@datanode1 conf.cluster]# cat core-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>

<property>
 <name>fs.defaultFS</name>
 <value>hdfs://namenode:8020</value>
</property>

</configuration>

[root@datanode1 conf.cluster]# cat hdfs-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
<property>
 <name>dfs.datanode.data.dir</name>
 <value>/hdfs/data</value>
</property>
<property>
 <name>dfs.permissions.superusergroup</name>
 <value>hadoop</value>
</property>
</configuration>

I am using the latest CDH5 distro.

Installed Packages
Name        : hadoop-hdfs-datanode
Arch        : x86_64
Version     : 2.3.0+cdh5.0.1+567
Release     : 1.cdh5.0.1.p0.46.el6

Any helpful advice on how to get past this?

EDIT: Just use Cloudera Manager.

asked May 18 '14 by aaa90210

2 Answers

These steps solved the problem for me:

  • export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop (no spaces around the =)
  • echo $HADOOP_CONF_DIR
  • hdfs namenode -format
  • hdfs getconf -namenodes
  • ./start-dfs.sh
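
Put together, the sequence looks like the session below. This is a sketch: it assumes HADOOP_HOME points at a tarball-style install with configs under etc/hadoop, which may not match a CDH package layout. Be aware that hdfs namenode -format erases any existing namenode metadata, so it only belongs on a fresh cluster.

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop   # no spaces around '='
echo $HADOOP_CONF_DIR                            # confirm the variable is set
hdfs namenode -format                            # CAUTION: wipes existing HDFS metadata
hdfs getconf -namenodes                          # should print the namenode host, not an error
$HADOOP_HOME/sbin/start-dfs.sh                   # start the HDFS daemons

If hdfs getconf -namenodes prints nothing or throws the same IOException, the shell is still reading a config directory that lacks fs.defaultFS.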
answered Sep 22 '22 by Hamdi Charef


I too was facing the same issue and finally found that there was a stray space in the fs.default.name value; removing the space fixed it. The core-site.xml above doesn't seem to have a space, so the asker's issue may be different from mine. My two cents.
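
For illustration, this is the kind of whitespace bug I mean, shown with the asker's value (fs.default.name is the deprecated name for fs.defaultFS; as far as I can tell, the value is not trimmed, so the leading space makes the URI unusable):

<!-- broken: note the space before hdfs:// -->
<property>
 <name>fs.default.name</name>
 <value> hdfs://namenode:8020</value>
</property>

<!-- fixed: no surrounding whitespace -->
<property>
 <name>fs.default.name</name>
 <value>hdfs://namenode:8020</value>
</property>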

answered Sep 21 '22 by QADeveloper