I know this has been asked before, but I could not figure out the solution. I am getting the error below when I try to run hdfs namenode -format:
Could not find or load main class org.apache.hadoop.hdfs.server.namenode.Namenode
I followed the instructions from this website to install Hadoop on my CentOS machine. The only difference is that I installed as root instead of as the hadoopuser mentioned in the link. Here is my ~/.bashrc:
# User specific aliases and functions
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
export HADOOP_INSTALL=/usr/local/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export PATH=$PATH:$HADOOP_INSTALL/sbin
export PATH=$PATH:$HADOOP_INSTALL/bin
And in hadoop-env.sh:
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
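To rule out an environment problem, a quick sanity check along these lines can be run (a minimal sketch; the expected paths are taken from the exports above):

# verify that Java and the Hadoop launchers resolve as expected
echo $JAVA_HOME                 # expect /usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
"$JAVA_HOME/bin/java" -version  # the JRE itself should start
which hdfs                      # expect /usr/local/hadoop/bin/hdfs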
In mapred-site.xml:
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
In yarn-site.xml:
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
In core-site.xml:
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>
In hdfs-site.xml:
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<property>
  <name>dfs.name.dir</name>
  <value>file:///home/hadoopspace/hdfs/namenode</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>file:///home/hadoopspace/hdfs/datanode</value>
</property>
For anyone still having trouble, you need to export the HADOOP_PREFIX
environment variable.
Add the following line to your ~/.bashrc
file:
export HADOOP_PREFIX=/path_to_hadoop_location
# for example:
# export HADOOP_PREFIX=/home/mike/hadoop-2.7.1
Then run . ~/.bashrc in your terminal and try again; this should fix the error.
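To confirm the variable actually took effect in the current shell before retrying, a quick check like this works (finding hdfs on the PATH assumes the HADOOP_INSTALL/bin entry from the question's .bashrc):

echo $HADOOP_PREFIX    # should print your Hadoop installation directory
hdfs namenode -format  # retry the command that failed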