
How to find the HADOOP_HOME path on Linux?

Tags: linux, hadoop

I am trying to compile the WordCount example below on a Hadoop server:

javac -classpath ${HADOOP_HOME}/hadoop-${HADOOP_VERSION}-core.jar -d wordcount_classes WordCount.java

but I am not able to locate ${HADOOP_HOME}. I tried hadoop classpath, but it only gives the output below:

/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//*

Does anyone have any idea about this?

asked Feb 03 '15 by Anil Savaliya

2 Answers

Navigate to the path where Hadoop is installed and locate ${HADOOP_HOME}/etc/hadoop, e.g.

/usr/lib/hadoop-2.2.0/etc/hadoop
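If you don't know where Hadoop is installed in the first place, one way to find it is to follow the hadoop launcher on your PATH back to its real location (a rough sketch; exact locations vary by distribution and package):

which hadoop                                              # e.g. /usr/bin/hadoop
readlink -f "$(which hadoop)"                             # resolve symlinks, e.g. /usr/lib/hadoop/bin/hadoop
dirname "$(dirname "$(readlink -f "$(which hadoop)")")"   # two levels up is typically HADOOP_HOME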

When you run ls in this folder you should see all of these files:

capacity-scheduler.xml      httpfs-site.xml
configuration.xsl           log4j.properties
container-executor.cfg      mapred-env.cmd
core-site.xml               mapred-env.sh
core-site.xml~              mapred-queues.xml.template
hadoop-env.cmd              mapred-site.xml
hadoop-env.sh               mapred-site.xml~
hadoop-env.sh~              mapred-site.xml.template
hadoop-metrics2.properties  slaves
hadoop-metrics.properties   ssl-client.xml.example
hadoop-policy.xml           ssl-server.xml.example
hdfs-site.xml               yarn-env.cmd
hdfs-site.xml~              yarn-env.sh
httpfs-env.sh               yarn-site.xml
httpfs-log4j.properties     yarn-site.xml~
httpfs-signature.secret

Hadoop's environment settings (such as JAVA_HOME and classpath additions) are in hadoop-env.sh.

You can see the classpath settings in this file; a sample is copied here for your reference:

# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_67

# The jsvc implementation to use. Jsvc is required to run secure datanodes.
#export JSVC_HOME=${JSVC_HOME}

export HADOOP_CONF_DIR=${HADOOP_CONF_DIR}

# Extra Java CLASSPATH elements.  Automatically insert capacity-scheduler.
for f in $HADOOP_HOME/contrib/capacity-scheduler/*.jar; do
    export HADOOP_CLASSPATH=${HADOOP_CLASSPATH+$HADOOP_CLASSPATH:}$f
done
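If you need to put extra jars on the classpath, the same pattern can be extended in hadoop-env.sh (a sketch; /path/to/extra.jar is a placeholder for your own jar):

# Append an extra jar, preserving whatever classpath has already been built up
export HADOOP_CLASSPATH=${HADOOP_CLASSPATH+$HADOOP_CLASSPATH:}/path/to/extra.jar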

Hope this helps!

answered Oct 15 '22 by 221B


The hadoop-core jar file is in the ${HADOOP_HOME}/share/hadoop/common directory, not directly in ${HADOOP_HOME}.

You can set the environment variable in your .bashrc file.

vim ~/.bashrc

Then add the following line to the end of the .bashrc file:

export HADOOP_HOME=/your/hadoop/installation/directory

Just replace the path with your Hadoop installation path.
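After editing, reload the file so the change takes effect in your current shell. For compiling, you can also use the output of hadoop classpath directly instead of hard-coding a jar path (a sketch; jar locations vary by Hadoop version):

source ~/.bashrc                 # reload so the new variable is visible
echo $HADOOP_HOME                # verify it prints your installation directory

# Alternatively, let hadoop report the full classpath, core jars included
mkdir -p wordcount_classes
javac -classpath "$(hadoop classpath)" -d wordcount_classes WordCount.java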

answered Oct 16 '22 by Chong Tang