I am following this page in the official documentation. I also downloaded Hadoop 2.2.0 and placed it under $HOME/opt. Now I have this file structure:
$ ls -1 ~/opt/hadoop-2.2.0/
LICENSE.txt
NOTICE.txt
README.txt
bin/
etc/
include/
lib/
libexec/
sbin/
share/
$ ls -1 ~/opt/hadoop-2.2.0/share/hadoop/
common/
hdfs/
httpfs/
mapreduce/
tools/
yarn/
In the page I mentioned above, there is this paragraph:
Assuming you have installed hadoop-common/hadoop-hdfs and exported $HADOOP_COMMON_HOME/$HADOOP_HDFS_HOME, untar hadoop mapreduce tarball and set environment variable $HADOOP_MAPRED_HOME to the untarred directory. Set $HADOOP_YARN_HOME the same as $HADOOP_MAPRED_HOME.
So, my question is, given my file structure, how should I set up the hadoop environment variables ($HADOOP_COMMON_HOME, $HADOOP_HDFS_HOME, $HADOOP_YARN_HOME, etc)? Thank you very much.
You could set them as follows; the last two are not mandatory to get HDFS and YARN working, though.
HADOOP_COMMON_HOME=$HOME/opt/hadoop-2.2.0/
HADOOP_HDFS_HOME=$HOME/opt/hadoop-2.2.0/share/hadoop/hdfs
HADOOP_YARN_HOME=$HOME/opt/hadoop-2.2.0/share/hadoop/yarn
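Putting these together, a minimal sketch of what you could append to ~/.bashrc (or source before starting the daemons). The paths assume the layout shown in the question, and HADOOP_MAPRED_HOME is set equal to HADOOP_YARN_HOME as the quoted documentation paragraph instructs; the PATH additions are a common convenience, not required by the docs:

```shell
# Hadoop 2.2.0 environment setup (paths assume the tarball was
# unpacked under $HOME/opt as shown in the question)
export HADOOP_COMMON_HOME="$HOME/opt/hadoop-2.2.0"
export HADOOP_HDFS_HOME="$HOME/opt/hadoop-2.2.0/share/hadoop/hdfs"
export HADOOP_YARN_HOME="$HOME/opt/hadoop-2.2.0/share/hadoop/yarn"
# The docs say MAPRED and YARN homes should be the same directory
export HADOOP_MAPRED_HOME="$HADOOP_YARN_HOME"
# Convenience only: put the hadoop commands and daemon scripts on PATH
export PATH="$PATH:$HADOOP_COMMON_HOME/bin:$HADOOP_COMMON_HOME/sbin"
```

After sourcing this, `hadoop` and the `start-dfs.sh`/`start-yarn.sh` scripts under sbin/ should resolve from any directory.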