I have a work-in-progress Hadoop installation on Ubuntu 12.x. I already had a deploy user, which I plan to use to run Hadoop on a cluster of machines. The session below demonstrates the problem: I can ssh olympus with no issues, but start-dfs.sh fails even though JAVA_HOME is set:
deploy@olympus:~$ ssh olympus
Welcome to Ubuntu 12.04.4 LTS (GNU/Linux 3.5.0-45-generic x86_64)
* Documentation: https://help.ubuntu.com/
Last login: Mon Feb 3 18:22:27 2014 from olympus
deploy@olympus:~$ echo $JAVA_HOME
/opt/dev/java/1.7.0_51
deploy@olympus:~$ start-dfs.sh
Starting namenodes on [olympus]
olympus: Error: JAVA_HOME is not set and could not be found.
You can edit the hadoop-env.sh file and set JAVA_HOME for Hadoop. Open the file and find the line below:
export JAVA_HOME=/usr/lib/j2sdk1.6-sun
Uncomment the line and update the JAVA_HOME path to match your environment.
This will solve the problem with JAVA_HOME.
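As a concrete sketch, using the JDK path shown in the question (adjust it to wherever your JDK actually lives):
# In hadoop-env.sh (typically under etc/hadoop/ of your Hadoop install,
# or /etc/hadoop/ on a packaged install) -- hard-code the JDK path:
export JAVA_HOME=/opt/dev/java/1.7.0_51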
Weird out-of-the-box bug on Ubuntu. The current line
export JAVA_HOME=${JAVA_HOME}
in /etc/hadoop/hadoop-env.sh should pick up JAVA_HOME from the host environment, but it doesn't. Just edit the file and hard-code the Java home for now.
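For example, assuming the JDK path from the question, the edit in /etc/hadoop/hadoop-env.sh would look roughly like this:
# was: export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=/opt/dev/java/1.7.0_51
After the change, re-run start-dfs.sh and the namenode should start without the JAVA_HOME error.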
Alternatively you can edit /etc/environment to include:
JAVA_HOME=/usr/lib/jvm/[YOURJAVADIRECTORY]
This makes JAVA_HOME available to all users on the system, and allows start-dfs.sh to see the value. My guess is that start-dfs.sh is kicking off a process as another user somewhere that does not pick up the variable unless it is explicitly set in hadoop-env.sh.
Using hadoop-env.sh is arguably clearer -- just adding this option for completeness.
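A minimal sketch, assuming an OpenJDK 7 install under /usr/lib/jvm (the directory name here is just an illustration; use whatever ls /usr/lib/jvm shows on your machine):
# /etc/environment -- plain KEY=VALUE lines, no "export"
JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
Note that /etc/environment is read by PAM at login, so log out and back in (or open a new ssh session) before re-running start-dfs.sh.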