I'm trying to run a 3-node Hadoop cluster on the Windows Azure cloud. I've gone through configuration and a test launch, and everything looked fine. However, since I had been using OpenJDK, which is not recommended as a JVM for Hadoop according to what I've read, I decided to replace it with the Oracle Server JVM. I removed the old Java installation with yum, along with all Java folders in /usr/lib, installed the most recent version of the Oracle JVM, and updated the PATH and JAVA_HOME variables. However, now on launch I get the following messages:
sed: -e expression #1, char 6: unknown option to `s'
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
etc. (in total, about 20-30 lines containing words that should have nothing in common with hostnames)
To me it looks like part of some program output is being passed as hostnames because of incorrect usage of sed in the startup script:
if [ "$HADOOP_SLAVE_NAMES" != '' ] ; then
SLAVE_NAMES=$HADOOP_SLAVE_NAMES
else
SLAVE_FILE=${HADOOP_SLAVES:-${HADOOP_CONF_DIR}/slaves}
SLAVE_NAMES=$(cat "$SLAVE_FILE" | sed 's/#.*$//;/^$/d')
fi
# start the daemons
for slave in $SLAVE_NAMES ; do
ssh $HADOOP_SSH_OPTS $slave $"${@// /\\ }" \
2>&1 | sed "s/^/$slave: /" &
if [ "$HADOOP_SLAVE_SLEEP" != "" ]; then
sleep $HADOOP_SLAVE_SLEEP
fi
done
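To illustrate what I suspect is happening (a contrived sketch with made-up values, not the real code path): SLAVE_NAMES is expanded unquoted in the loop, so any stray text that ends up in it is split on whitespace and each word gets treated as a separate hostname:

# Hypothetical: stray JVM output has leaked into the slave list
SLAVE_NAMES='Java HotSpot(TM) 64-Bit Server VM warning'
for slave in $SLAVE_NAMES ; do
  # each whitespace-separated word becomes a "hostname":
  echo "ssh $slave ..."   # "ssh Java", "ssh HotSpot(TM)", "ssh 64-Bit", ...
done

The words in that sketch match the unresolvable "hostnames" in the errors above.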
The script itself looks unchanged, so the question is: how could a change of JVM affect sed? And how can I fix it?
So I found an answer to this question: my guess was wrong, and everything with sed is fine. The problem was in how the Oracle JVM handles external native libraries compared to OpenJDK. It emitted a warning (the "HotSpot(TM) 64-Bit Server VM" fragments above) where the script was not expecting any output, and that text ruined the whole sed input. You can fix it by adding the following environment variables: HADOOP_COMMON_LIB_NATIVE_DIR, which should point to the lib/native folder of your Hadoop installation, and -Djava.library.path=/opt/hadoop/lib appended to whatever options you already have in the HADOOP_OPTS variable (note that /opt/hadoop is my installation folder; you may need to change it for things to work properly). I personally added the export commands to the hadoop-env.sh script, but adding them to your .bashrc or to start-all.sh should work as well.
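For concreteness, here is a sketch of the two lines I mean as they would look in hadoop-env.sh, assuming /opt/hadoop is the installation directory as in my setup (adjust the paths to yours):

# point Hadoop at its bundled native libraries so the JVM
# stops warning about them on startup
export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/hadoop/lib/native
# keep any existing options and add the native library path
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/opt/hadoop/lib"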