I am trying to install a Hadoop 2.2.0 cluster on my servers. All of the servers are 64-bit. I have downloaded Hadoop 2.2.0 and set up all the configuration files. When I run ./start-dfs.sh, I get the following error:
13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...
Besides the 64-bit issue, are there any other errors? I have already set up passwordless SSH login between the namenode and the datanodes, so what do the other errors mean?
Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop folder:
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
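After adding these entries, reload the shell configuration so they take effect in the current session (a minimal step, assuming a Bash login shell):
source ~/.bashrc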
In addition, execute the following commands:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
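To check that passwordless login works, try connecting to one of your nodes; the hostname namenode is used here only because it appears in the question's output:
ssh namenode
# should log in without prompting for a password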
The root cause is that the default native library shipped with Hadoop is built for 32-bit. There are two options:
1) Set up some environment variables in .bash_profile. Please refer to https://gist.github.com/ruo91/7154697
Or
2) Rebuild your Hadoop native library. Please refer to http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html (a rough sketch of the build is shown below)
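As a rough sketch, assuming you have downloaded the hadoop-2.2.0 source tarball and installed the prerequisites listed on that page (JDK, Maven, CMake, zlib headers, protobuf 2.5):
cd hadoop-2.2.0-src
# build the distribution together with 64-bit native libraries
mvn package -Pdist,native -DskipTests -Dtar
# replace the bundled 32-bit libraries with the freshly built ones
cp -r hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/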
You can also export the variables in hadoop-env.sh:
vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh
(/usr/local/hadoop is my Hadoop installation folder)
#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64 # your jdk install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
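To confirm that the bundled library really is 32-bit, you can inspect it with file (the path assumes the installation folder above):
file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
# a stock 2.2.0 download typically reports a 32-bit ELF shared object here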
I think that the only problem here is the same as in this question, so the solution is also the same:
Stop the JVM from printing the stack guard warning to stdout/stderr, because this is what breaks the HDFS start script.
Do this by replacing, in your etc/hadoop/hadoop-env.sh, the line:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"
with:
export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"
(This solution was found on Sumit Chawla's blog.)
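Once the warning is suppressed, the start script should stop treating the words of the warning as hostnames; re-running it from your Hadoop folder should contact only the real nodes:
./sbin/start-dfs.sh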