I am using docker-spark. After starting spark-shell, it outputs:
15/05/21 04:28:22 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
15/05/21 04:28:22 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
The environment variables of this Spark container are:
bash-4.1# export
declare -x BOOTSTRAP="/etc/bootstrap.sh"
declare -x HADOOP_COMMON_HOME="/usr/local/hadoop"
declare -x HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
declare -x HADOOP_HDFS_HOME="/usr/local/hadoop"
declare -x HADOOP_MAPRED_HOME="/usr/local/hadoop"
declare -x HADOOP_PREFIX="/usr/local/hadoop"
declare -x HADOOP_YARN_HOME="/usr/local/hadoop"
declare -x HOME="/"
declare -x HOSTNAME="sandbox"
declare -x JAVA_HOME="/usr/java/default"
declare -x OLDPWD
declare -x PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/java/default/bin:/usr/local/spark/bin:/usr/local/hadoop/bin"
declare -x PWD="/"
declare -x SHLVL="3"
declare -x SPARK_HOME="/usr/local/spark"
declare -x SPARK_JAR="hdfs:///spark/spark-assembly-1.3.0-hadoop2.4.0.jar"
declare -x TERM="xterm"
declare -x YARN_CONF_DIR="/usr/local/hadoop/etc/hadoop"
After referring to Hadoop “Unable to load native-hadoop library for your platform” error on CentOS, I did the following:
(1) Check the Hadoop native library:
bash-4.1# file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
Yes, it is a 64-bit library.
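A bitness mismatch on the JVM side can cause the same UnsatisfiedLinkError even when the library itself is 64-bit, so it may also be worth confirming that the JVM is 64-bit. A minimal check, assuming the paths from the container above:

```shell
# A 32-bit JVM cannot load a 64-bit libhadoop.so; confirm both sides match.
file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0   # should report "ELF 64-bit"
java -version 2>&1 | grep -q '64-Bit' && echo "64-bit JVM" || echo "32-bit JVM"
```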
(2) Try adding the HADOOP_OPTS environment variable:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
It doesn't work, and reports the same error.
(3) Try adding both the HADOOP_OPTS and HADOOP_COMMON_LIB_NATIVE_DIR environment variables:
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
It still doesn't work, and reports the same error.
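One likely reason steps (2) and (3) have no effect: HADOOP_OPTS is read by the hadoop/hdfs/yarn launcher scripts, not by spark-shell, so the flag never reaches the Spark JVM. A sketch of passing the same JVM flag to Spark directly — the native-library path is the one from this container, and `--driver-java-options` is a standard spark-shell flag, but verify it against your Spark version:

```shell
# spark-shell does not read HADOOP_OPTS; hand the JVM flag to the driver itself.
NATIVE_DIR=/usr/local/hadoop/lib/native
spark-shell --driver-java-options "-Djava.library.path=$NATIVE_DIR"
```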
Could anyone give some clues about the issue?
Summary: the resolutions above are simple ways to fix the error “Unable to load native-hadoop library for your platform… using builtin-java classes where applicable”. It may also be caused by a version mismatch; for example, I had installed a Hadoop 2.x version on top of Java 1.7.
If you are getting this warning while running Hadoop on Windows, or when running Spark from Eclipse or IntelliJ, you can simply ignore it: there is no way to get rid of the warning, since the Hadoop native shared libraries are not built for Windows.
In order to actually resolve this warning, download the Hadoop source code and recompile libhadoop.so.1.0.0 on a 64-bit system, then replace the 32-bit one. You can check the Hadoop library version by running the commands below.
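The check might look like the following — a sketch, assuming the container layout shown in the question (`$HADOOP_HOME` pointing at /usr/local/hadoop):

```shell
# Confirm the Hadoop build and the architecture of the bundled native library.
hadoop version
file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0   # "ELF 32-bit" means it must be rebuilt on a 64-bit box
```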
Adding the Hadoop native library directory to LD_LIBRARY_PATH fixes this problem:
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH"
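A slightly more defensive variant, plus a way to confirm the fix. The `${VAR:+...}` expansion avoids a dangling `:` when LD_LIBRARY_PATH was previously unset, and `hadoop checknative -a` (available in Hadoop 2.4+, so check your version) reports whether each native component now loads:

```shell
# Prepend the native dir; ${VAR:+...} omits the ":" when LD_LIBRARY_PATH is unset.
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# Verify: the "hadoop" entry should now report "true" with the resolved library path.
hadoop checknative -a
```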