
UnsatisfiedLinkError (NativeIO$Windows.access0) when submitting mapreduce job to hadoop 2.2 from windows to ubuntu

I submit my MapReduce jobs from a Java application running on Windows to a Hadoop 2.2 cluster running on Ubuntu. In Hadoop 1.x this worked as expected, but on Hadoop 2.2 I get a strange error:

java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

I compiled the necessary Windows libraries (hadoop.dll and winutils.exe), and I can access HDFS via code and read the cluster information using the Hadoop API. Only the job submission does not work.
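
For reference, the submission code looks roughly like this (a minimal sketch, not my exact code; the cluster addresses and input/output paths are placeholders, and the default identity Mapper/Reducer are used to keep it short):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class RemoteSubmit {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder addresses for the Ubuntu cluster.
            conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
            conf.set("mapreduce.framework.name", "yarn");
            conf.set("yarn.resourcemanager.address", "resourcemanager.example.com:8032");

            Job job = Job.getInstance(conf, "remote-test");
            job.setJarByClass(RemoteSubmit.class);
            job.setOutputKeyClass(LongWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path("/in"));
            FileOutputFormat.setOutputPath(job, new Path("/out"));

            // The UnsatisfiedLinkError is thrown on the Windows client during submission,
            // when Hadoop stages the job resources and calls NativeIO$Windows.access0.
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }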

Any help is appreciated.

Solution: I found it out myself: the directory where the Windows Hadoop binaries are located has to be added to the Windows PATH variable.
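
For anyone hitting the same problem, a quick way to verify from the submitting JVM that the binaries are actually visible (a minimal sketch; the class name is arbitrary):

    import java.io.File;

    public class CheckHadoopWindowsSetup {
        public static void main(String[] args) {
            // The submitting JVM must see %HADOOP_HOME%\bin on PATH so that
            // hadoop.dll and winutils.exe can be found.
            String hadoopHome = System.getenv("HADOOP_HOME");
            System.out.println("HADOOP_HOME       = " + hadoopHome);
            System.out.println("PATH              = " + System.getenv("PATH"));
            System.out.println("java.library.path = " + System.getProperty("java.library.path"));

            if (hadoopHome != null) {
                System.out.println("hadoop.dll exists:   "
                        + new File(hadoopHome, "bin\\hadoop.dll").exists());
                System.out.println("winutils.exe exists: "
                        + new File(hadoopHome, "bin\\winutils.exe").exists());
            }
        }
    }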

asked Dec 14 '13 by padmalcom


2 Answers

This error generally occurs because the binaries in your %HADOOP_HOME%\bin folder do not match your Hadoop version.

Get hadoop.dll and winutils.exe built for your specific Hadoop version and copy them into your %HADOOP_HOME%\bin folder.
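
After copying the files you can check whether the native library actually loads (a sketch; it assumes the Hadoop client jars are on the classpath and uses NativeCodeLoader, which Hadoop itself uses to report native-library status):

    import org.apache.hadoop.util.NativeCodeLoader;

    public class CheckNativeLoad {
        public static void main(String[] args) {
            // Referencing NativeCodeLoader triggers loading of hadoop.dll.
            // A missing or wrong-bitness dll leaves the flag false; a wrong-version dll
            // may load but still fail later on specific calls such as access0.
            boolean loaded = NativeCodeLoader.isNativeCodeLoaded();
            System.out.println("native hadoop library loaded: " + loaded);
        }
    }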

answered Nov 10 '22 by Vijay


  1. Get hadoop.dll (or libhadoop.so on *nix). Make sure its bitness (32- vs. 64-bit) matches that of your JVM.
  2. Make sure it is available via PATH or java.library.path.

    Note that setting java.library.path overrides PATH. If you set java.library.path, make sure it is correct and contains the hadoop library. A quick way to check which lookup path is in effect is sketched below.
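
For example (a minimal sketch in the spirit of the steps above; the directory in the sample command is a placeholder):

    public class LoadHadoopNative {
        public static void main(String[] args) {
            // When -Djava.library.path is given, only those directories are searched;
            // otherwise the Windows PATH is used for System.loadLibrary.
            System.out.println("java.library.path = " + System.getProperty("java.library.path"));
            try {
                System.loadLibrary("hadoop"); // resolves to hadoop.dll on Windows
                System.out.println("hadoop native library loaded");
            } catch (UnsatisfiedLinkError e) {
                System.out.println("could not load hadoop native library: " + e.getMessage());
            }
        }
    }

Run it e.g. as java -Djava.library.path=C:\hadoop\bin LoadHadoopNative (the directory is a placeholder) and compare with a run that relies on PATH only.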

answered Nov 10 '22 by rustyx