
Hadoop C++ HDFS test fails with an exception

I'm working with Hadoop 2.2.0 and trying to run this hdfs_test.cpp application:

#include "hdfs.h"

#include <fcntl.h>   /* O_WRONLY, O_CREAT */
#include <stdio.h>   /* fprintf */
#include <stdlib.h>  /* exit */
#include <string.h>  /* strlen */

int main(int argc, char **argv) {
    /* connect to the filesystem configured as fs.defaultFS */
    hdfsFS fs = hdfsConnect("default", 0);
    const char* writePath = "/tmp/testfile.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }
    const char* buffer = "Hello, World!";
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1);
    if (hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
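
For reference, I build and run it roughly like this (the exact include and library paths are from my setup and may differ on yours):

g++ hdfs_test.cpp -o hdfs_test \
    -I${HADOOP_HOME}/include \
    -L${HADOOP_HOME}/lib/native -lhdfs \
    -L${JAVA_HOME}/jre/lib/amd64/server -ljvm

export LD_LIBRARY_PATH=${HADOOP_HOME}/lib/native:${JAVA_HOME}/jre/lib/amd64/server:${LD_LIBRARY_PATH}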

It compiles, but when I run it with ./hdfs_test I get this output:

loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsOpenFile(/tmp/testfile.txt): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
Failed to open /tmp/testfile.txt for writing!

Maybe it's a problem with the classpath. My $HADOOP_HOME is /usr/local/hadoop and this is my current CLASSPATH variable:

echo $CLASSPATH
/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/yarn/lib/*:/usr/local/hadoop/share/hadoop/yarn/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar

Any help is appreciated. Thanks!

asked Jan 11 '14 by user3077628

3 Answers

Try this:

hadoop classpath --glob

Then add the result to the CLASSPATH variable in your ~/.bashrc, for example:
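
One way to do it (a sketch; assumes the hadoop binary is on your PATH):

# in ~/.bashrc: expand the full Hadoop classpath once at shell startup
export CLASSPATH=$(hadoop classpath --glob):$CLASSPATH

# then reload the file in the current shell
source ~/.bashrc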

answered Nov 19 '22 by Hamdi Charef


I've run into problems using wildcards in the classpath with JNI-based programs. Try the direct-jar-in-classpath approach instead, such as the one generated in this sample script of mine at https://github.com/QwertyManiac/cdh4-libhdfs-example/blob/master/exec.sh#L3, and I believe it should work. The complete example at https://github.com/QwertyManiac/cdh4-libhdfs-example currently does work.

See also https://stackoverflow.com/a/9322747/1660002
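
A minimal way to build such an explicit jar list yourself (a sketch, assuming the jars live under $HADOOP_HOME/share/hadoop as in the question):

# start with the config directory, then append every jar explicitly
CLASSPATH=${HADOOP_HOME}/etc/hadoop
for jar in $(find ${HADOOP_HOME}/share/hadoop -name '*.jar'); do
    CLASSPATH=${CLASSPATH}:${jar}
done
export CLASSPATH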

answered Nov 19 '22 by Harsh J


JNI will not expand wildcards in the CLASSPATH, so simply adding the output of hadoop classpath --glob won't work. The right way is:

# put the config dir plus every file and directory under share/hadoop
# on the classpath explicitly (no wildcards survive for JNI):
export CLASSPATH=${HADOOP_HOME}/etc/hadoop:`find ${HADOOP_HOME}/share/hadoop/ | awk '{path=path":"$0}END{print path}'`
# make sure the native libhdfs library can be found at runtime:
export LD_LIBRARY_PATH="${HADOOP_HOME}/lib/native":$LD_LIBRARY_PATH
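
With those exports in place, rerunning the program from the question should get past the NoClassDefFoundError (a quick sanity check; the expected file contents come from the question's code):

./hdfs_test
hdfs dfs -cat /tmp/testfile.txt    # should print: Hello, World!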
answered Nov 19 '22 by pgplus1628