Hadoop always finishes with java.util.concurrent.TimeoutException

I am new to Hadoop and I have written a couple of jobs that always end with this no matter what:

    WARN util.ShutdownHookManager: ShutdownHook '' timeout, java.util.concurrent.TimeoutException
    java.util.concurrent.TimeoutException
        at java.util.concurrent.FutureTask.get(FutureTask.java:205)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:67)

I am running this on a Mac (Sierra 10.12.6) and I have installed Hadoop using homebrew.

This even happens when running the wordcount example from the Hadoop docs.

It's odd because I don't get any errors while it is processing, and I get the output I expect.

Just wondered if anyone has run into the same issue and if there is any extra configuration I am missing.

Thanks in advance!

asked Sep 05 '17 by David Jones

1 Answer

The default timeout for the Hadoop job shutdown hook is 30 seconds. You can change this timeout by setting hadoop.service.shutdown.timeout in core-site.xml.
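As a minimal sketch (the 60-second value is only an illustrative choice; the property name matches the one read in the code below), the override in core-site.xml would look like:

    <configuration>
      <property>
        <!-- Raise the shutdown-hook timeout from the default of 30 seconds -->
        <name>hadoop.service.shutdown.timeout</name>
        <value>60s</value>
      </property>
    </configuration>

The value is parsed with Configuration.getTimeDuration, so a plain number is interpreted in the default unit (seconds here), and unit suffixes such as s, m, or ms are also accepted.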

You can find the code related to this timeout in the ShutdownHookManager class:

    // Reads the configured shutdown-hook timeout, defaulting to 30 (seconds)
    // and clamping it to a minimum of 1.
    static long getShutdownTimeout(Configuration conf) {
      long duration = conf.getTimeDuration(
          "hadoop.service.shutdown.timeout", 30L, TIME_UNIT_DEFAULT);
      if (duration < 1L) {
        duration = 1L;
      }
      return duration;
    }
answered Nov 07 '22 by Mukul Khatri