I am new to Hadoop and I have written a couple of jobs that always end with this no matter what:
WARN util.ShutdownHookManager: ShutdownHook '' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
at java.util.concurrent.FutureTask.get(FutureTask.java:205)
at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:67)
I am running this on a Mac (Sierra 10.12.6) and I installed Hadoop using Homebrew.
This happens even when running the wordcount example from the Hadoop docs.
It's odd because I don't get any errors while the job is processing, and I get the output I expect.
Just wondered if anyone has run into the same issue and if there is any extra configuration I am missing.
Thanks in advance!
The default timeout for the Hadoop job shutdown hook is 30 seconds. You can change this timeout by setting hadoop.service.shutdown.timeout in core-site.xml.
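For example, a minimal core-site.xml entry might look like this (the 60s value is purely illustrative; pick whatever your jobs need):

<property>
  <name>hadoop.service.shutdown.timeout</name>
  <value>60s</value>
</property>

The value is read with Configuration.getTimeDuration, so it accepts unit suffixes such as 60s or 1m; a bare number is interpreted in the default unit, which is seconds here.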
You can find the code that applies this timeout in the ShutdownHookManager class:
static long getShutdownTimeout(Configuration conf) {
  // Read the configured timeout; default is 30, interpreted in TIME_UNIT_DEFAULT (seconds)
  long duration = conf.getTimeDuration("hadoop.service.shutdown.timeout", 30L, TIME_UNIT_DEFAULT);
  // Clamp to a minimum of one second
  if (duration < 1L) {
    duration = 1L;
  }
  return duration;
}
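If you want to double-check which value your installation actually resolves to, here is a minimal sketch (assuming hadoop-common and your core-site.xml are on the classpath; the class name is just for illustration):

import java.util.concurrent.TimeUnit;
import org.apache.hadoop.conf.Configuration;

public class ShutdownTimeoutCheck {
  public static void main(String[] args) {
    // new Configuration() loads core-default.xml and core-site.xml from the classpath
    Configuration conf = new Configuration();
    // Same lookup the shutdown hook performs: default 30, in seconds
    long timeout = conf.getTimeDuration(
        "hadoop.service.shutdown.timeout", 30L, TimeUnit.SECONDS);
    System.out.println("Effective shutdown timeout: " + timeout + "s");
  }
}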