I am using "hadoop-0.20.203.0rc1.tar.gz" for my cluster setup.
Whenever I set job.setMapOutputKeyClass(ByteBuffer.class);
and run the job, I get the following exception:
12/01/13 15:09:00 INFO mapred.JobClient: Task Id : attempt_201201131428_0005_m_000001_2, Status : FAILED
java.lang.ClassCastException: class java.nio.ByteBuffer
at java.lang.Class.asSubclass(Class.java:3018)
at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:776)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:958)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:673)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:755)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:369)
at org.apache.hadoop.mapred.Child$4.run(Child.java:259)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
at org.apache.hadoop.mapred.Child.main(Child.java:253)
Also, I have noticed that ByteBuffer is Comparable but not Writable; does that make any difference? Let me know if any additional information is needed.
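For reference, here is roughly how the job is set up (the class and path names below are placeholders, not my actual code):

import java.nio.ByteBuffer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class ByteBufferKeyDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "bytebuffer-key-job");
    job.setJarByClass(ByteBufferKeyDriver.class);

    // Mapper/reducer setup omitted; the relevant part is the map output key class.
    job.setMapOutputKeyClass(ByteBuffer.class); // this setting is followed by the failed map attempts above
    job.setMapOutputValueClass(Text.class);

    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}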
This is where the exception is being thrown. Here is the code from SVN.
public RawComparator getOutputKeyComparator() {
  Class<? extends RawComparator> theClass =
      getClass("mapred.output.key.comparator.class", null, RawComparator.class);
  if (theClass != null)
    return ReflectionUtils.newInstance(theClass, this);
  return WritableComparator.get(getMapOutputKeyClass().asSubclass(WritableComparable.class));
}
If the mapred.output.key.comparator.class property is not defined on the JobConf, then the map output key class must implement the WritableComparable interface. ByteBuffer doesn't implement WritableComparable, hence the ClassCastException.
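As a workaround, you can carry the raw bytes in a BytesWritable, which does implement WritableComparable, and convert to/from ByteBuffer inside the mapper. A rough sketch (the mapper name and key/value types here are made up for illustration):

import java.io.IOException;
import java.nio.ByteBuffer;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Instead of emitting ByteBuffer keys, copy the bytes into a BytesWritable,
// which implements WritableComparable and works with the default WritableComparator.
public class BytesKeyMapper extends Mapper<LongWritable, Text, BytesWritable, Text> {
  private final BytesWritable outKey = new BytesWritable();

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    ByteBuffer buf = ByteBuffer.wrap(value.getBytes(), 0, value.getLength());
    // Copy the buffer's remaining bytes into the reusable BytesWritable key.
    byte[] bytes = new byte[buf.remaining()];
    buf.get(bytes);
    outKey.set(bytes, 0, bytes.length);
    context.write(outKey, value);
  }
}

In the driver this goes with job.setMapOutputKeyClass(BytesWritable.class); instead of ByteBuffer.class.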
BTW, the WritableComparable interface is a sub-interface of both Writable and Comparable.
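So if you need your own key type rather than BytesWritable, it just has to implement both sides of that contract. An illustrative sketch (the class below is made up, not from your code):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;

// Example custom key: a pair of longs that is both Writable (Hadoop can serialize it)
// and Comparable (the default WritableComparator can sort it).
public class LongPairKey implements WritableComparable<LongPairKey> {
  private long first;
  private long second;

  public LongPairKey() {}                        // Hadoop requires a no-arg constructor

  public LongPairKey(long first, long second) {
    this.first = first;
    this.second = second;
  }

  @Override
  public void write(DataOutput out) throws IOException {    // Writable: serialize
    out.writeLong(first);
    out.writeLong(second);
  }

  @Override
  public void readFields(DataInput in) throws IOException { // Writable: deserialize
    first = in.readLong();
    second = in.readLong();
  }

  @Override
  public int compareTo(LongPairKey other) {                  // Comparable: sort order
    if (first != other.first) {
      return first < other.first ? -1 : 1;
    }
    return second < other.second ? -1 : (second > other.second ? 1 : 0);
  }

  @Override
  public int hashCode() {                                     // used by HashPartitioner
    return (int) (first * 163 + second);
  }

  @Override
  public boolean equals(Object o) {
    if (!(o instanceof LongPairKey)) return false;
    LongPairKey p = (LongPairKey) o;
    return first == p.first && second == p.second;
  }
}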