 

Why do we need to set the output key/value class explicitly in the Hadoop program?

In the "Hadoop : The Definitive Guide" book, there is a sample program with the below code.

JobConf conf = new JobConf(MaxTemperature.class);  
conf.setJobName("Max temperature");  
FileInputFormat.addInputPath(conf, new Path(args[0]));  
FileOutputFormat.setOutputPath(conf, new Path(args[1]));  
conf.setMapperClass(MaxTemperatureMapper.class);  
conf.setReducerClass(MaxTemperatureReducer.class);  
conf.setOutputKeyClass(Text.class);  
conf.setOutputValueClass(IntWritable.class);  

The MapReduce framework should be able to figure out the output key and value classes from the Mapper and Reducer classes that are set on the JobConf. Why do we need to set the output key and value classes explicitly on the JobConf as well? Also, why is there no similar API for the input key/value classes?

Praveen Sripati asked Sep 18 '11


1 Answer

The reason is Java's type erasure [1]. The mapper and reducer declare their output key/value types only as generic parameters, and generics are erased at compile time, so during job setup (which happens at run time, not compile time) the framework can no longer recover those types from the class definitions.
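
For illustration, here is a sketch modeled on the book's MaxTemperatureMapper (not code quoted from the original question). The output types Text and IntWritable appear only as generic parameters on the mapper, and those parameters are exactly what erasure removes:

import java.io.IOException;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;

// The output key/value types exist only as generic parameters here;
// the compiler erases them, so they are not visible at run time.
public class MaxTemperatureMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, IntWritable> {

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, IntWritable> output,
                  Reporter reporter) throws IOException {
    // ... parse the record and emit (year, temperature) pairs ...
  }
}

At run time the framework only sees MaxTemperatureMapper.class, so the erased <Text, IntWritable> output types have to be restated explicitly with setOutputKeyClass(Text.class) and setOutputValueClass(IntWritable.class).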

The input key/value classes, on the other hand, can be read from the input file itself; in the case of SequenceFiles the classes are recorded in the file header (you can see them if you open a sequence file in a text editor). That header has to be written out, and since every map output is a SequenceFile, you need to provide the output classes yourself.
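
To see what that header stores, here is a minimal sketch (the class name PrintSeqFileHeader and the path passed in args[0] are made up for illustration) that reads the key/value classes back out of a sequence file using the old SequenceFile.Reader API:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;

public class PrintSeqFileHeader {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // args[0] is a path to an existing sequence file
    SequenceFile.Reader reader = new SequenceFile.Reader(fs, new Path(args[0]), conf);
    try {
      // The key/value classes are read straight from the file header
      System.out.println("key class:   " + reader.getKeyClassName());
      System.out.println("value class: " + reader.getValueClassName());
    } finally {
      reader.close();
    }
  }
}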

[1] http://download.oracle.com/javase/tutorial/java/generics/erasure.html

Thomas Jungblut answered Sep 24 '22