Type mismatch in value from map: expected org.apache.hadoop.io.NullWritable, recieved org.apache.hadoop.io.Text

I am trying to tweak an existing program to suit my needs.

Basically, the input is simple text. I process it in the mapper and pass a key/value pair to the reducer, where I build a JSON string, so the final output has a key but no value.

Mapper input: Text/Text

Mapper output: Text/Text

Reducer input: Text/Text

Reducer output: Text/None

My signatures are as follows:

public class AdvanceCounter {
/**
 * The map class of WordCount.
 */
public static class TokenCounterMapper
    extends Mapper<Object, Text, Text, Text> { // <--- See this signature

   public void map(Object key, Text value, Context context) // <--- See this signature
        throws IOException, InterruptedException {

     context.write(key,value); //both are of type text OUTPUT TO REDUCER
    }
}
   public static class TokenCounterReducer
    extends Reducer<Text, Text, Text, NullWritable> { // <--- See this signature, NullWritable here
    public void reduce(Text key, Iterable<Text> values, Context context) // <--- See this signature
        throws IOException, InterruptedException {


        for (Text value : values) {
            JSONObject jsn = new JSONObject();
            //String output = "";
            String[] vals = value.toString().split("\t");
            String[] targetNodes = vals[0].toString().split(",",-1);
            try {
                jsn.put("source",vals[1]);
                jsn.put("targets",targetNodes);
                context.write(new Text(jsn.toString()),null); // no value 
            } catch (JSONException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }

        }


    }
}
public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    Job job = new Job(conf, "Example Hadoop 0.20.1 WordCount");

    // ...
    //
    job.setOutputValueClass(NullWritable.class);
    FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
    FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
}

}

But on execution I am getting this error:

13/06/04 13:08:26 INFO mapred.JobClient: Task Id : attempt_201305241622_0053_m_000008_0, Status : FAILED
java.io.IOException: Type mismatch in value from map: expected org.apache.hadoop.io.NullWritable, recieved org.apache.hadoop.io.Text
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1019)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:691)
    at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
    at org.sogou.Stinger$TokenCounterMapper.map(Stinger.java:72)
    at org.sogou.Stinger$TokenCounterMapper.map(Stinger.java:1)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
asked Jun 04 '13 by frazman

1 Answer

You haven't specified your map output types, so Hadoop assumes they are the same as the ones you set for your reducer output, i.e. Text and NullWritable, which is incorrect for your mapper. To avoid any confusion, it's better to specify all the types explicitly for both the mapper and the reducer:

job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(Text.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(NullWritable.class);
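
For reference, here is a minimal sketch of the imports and a revised main method (inside AdvanceCounter) showing where those four calls fit; the job name is illustrative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    Job job = new Job(conf, "json output example"); // illustrative job name

    job.setJarByClass(AdvanceCounter.class);
    job.setMapperClass(TokenCounterMapper.class);
    job.setReducerClass(TokenCounterReducer.class);

    // Map output types differ from the job's final output types, so set both pairs:
    job.setMapOutputKeyClass(Text.class);        // mapper emits Text keys
    job.setMapOutputValueClass(Text.class);      // mapper emits Text values
    job.setOutputKeyClass(Text.class);           // reducer emits the JSON string as the key
    job.setOutputValueClass(NullWritable.class); // reducer emits no value

    FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
    FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
}

Note also that with NullWritable as the reducer's output value type, the usual idiom is context.write(new Text(jsn.toString()), NullWritable.get()) rather than passing null.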
answered by Charles Menguy