When I set mapreduce.task.io.sort.mb = 100000, I get the following exception:
java.lang.Exception: java.io.IOException: Invalid "mapreduce.task.io.sort.mb": 100000
What is the maximum value for mapreduce.task.io.sort.mb?
"mapreduce. task. io. sort. mb" is the total amount of buffer memory to use while sorting files, in megabytes.
xml. Lists the parameters for MapReduce configuration. MapReduce is a type of application that can run on the Hadoop 2. x framework.
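If you set this property through the Java API rather than in mapred-site.xml, a minimal sketch looks like the following (the class name and the 1024 MB value are only illustrative):

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class SortMbExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Values above 2047 make map tasks fail with
        // java.io.IOException: Invalid "mapreduce.task.io.sort.mb"
        conf.setInt("mapreduce.task.io.sort.mb", 1024); // 1 GB sort buffer
        Job job = Job.getInstance(conf, "sort-mb-example");
        // ... set mapper, reducer, input and output paths as usual
    }
}
```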
I realize this question is old, but for those asking the same question, you can check out some of the bugs around this value being capped:
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/bk_releasenotes_hdp_2.1/content/ch_relnotes-hdpch_relnotes-hdp-2.1.1-knownissues-mapreduce.html
BUG-12005: Mapreduce.task.io.sort.mb is capped at 2047.
Problem: mapreduce.task.io.sort.mb is hardcoded to not allow values larger than 2047. If you enter a value larger than this, the map tasks will always crash at this line:
https://github.com/apache/hadoop-mapreduce/blob/HDFS-641/src/java/org/apache/hadoop/mapred/MapTask.java?source=cc#L746
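For context on why the cap exists: the sort buffer is allocated as a single Java byte[], so the configured size in megabytes, shifted left by 20 to convert to bytes, must fit in a signed 32-bit array index, which limits it to 2047. Below is a simplified sketch of the kind of check MapTask performs, not a verbatim copy of the Hadoop source; the class and method names are illustrative:

```java
import java.io.IOException;

// Simplified sketch of the validation done in
// org.apache.hadoop.mapred.MapTask (MapOutputBuffer); not verbatim Hadoop code.
public class SortMbCheck {
    static byte[] allocateSortBuffer(int sortmb) throws IOException {
        // 0x7FF == 2047: the buffer is one Java byte[], so
        // sortmb << 20 bytes must fit in a signed 32-bit array index.
        if ((sortmb & 0x7FF) != sortmb) {
            throw new IOException("Invalid \"mapreduce.task.io.sort.mb\": " + sortmb);
        }
        int maxMemUsage = sortmb << 20;  // megabytes -> bytes
        return new byte[maxMemUsage];    // the in-memory sort buffer
    }

    public static void main(String[] args) throws IOException {
        allocateSortBuffer(128);         // ok: 128 MB buffer
        try {
            allocateSortBuffer(100000);  // rejected, same error as in the question
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```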
In hadoop-2.6.0, org.apache.hadoop.mapred.MapTask.java,
line 427: we can't set mapreduce.task.io.sort.mb to exceed 2047.