Error: Java heap space

Tags:

hadoop

In Ubuntu, when I run the Hadoop example:

$bin/hadoop jar hadoop-examples-1.0.4.jar grep input output 'dfs[a-z.]+' 

$echo $HADOOP_HEAPSIZE
2000

In the log, I am getting this error:

INFO mapred.JobClient: Task Id : attempt_201303251213_0012_m_000000_2, Status : FAILED
Error: Java heap space
13/03/25 15:03:43 INFO mapred.JobClient: Task Id : attempt_201303251213_0012_m_000001_2, Status : FAILED
Error: Java heap space
13/03/25 15:04:28 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201303251213_0012_m_000000
java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
    at org.apache.hadoop.examples.Grep.run(Grep.java:69)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.Grep.main(Grep.java:93)

Could someone explain what the problem is?

asked Mar 25 '13 by Senthil Porunan

People also ask

How do I fix Java heap space error?

An easy way to solve java.lang.OutOfMemoryError: Java heap space is to increase the maximum heap size using the JVM option -Xmx, for example "-Xmx512M"; if the application simply needs more memory than the current limit, this resolves the error immediately.
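
In command form, with MyApp as a hypothetical main class, that option is passed like this:

java -Xmx512M MyApp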

How do I free up heap space?

Once an object is no longer referenced by any other object, it can be cleared out of the heap so that the JVM can reclaim and reuse the space. The execution thread responsible for clearing the heap is the garbage collector.
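
A minimal Java sketch of that lifecycle (class and variable names are illustrative; System.gc() is only a hint, never a guarantee):

public class GcSketch {
    public static void main(String[] args) {
        byte[] buffer = new byte[10 * 1024 * 1024]; // 10 MB, reachable through 'buffer'
        buffer = null; // drop the only reference; the array becomes eligible for collection
        System.gc();   // ask the JVM to run the garbage collector; it decides if and when
    }
}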

What causes java.lang.OutOfMemoryError: Java heap space?

The java.lang.OutOfMemoryError exception is usually thrown when there is insufficient space to allocate an object in the Java heap: the garbage collector cannot make space available to accommodate a new object, and the heap cannot be expanded further.
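
A self-contained sketch that reproduces that condition (class name is illustrative): every allocation stays reachable, so the garbage collector can free nothing and the heap eventually overflows.

import java.util.ArrayList;
import java.util.List;

public class HeapFiller {
    public static void main(String[] args) {
        List<byte[]> hoard = new ArrayList<byte[]>();
        while (true) {
            hoard.add(new byte[1024 * 1024]); // 1 MB per iteration, never released
        }
    }
}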

How do you fix heap memory problems?

There are several ways to eliminate a heap memory issue: increase the maximum amount of heap available to the VM using the -Xmx argument, use partitioning to distribute the data over additional machines, or overflow or expire the region data to reduce the heap memory footprint of the regions.



2 Answers

Clearly you have run out of the heap space allotted to Java, so you should try increasing it.

To do that, you can run the following before executing the hadoop command:

export HADOOP_OPTS="-Xmx4096m"
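
For example, in the same shell session, using the command from the question:

export HADOOP_OPTS="-Xmx4096m"
bin/hadoop jar hadoop-examples-1.0.4.jar grep input output 'dfs[a-z.]+'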

Alternatively, you can achieve the same thing permanently by adding the following setting to your mapred-site.xml file, which lives in HADOOP_HOME/conf/:

<property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx4096m</value>
</property>
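
If you'd rather not edit the config file, the grep example runs through ToolRunner (see the stack trace above), so the same setting should also be accepted per job via the generic -D option, along these lines:

bin/hadoop jar hadoop-examples-1.0.4.jar grep -Dmapred.child.java.opts=-Xmx4096m input output 'dfs[a-z.]+'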

This sets your Java heap space to 4096 MB (4 GB); you could also try a lower value first to see whether that is enough. If it still doesn't work, increase the value further, provided your machine has the memory to support it; otherwise, move to a machine with more memory and try there. A heap space error simply means Java doesn't have enough RAM available.

UPDATE: For Hadoop 2+, set mapreduce.map.java.opts instead.
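
A sketch of the Hadoop 2+ equivalent (mapreduce.reduce.java.opts is the matching setting for reduce tasks):

<property>
    <name>mapreduce.map.java.opts</name>
    <value>-Xmx4096m</value>
</property>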

answered Sep 21 '22 by Amar


<property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx4096m</value>
</property>

works for me, whereas

export HADOOP_OPTS="-Xmx4096m"

does not.

answered Sep 19 '22 by JohnDavid