
Container killed by YARN for exceeding memory limits. 52.6 GB of 50 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead

Running a Spark job over 1 TB of data with the following configuration:

33 GB executor memory
40 executors
5 cores per executor
17 GB memoryOverhead

What are the possible reasons for this error?

asked Sep 27 '22 by Renu

1 Answer

Where did you get that warning from? Which particular logs? You're lucky you even get a warning :). Indeed 17 GB seems like enough, but then you do have 1 TB of data. I've had to use more like 30 GB for less data than that.
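To spell out where the 50 GB in the message comes from (as I read your numbers): YARN caps the container at roughly executor memory plus overhead, i.e. 33 GB + 17 GB = 50 GB, and your executors actually touched 52.6 GB, about 2.6 GB over the line. So either the overhead has to grow or the heap has to shrink to make room for it.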

The reason for the error is that YARN uses extra memory for the container that doesn't live in the memory space of the executor. I've noticed that more tasks (partitions) means much more memory used, and shuffles are generally heavier; other than that I haven't seen any other pattern in what I do. Something, somehow, is eating memory unnecessarily.
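One thing you can try is asking for more overhead explicitly and trimming the heap so the two still fit inside the container YARN is willing to give you. A rough sketch of what that might look like -- the property names come from the error message and the Spark docs, but the specific values and the app name are placeholders, not a tuned recommendation:

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Trade a few GB of heap for overhead so heap + overhead stays at ~50 GB.
    // Values here are placeholders; tune them against your own container limits.
    val conf = new SparkConf()
      .set("spark.executor.memory", "30g")                // JVM heap per executor
      .set("spark.yarn.executor.memoryOverhead", "20480") // off-heap overhead, in MB
      .set("spark.executor.cores", "5")
      .set("spark.executor.instances", "40")

    val spark = SparkSession.builder()
      .appName("memory-overhead-sketch") // placeholder name
      .config(conf)
      .getOrCreate()

    // Fewer, larger partitions also tend to keep overhead pressure down, e.g.
    // df.repartition(2000) before a heavy shuffle -- the right count depends on your data.

The same settings can of course be passed as --conf flags to spark-submit instead of being hard-coded in the driver.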

It seems the world is moving to Mesos; maybe it doesn't have this problem. Even better, just use Spark standalone.

More info: http://www.wdong.org/wordpress/blog/2015/01/08/spark-on-yarn-where-have-all-my-memory-gone/. This link seems kinda dead (it was a deep dive into the way YARN gobbles memory). This link may work: http://m.blog.csdn.net/article/details?id=50387104. If not, try googling "spark on yarn where have all my memory gone".

answered Sep 29 '22 by samthebest