 

Required executor memory is above the max threshold of this cluster

I am running Spark on an 8-node cluster with YARN as the resource manager. Each node has 64 GB of memory. I set the executor memory to 25 GB, but I get this error:

    Required executor memory (25600MB) is above the max threshold (16500 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.

I then set yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb to 25600, but nothing changed.
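For reference, the submission looks roughly like this (a sketch; the jar and class names are placeholders for the actual application):

    # Placeholder app jar and class; --executor-memory is the relevant flag.
    spark-submit \
      --master yarn \
      --executor-memory 25g \
      --class com.example.MyApp \
      my-app.jar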

asked Nov 24 '25 by user9332151

1 Answer

Executor memory is only the heap portion of the container. You still have to run the JVM itself and allocate the non-heap (overhead) portion of memory inside the same container, and the whole container has to fit within YARN's limits. See the memory-layout diagram in How-to: Tune Your Apache Spark Jobs (Part 2) by Sandy Ryza.
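As a rough check (a sketch assuming Spark's default overhead of max(384 MB, 0.10 × heap)), the container YARN must grant for a 25 GB executor is well above both the 16500 MB threshold and the 25600 MB limit you tried:

    # Container request = executor heap + memory overhead.
    # Stock Spark defaults the overhead to max(384 MB, 0.10 * heap).
    executor_memory_mb=25600
    overhead_mb=$(( executor_memory_mb / 10 ))            # 2560 MB, above the 384 MB floor
    container_mb=$(( executor_memory_mb + overhead_mb ))
    echo "${container_mb} MB"                             # prints: 28160 MB

That ~28160 MB request is also why raising the limits to exactly 25600 MB changed nothing: the container still does not fit.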

If you want to use 25 GB of executor memory, I suggest you bump yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb to something higher, like 42 GB.
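A minimal yarn-site.xml sketch of that change (the property names are the standard YARN ones; 43008 MB is just 42 GB expressed in megabytes):

    <!-- yarn-site.xml: raise the per-container ceiling and the total
         memory a NodeManager may hand out. 43008 MB = 42 GB. -->
    <property>
      <name>yarn.scheduler.maximum-allocation-mb</name>
      <value>43008</value>
    </property>
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>43008</value>
    </property>

Restart the ResourceManager and NodeManagers after editing the file; YARN only reads these properties at startup, which is another common reason a change appears to have no effect.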

answered Nov 27 '25 by tk421


