 

Setting "spark.memory.storageFraction" in Spark does not work

Tags:

apache-spark

I am trying to tune Spark's memory parameters. I tried:

sparkSession.conf.set("spark.memory.storageFraction","0.1") //sparkSession has been created

After submitting the job, I checked the Spark UI and found that "Storage Memory" was still the same as before, so the setting above did not take effect.

What is the correct way to set "spark.memory.storageFraction"?

I am using Spark 2.0.

asked Apr 20 '17 by derek

People also ask

How is memory management done in Spark?

Earlier, memory management was done by the StaticMemoryManager class; in Spark 1.6 and above it is handled by the UnifiedMemoryManager class. For the sake of illustration, we will take an example of 4GB of memory allocated to an executor, keep the default configuration, and see how much memory each segment gets.

What percentage of Spark memory is initial storage?

Initial storage memory (50% of Spark memory) comes to 1423MB, or 34.75% of executor memory. These percentages apply only to the 4GB executor example; with a different executor memory configuration they will not hold. They are given for understanding purposes only.
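As a sanity check, the arithmetic behind those figures goes roughly like this (a sketch; the quoted numbers match spark.memory.fraction = 0.75, the Spark 1.6 default, whereas Spark 2.x defaults to 0.6):

// Worked breakdown for the hypothetical 4GB executor (values in MB).
val executorMemory = 4096.0
val reservedMemory = 300.0                             // fixed reserved chunk
val usableMemory   = executorMemory - reservedMemory   // 3796
val sparkMemory    = usableMemory * 0.75               // 2847 (fraction = 0.75)
val storageMemory  = sparkMemory * 0.5                 // 1423.5 -> "1423MB"
val share          = storageMemory / executorMemory    // 0.3475 -> "34.75%"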

What is reserved memory in Spark?

This memory stores Spark's internal objects. One thing to note is that if the executor memory is less than 1.5 times the reserved memory, Spark will fail with a "Please use a larger heap size" error message.
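The check behind that error message looks roughly like this (paraphrased from UnifiedMemoryManager.getMaxMemory; the exact wording varies across Spark versions):

// Sketch of Spark's minimum-heap validation, paraphrased from source.
val reservedMemory  = 300L * 1024 * 1024             // 300MB reserved
val systemMemory    = Runtime.getRuntime.maxMemory   // executor heap size
val minSystemMemory = (reservedMemory * 1.5).toLong
if (systemMemory < minSystemMemory) {
  throw new IllegalArgumentException(
    s"System memory $systemMemory must be at least $minSystemMemory. " +
      "Please use a larger heap size.")
}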

What is the use of memory fraction in Spark?

Execution and storage share a unified memory region, so storage blocks can occupy parts of execution memory if it is free, and vice versa. The parameter spark.memory.fraction determines the total memory dedicated to Spark (for both execution and storage). The amount of storage memory that is protected from eviction is governed by spark.memory.storageFraction.
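For completeness, since the question asks for the correct way to set it: memory settings are read when the executors start, so they must be in place before the SparkSession is created; calling conf.set(...) on a live session is too late. A minimal sketch (the app name and values are illustrative, assuming Spark 2.x in Scala):

import org.apache.spark.sql.SparkSession

// Set memory fractions before the session (and its executors) exist.
val spark = SparkSession.builder()
  .appName("storage-fraction-demo")                // hypothetical name
  .config("spark.memory.fraction", "0.6")
  .config("spark.memory.storageFraction", "0.1")
  .getOrCreate()

Equivalently, the setting can be passed at submit time, e.g. spark-submit --conf spark.memory.storageFraction=0.1 ...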


1 Answer

I faced the same problem. After reading some of the Spark source code on GitHub, I think the "Storage Memory" shown on the Spark UI is misleading: it does not indicate the size of the storage region. It actually represents maxMemory:

maxMemory = (executorMemory - reservedMemory [default 300MB]) * memoryFraction [default 0.6]
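To see why changing spark.memory.storageFraction does not move that number, plug in some values (a rough sketch assuming a hypothetical 4GB executor and the Spark 2.x defaults):

// "Storage Memory" on the executors page is maxMemory, and the
// formula contains no storageFraction term at all.
val executorMemory = 4096L   // MB, hypothetical
val reservedMemory = 300L    // MB, the reserved system memory
val memoryFraction = 0.6     // spark.memory.fraction default in Spark 2.x
val maxMemory = ((executorMemory - reservedMemory) * memoryFraction).toLong
// = 2277 MB, unchanged whether storageFraction is 0.1 or 0.5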

See these for more detail:

Spark UI executors-page source code

getMaxMemory source code

answered Oct 08 '22 by FelixHo