I'm trying to understand how Spark 2.1.0 allocates memory on nodes.
Suppose I'm starting a local PySpark REPL assigning it 2GB of memory:
$ pyspark --conf spark.driver.memory=2g
The Spark UI shows 956.6 MB allocated for storage memory.
I don't understand how to get to that number. This is my thinking process (sketched in code below):

- Start from the assigned 2048 MB.
- (2048 MB - 300 MB) * 0.6 = 1048.8 MB are used for both execution and storage regions (unified memory).
- 1048.8 MB * 0.5 = 524.4 MB within the unified region should be reserved as the immune storage region.
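Here is the same arithmetic as a quick Scala sketch (my own back-of-the-envelope code, not Spark internals):

// The naive calculation from my reasoning above (all values in MB)
val total    = 2048.0
val reserved = 300.0                     // reserved system memory
val unified  = (total - reserved) * 0.6  // 1048.8 MB: execution + storage (unified)
val storage  = unified * 0.5             // 524.4 MB: "immune" storage region
println(s"unified = $unified MB, storage = $storage MB")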
So, how was the value of 956.6 MB actually calculated in Spark?
You seem to be using local mode (with one driver that also acts as the only executor), but the following should also apply to clustered modes.

Enable the INFO logging level for BlockManagerMasterEndpoint to know how much memory Spark sees for the property you set on the command line (as spark.driver.memory), e.g. by adding the following line to conf/log4j.properties:
log4j.logger.org.apache.spark.storage.BlockManagerMasterEndpoint=INFO
When you start spark-shell with --conf spark.driver.memory=2g, you'll see the following:
$ ./bin/spark-shell --conf spark.driver.memory=2g
...
17/05/07 15:20:50 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.8:57177 with 912.3 MB RAM, BlockManagerId(driver, 192.168.1.8, 57177, None)
As you can see, the available memory is 912.3 MB, which is calculated as follows (see UnifiedMemoryManager.getMaxMemory):
// local mode with --conf spark.driver.memory=2g
scala> sc.getConf.getSizeAsBytes("spark.driver.memory")
res0: Long = 2147483648

// the maximum memory the JVM will attempt to use
scala> val systemMemory = Runtime.getRuntime.maxMemory

// fixed amount of memory for non-storage, non-execution purposes
scala> val reservedMemory = 300 * 1024 * 1024

// minimum system memory required
scala> val minSystemMemory = (reservedMemory * 1.5).ceil.toLong

scala> val usableMemory = systemMemory - reservedMemory

scala> val memoryFraction = sc.getConf.getDouble("spark.memory.fraction", 0.6)

scala> val maxMemory = (usableMemory * memoryFraction).toLong
maxMemory: Long = 956615884

scala> import org.apache.spark.network.util.JavaUtils
scala> JavaUtils.byteStringAsMb(maxMemory + "b")
res1: Long = 912
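Putting those steps together, here is a minimal standalone sketch of that logic (my own condensation of UnifiedMemoryManager.getMaxMemory, not the actual Spark source):

// Condensed sketch of UnifiedMemoryManager.getMaxMemory (Spark 2.1.0),
// assuming the defaults used above.
def maxUnifiedMemory(systemMemory: Long, memoryFraction: Double = 0.6): Long = {
  val reservedMemory  = 300L * 1024 * 1024                  // 300 MB reserved
  val minSystemMemory = (reservedMemory * 1.5).ceil.toLong  // 450 MB minimum
  require(systemMemory >= minSystemMemory,
    s"System memory $systemMemory must be at least $minSystemMemory.")
  val usableMemory = systemMemory - reservedMemory
  (usableMemory * memoryFraction).toLong
}

// usage: maxUnifiedMemory(Runtime.getRuntime.maxMemory) == 956615884 on the 2g driver above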
Let's review how the web UI calculates the memory (which differs from the above, even though it is supposed to just display the value!). That's the surprising part.

How the Storage Memory is displayed in the web UI is controlled by the custom JavaScript function formatBytes in utils.js, which (translated to Scala) looks as follows:
def formatBytes(bytes: Double) = {
  // NOTE: the unit base is 1000 (decimal), not 1024 (binary)
  val k = 1000
  // index of the unit (B, KB, MB, ...) the value falls into
  val i = math.floor(math.log(bytes) / math.log(k))
  val maxMemoryWebUI = bytes / math.pow(k, i)
  f"$maxMemoryWebUI%1.1f"
}
scala> println(formatBytes(maxMemory))
956.6
956.6! That's exactly what the web UI shows, and it's quite different from what Spark's UnifiedMemoryManager considers the available memory. Quite surprising, isn't it?
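To see exactly where the two numbers come from, here is a side-by-side sketch (my own helper, not Spark code) that formats the same byte count with a decimal base, as the web UI does, and with a binary base, as the log message does:

// Formats a byte count using the given unit base.
def format(bytes: Double, base: Double): String = {
  val i = math.floor(math.log(bytes) / math.log(base))
  f"${bytes / math.pow(base, i)}%1.1f"
}

println(format(956615884d, 1000))  // 956.6 -- what the web UI shows (decimal MB)
println(format(956615884d, 1024))  // 912.3 -- what BlockManagerMasterEndpoint logs (binary MiB)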
I think it's a bug and filed it as SPARK-20691.