I have set spark.executor.memory to 2048m, and on the UI "Environment" page I can see that this value has been set correctly. But on the "Executors" page, I see only one executor, and its memory is 265.4 MB. That is a very strange value: why not 256 MB, or simply what I set?
What am I missing here?
Every Spark executor in an application has the same fixed number of cores and the same fixed heap size. The number of cores can be specified with the --executor-cores flag when invoking spark-submit, spark-shell, or pyspark from the command line, or by setting the spark.executor.cores property in the spark-defaults.conf file.
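For example, a minimal sketch of both ways of setting these values (the application file name and the sizes here are illustrative, not taken from the question):

```
# Per application, on the command line:
spark-submit \
  --executor-cores 2 \
  --executor-memory 2048m \
  my_app.py

# Or as a default for all applications, in conf/spark-defaults.conf:
spark.executor.cores   2
spark.executor.memory  2048m
```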
The "Executors" tab on the UI also includes the driver in the list. Its "executor ID" is listed as <driver>
. This process is not started by Spark, so it is not affected by spark.executor.memory
.
spark-submit
, its maximal memory can be controlled by spark.driver.memory
or --driver-memory
-Xmx
Java flag.Please see the following question for the 265.4MB memory size...
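A minimal sketch of controlling the driver's heap (the 4g value is illustrative):

```
# On the command line:
spark-submit --driver-memory 4g my_app.py

# Or in conf/spark-defaults.conf:
spark.driver.memory  4g
```

Note that spark.driver.memory must be supplied before launch (via spark-submit or spark-defaults.conf); setting it on the SparkConf inside the application has no effect in client mode, because by that point the driver JVM has already started with its heap size fixed.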
Please see the following question for an explanation of the 265.4 MB memory size:
How to set Apache Spark Executor memory
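In brief, a hedged sketch of where such a number comes from under older (pre-1.6) Spark memory management, which that question covers: the UI reports the memory available for caching, not the whole heap, i.e. the heap scaled down by spark.storage.memoryFraction (default 0.6) and spark.storage.safetyFraction (default 0.9). With the default 512 MB driver heap, the JVM reports a maximum a little below 512 MB, which lands at roughly the value shown:

```python
# Hedged sketch, not the exact Spark source: legacy storage-memory estimate.
max_memory_mb = 491.5   # approx. Runtime.getRuntime().maxMemory() for a 512 MB heap
memory_fraction = 0.6   # spark.storage.memoryFraction (legacy default)
safety_fraction = 0.9   # spark.storage.safetyFraction (legacy default)

# Memory the UI reports as available for storage:
print(round(max_memory_mb * memory_fraction * safety_fraction, 1))  # ~265.4
```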