
How to set Spark executor memory?

Tags:

apache-spark

I have set spark.executor.memory to 2048m, and on the UI "Environment" page I can see that the value has been set correctly. But on the "Executors" page I see only one executor, and its memory is 265.4 MB. That is a very strange value: why not 256 MB, or simply what I set?

What am I missing here?
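For reference, a minimal sketch of how that setting is typically applied; the application class and jar names below are placeholders, not taken from the question:

    # Set executor memory via a configuration property:
    spark-submit --conf spark.executor.memory=2048m --class MyApp app.jar

    # Or with the equivalent dedicated flag:
    spark-submit --executor-memory 2048m --class MyApp app.jar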

David S. asked Mar 16 '15

People also ask

How do I set executor cores in Spark?

Every Spark executor in an application has the same fixed number of cores and the same fixed heap size. The number of cores can be specified with the --executor-cores flag when invoking spark-submit, spark-shell, or pyspark from the command line, or by setting the spark.executor.cores property in the spark-defaults.conf file.
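A minimal sketch of both approaches (MyApp and app.jar are placeholders):

    # Per-job, as a spark-submit flag:
    spark-submit --executor-cores 4 --class MyApp app.jar

    # Or as a cluster-wide default, one line in conf/spark-defaults.conf:
    spark.executor.cores 4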


2 Answers

The "Executors" tab on the UI also includes the driver in the list. Its "executor ID" is listed as <driver>. This process is not started by Spark, so it is not affected by spark.executor.memory.

  • If you start the driver with spark-submit, its maximum memory can be controlled by spark.driver.memory or the --driver-memory flag (see the sketch below)
  • If you start it as a plain old Java program, use the usual -Xmx Java flag.
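A minimal sketch of both options; the class, jar, and classpath names are placeholders:

    # With spark-submit, the driver heap must be sized before its JVM starts,
    # so pass the setting on the command line (or put it in spark-defaults.conf):
    spark-submit --driver-memory 4g --class MyApp app.jar

    # As a plain Java program, use the standard JVM heap flag:
    java -Xmx4g -cp "app.jar:<spark-classpath>" MyApp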
Daniel Darabos answered Dec 04 '22


Please see the following question for an explanation of the 265.4 MB memory size:

How to set Apache Spark Executor memory

ᐅdevrimbaris answered Dec 04 '22