
Increase Spark memory when using local[*]

How do I increase Spark memory when using local[*]?

I tried setting the memory like this:

  import org.apache.spark.SparkConf

  val conf = new SparkConf()
    .set("spark.executor.memory", "1g")
    .set("spark.driver.memory", "4g")
    .setMaster("local[*]")
    .setAppName("MyApp")

But I still get:

MemoryStore: MemoryStore started with capacity 524.1 MB

Does this have something to do with:

.setMaster("local[*]")
asked Sep 21 '15 by BAR


People also ask

How do I increase Spark memory?

To enlarge the Spark shuffle service memory, set SPARK_DAEMON_MEMORY in $SPARK_HOME/conf/spark-env.sh (the default value is 2g), then restart the shuffle service for the change to take effect.
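
For example (a hedged sketch; the 4g value is only an illustration), the change in spark-env.sh is a single line:

export SPARK_DAEMON_MEMORY=4g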

What is the maximum executor memory in Spark?

Talking about driver memory first: the amount of memory a driver requires depends on the job being executed. For executors, the executor-memory flag controls the executor heap size (similarly under YARN and Slurm); the default value is 512 MB per executor.
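
For illustration (the 2g value is arbitrary and not from the original page), the executor heap can be raised at submit time with that flag:

./bin/spark-submit --executor-memory 2g --class main.class yourApp.jar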


2 Answers

Assuming that you are using the spark-shell: setting spark.driver.memory in your application isn't working because your driver process has already started with the default memory.

You can either launch your spark-shell using:

./bin/spark-shell --driver-memory 4g

or you can set it in spark-defaults.conf:

spark.driver.memory 4g

If you are launching an application using spark-submit, you must specify the driver memory as an argument:

./bin/spark-submit --driver-memory 4g --class main.class yourApp.jar
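
As a quick sanity check (a sketch of my own, not part of the original answer; the app name and 4g value just mirror the question), an application launched this way can print what the driver JVM actually received:

  import org.apache.spark.{SparkConf, SparkContext}

  // Memory must already be fixed when the driver JVM starts, so only
  // non-memory settings remain in the programmatic conf.
  val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("MyApp")
  val sc = new SparkContext(conf)

  // Value passed via --driver-memory or spark-defaults.conf, if any:
  println(sc.getConf.getOption("spark.driver.memory"))          // e.g. Some(4g)
  // Max heap the JVM actually got:
  println(Runtime.getRuntime.maxMemory / (1024 * 1024) + " MB")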
answered Oct 13 '22 by Gillespie


I was able to solve this by running SBT with:

sbt -mem 4096

However, the MemoryStore capacity is only about half that size. Still looking into where this fraction comes from.
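
A likely explanation (my own note, assuming the Spark 1.x legacy memory manager that was current in 2015): the MemoryStore capacity is roughly heap * spark.storage.memoryFraction (default 0.6) * spark.storage.safetyFraction (default 0.9), i.e. about 54% of the heap, which is also roughly consistent with the 524.1 MB shown in the question for a default ~1 GB driver heap. A back-of-the-envelope check:

  // Rough check, assuming Spark 1.x defaults:
  //   spark.storage.memoryFraction = 0.6
  //   spark.storage.safetyFraction = 0.9
  val heapMb = 4096.0                      // sbt -mem 4096
  val memoryStoreMb = heapMb * 0.6 * 0.9   // ≈ 2212 MB, about half the heap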

answered Oct 13 '22 by BAR