How do I increase Spark memory when using local[*]?
I tried setting the memory like this:
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.executor.memory", "1g")
  .set("spark.driver.memory", "4g")
  .setMaster("local[*]")
  .setAppName("MyApp")
But I still get:
MemoryStore: MemoryStore started with capacity 524.1 MB
Does this have something to do with:
.setMaster("local[*]")
To enlarge the Spark shuffle service memory, set SPARK_DAEMON_MEMORY in $SPARK_HOME/conf/spark-env.sh (the default value is 2g), then restart the shuffle service for the change to take effect.
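For example, the entry in spark-env.sh would look like this (4g is just an illustrative value):

# In $SPARK_HOME/conf/spark-env.sh
# Memory for Spark daemon processes, including the external shuffle service
SPARK_DAEMON_MEMORY=4g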
Now, talking about driver memory: the amount of memory a driver requires depends on the job to be executed. In Spark, the --executor-memory flag (the spark.executor.memory property) controls the executor heap size, and the same applies on YARN and Slurm; the default value is 512 MB per executor.
Assuming that you are using the spark-shell: setting spark.driver.memory in your application isn't working because the driver JVM has already started with the default memory by the time your SparkConf is applied.
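You can confirm what the driver actually got by checking the JVM's maximum heap from inside the shell (a quick illustrative check using plain JVM calls, not a Spark API):

// Run inside spark-shell: report the driver JVM's maximum heap
val maxHeapMb = Runtime.getRuntime.maxMemory / (1024 * 1024)
println(s"Driver max heap: $maxHeapMb MB")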
You can either launch your spark-shell using:
./bin/spark-shell --driver-memory 4g
or you can set it in spark-defaults.conf:
spark.driver.memory 4g
If you are launching an application with spark-submit, you must specify the driver memory as an argument, for the same reason:
./bin/spark-submit --driver-memory 4g --class main.class yourApp.jar
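In local[*] mode the executors run inside the driver JVM, so the driver's heap is the real limit and spark.executor.memory has little effect. A minimal sketch of an app built with this in mind (the heap printout is just illustrative):

import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    // No memory settings here on purpose: in local mode they must come from
    // spark-submit (--driver-memory 4g) or spark-defaults.conf, because the
    // driver JVM is already running before this code executes.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("MyApp")
    val sc = new SparkContext(conf)
    // In local[*] the executors live inside this same JVM, so this heap
    // bounds everything.
    println(s"Driver max heap: ${Runtime.getRuntime.maxMemory / (1024 * 1024)} MB")
    sc.stop()
  }
}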
I was able to solve this by running SBT with:
sbt -mem 4096
However, the MemoryStore capacity is still only about half of the heap. That fraction comes from Spark's memory management: in Spark before 1.6, storage memory is heap * spark.storage.memoryFraction (default 0.6) * spark.storage.safetyFraction (default 0.9), i.e. roughly 0.54 of the heap; in Spark 1.6+, the unified memory region is sized by spark.memory.fraction instead.
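A back-of-the-envelope check under the legacy (pre-1.6) model; the observed 524.1 MB is a bit lower than this estimate because the JVM reports slightly less usable heap than -Xmx:

// Legacy (pre-1.6) storage-memory estimate for a 1 GB driver heap
val heapMb = 1024.0
val memoryFraction = 0.6 // spark.storage.memoryFraction default
val safetyFraction = 0.9 // spark.storage.safetyFraction default
println(f"Estimated MemoryStore capacity: ${heapMb * memoryFraction * safetyFraction}%.1f MB") // prints 553.0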