Running a Spark application in local mode, I used a command such as:
spark-submit --master local[*] my_spark_application.py
In this case, does this mean that my application uses all the memory of my local computer? Do the other parameters, like driver-memory and executor-memory, still take effect?
Local mode, also known as Spark in-process, is the default mode of Spark. It does not require any resource manager: everything runs on the same machine, inside a single JVM. Because of local mode, you can simply download Spark and run it without having to install any resource manager.
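A minimal sketch of passing memory settings on the command line (the 4g value is illustrative):

```shell
# In local mode everything runs in one JVM, so only the driver's memory matters.
# --driver-memory sizes that single JVM; --executor-memory has no effect here.
spark-submit \
  --master 'local[*]' \
  --driver-memory 4g \
  my_spark_application.py
```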
Setting driver memory is the only way to increase memory for a Spark application running in local mode.
"Since you are running Spark in local mode, setting spark.executor.memory won't have any effect, as you have noticed. The reason for this is that the Worker "lives" within the driver JVM process that you start when you start spark-shell, and the default memory used for that is 512M. You can increase that by setting spark.driver.memory to something higher, for example 5g" — from How to set Apache Spark Executor memory
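The same setting can also be applied programmatically. A minimal sketch, assuming PySpark is installed (the 5g value and app name are illustrative; the setting only takes effect because it is configured before the driver JVM starts):

```python
from pyspark.sql import SparkSession

# spark.driver.memory must be set before the driver JVM launches,
# so configure it on the builder (or via spark-submit), not after getOrCreate().
spark = (
    SparkSession.builder
    .master("local[*]")
    .config("spark.driver.memory", "5g")  # sizes the single local-mode JVM
    .appName("local-memory-example")     # illustrative name
    .getOrCreate()
)

# Inspect the effective setting
print(spark.conf.get("spark.driver.memory"))
spark.stop()
```

Changing spark.driver.memory on an already-running session has no effect, because the JVM heap size cannot be changed after startup.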
It depends on how Spark is run. If you install Spark directly on your machine, without any container, it can take all of your local memory. So I recommend running Spark inside a Docker container, which takes about 220 MB by default.
1. First, install Docker;
2. then create a container;
3. install Spark into the container.
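The steps above can be sketched as shell commands, assuming the official apache/spark image from Docker Hub (image tag and container name are illustrative):

```shell
# Pull an official Spark image (pick a tag matching your Spark version)
docker pull apache/spark:3.5.0

# Start a container with a hard memory cap and open a shell inside it;
# --memory limits how much RAM Spark can consume on the host
docker run -it --name my-spark --memory 2g apache/spark:3.5.0 /bin/bash

# Inside the container, Spark is already installed under /opt/spark,
# so you can submit an application directly:
#   /opt/spark/bin/spark-submit --master 'local[*]' my_spark_application.py
```

Capping the container with --memory keeps a local-mode application from exhausting the host's RAM regardless of the Spark settings inside.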