I launched PySpark with the command
IPYTHON=1 MASTER=local[4] pyspark
Spark greets me with
Welcome to Spark, version 1.2.1
SparkContext available as sc.
But using sc, I am not able to find out how much memory it has. How can I find this out, and if possible, how can I set it to another value as well?
The RM UI also displays the total memory per application. Checking the Spark UI is not practical in our case; the YARN RM UI seems to display the total memory consumption of the Spark application, including executors and driver.
Number of available executors = (total cores / num-cores-per-executor) = 150 / 5 = 30. Leaving 1 executor for the ApplicationMaster gives --num-executors = 29. Number of executors per node = 30 / 10 = 3. Memory per executor = 64 GB / 3 = 21 GB.
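To see how those numbers map onto configuration, here is a minimal sketch (the app name is assumed, the figures come from the calculation above) that sets the equivalent properties when building a SparkContext in PySpark:

from pyspark import SparkConf, SparkContext

# Assumed sizing from the calculation above:
# 29 executors, 5 cores each, roughly 21 GB of heap per executor.
conf = (SparkConf()
        .setAppName('sizing-example')
        .set('spark.executor.instances', '29')
        .set('spark.executor.cores', '5')
        .set('spark.executor.memory', '21g'))
sc = SparkContext(conf=conf)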
Execution memory - this memory is for storing the data required while executing Spark tasks.
User memory - this memory is for user purposes; you can store your custom data structures, UDFs, UDAFs, etc. here.
Reserved memory - this memory is for Spark's own purposes, and it is hardcoded to 300 MB as of Spark 1.6.
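As a rough illustration of how those regions split an executor's heap under the Spark 1.6 unified memory model (assuming the default spark.memory.fraction of 0.75 and the 21 GB executor from the example above):

# Rough sketch with assumed values: 21 GB heap, spark.memory.fraction = 0.75.
heap_mb = 21 * 1024                # executor heap in MB
reserved_mb = 300                  # reserved memory, hardcoded
memory_fraction = 0.75             # spark.memory.fraction (Spark 1.6 default)
unified_mb = (heap_mb - reserved_mb) * memory_fraction       # execution + storage
user_mb = (heap_mb - reserved_mb) * (1 - memory_fraction)    # user memory
print(unified_mb, user_mb)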
You can query the configuration of the SparkContext like so:
sc._conf.get('spark.executor.memory')
or, if you're interested in the driver's memory:
sc._conf.get('spark.driver.memory')
The complete configuration can be viewed as a list of tuples with
sc._conf.getAll()
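Since the question also asks how to change the value, a minimal sketch (with an assumed target value) is to stop the running context and create a new one with a modified SparkConf. Note that spark.driver.memory generally has to be set before the JVM starts (e.g. via the pyspark command line or spark-defaults.conf), so only the executor setting is shown here:

from pyspark import SparkConf, SparkContext

sc.stop()  # the existing context must be stopped first
conf = SparkConf().set('spark.executor.memory', '2g')  # assumed target value
sc = SparkContext(master='local[4]', conf=conf)
print(sc._conf.get('spark.executor.memory'))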