spark-submit myApplication.py
spark.dynamicAllocation.enabled=True
and I can see that in my environment details.
Executor memory: The default executor memory is set to 5120M.
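If you'd rather make these values explicit instead of relying on defaults, you can pass them as configuration on the command line at submit time. A minimal sketch, assuming a YARN-backed EMR cluster; the 5g value simply mirrors the 5120M default noted above:
spark-submit \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.executor.memory=5g \
  myApplication.py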
Executors: Next, my executors tab shows what looks like 3 active executors and 1 dead one:
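The same executor information can also be pulled without clicking through the UI, via Spark's monitoring REST API on the driver. A rough sketch, assuming the driver UI is reachable on the default port 4040; <driver-host> and <app-id> are placeholders for your own values:
# List all executors (active and dead) for a running application
curl http://<driver-host>:4040/api/v1/applications/<app-id>/allexecutors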
Another way to see how many resources each node of the cluster is using is Ganglia's web tool.
This is published on the master node and shows a graph of each node's resource usage. The catch is that Ganglia must have been enabled as one of the cluster's applications at creation time on EMR.
Once enabled, however, you can go to its web page and see how heavily each node is being utilized.
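For reference, enabling Ganglia at creation time just means including it in the cluster's application list. A hedged sketch using the AWS CLI; the name, release label, instance type, and count here are placeholders, not values from this cluster:
aws emr create-cluster \
  --name "spark-cluster" \
  --release-label emr-5.36.0 \
  --applications Name=Spark Name=Ganglia \
  --instance-type m5.xlarge \
  --instance-count 3 \
  --use-default-roles
Once the cluster is up, Ganglia is served from the master node's web server (on EMR it lives under the /ganglia/ path).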