In the Apache Spark UI, one of the metrics shown at the stage level is "Peak Execution Memory". What does this metric indicate?
Peak execution memory
refers to the memory used by internal data structures created during shuffles, aggregations, and joins. The value of this accumulator should be approximately the sum of the peak sizes across all such data structures created in the task. For SQL jobs, this only tracks unsafe operators, broadcast joins, and external sorts.
You can get more insight into this metric in Tracking spark's memory usage.
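If you want to observe this metric programmatically rather than through the UI, one option is a `SparkListener` that reads each finished task's `peakExecutionMemory` from its `TaskMetrics`. A minimal sketch (the class name `PeakMemoryListener` is just an illustrative choice):

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Logs the peak execution memory of every completed task.
class PeakMemoryListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // peakExecutionMemory is reported in bytes
    val peak = taskEnd.taskMetrics.peakExecutionMemory
    println(s"Stage ${taskEnd.stageId}, task ${taskEnd.taskInfo.taskId}: " +
      s"peak execution memory = $peak bytes")
  }
}

// Register it on an existing SparkSession:
// spark.sparkContext.addSparkListener(new PeakMemoryListener)
```

The per-task values this prints are what the stage-level "Peak Execution Memory" figure in the UI is aggregated from.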