I have spark.cores.max set to 24 (3 worker nodes), but if I log into a worker node I see only one process (command = java) consuming memory and CPU. I suspect it is not using all 8 cores of the m2.4xlarge instance. How can I find out how many cores Spark is actually using?
You can see the number of cores occupied on each worker in the cluster under the Spark Web UI served by the master (by default at http://&lt;master-host&gt;:8080): the Workers table lists, for every worker, the cores in use versus the total available.
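You can also check the allocation from inside the application itself. Below is a minimal Scala sketch (the app name `CoreCheck` is just an illustration): `spark.cores.max` caps the total cores the app may take across the cluster, `defaultParallelism` reflects the cores actually granted, and `getExecutorMemoryStatus` lists the executors by host so you can see how the cores are spread over the workers.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CoreCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("CoreCheck")
      .set("spark.cores.max", "24") // cap on total cores across the whole cluster

    val sc = new SparkContext(conf)

    // Total cores actually granted to this application
    println(s"spark.cores.max    = ${sc.getConf.get("spark.cores.max")}")
    println(s"defaultParallelism = ${sc.defaultParallelism}")

    // One entry per executor (plus the driver), keyed by host:port
    sc.getExecutorMemoryStatus.keys.foreach(e => println(s"executor: $e"))

    sc.stop()
  }
}
```

If `defaultParallelism` comes back lower than expected, the usual suspects are `spark.executor.cores` limiting cores per executor, or another application already holding cores on the cluster.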