
How to check the number of cores Spark uses?

Tags:

apache-spark

I have spark.cores.max set to 24 [3 worker nodes], but when I log into a worker node I see only one process [command = java] running that consumes memory and CPU. I suspect it is not using all 8 cores (on m2.4xlarge).

How can I check the number of cores Spark is actually using?
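For context, a minimal sketch of how such a cap is typically set for a standalone-mode application (the master URL and app name are illustrative; only spark.cores.max comes from the post):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: spark.cores.max caps the total cores an application
// may claim across the WHOLE cluster, not per node. With 3 workers x 8
// cores, a cap of 24 permits every core to be scheduled.
val conf = new SparkConf()
  .setMaster("spark://master:7077")   // hypothetical master URL
  .setAppName("core-count-check")     // hypothetical app name
  .set("spark.cores.max", "24")       // the setting from the question
val sc = new SparkContext(conf)
```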

asked Oct 20 '22 by user3279189


1 Answer

You can see the number of cores in use on each worker in the cluster under the Spark Web UI (the standalone master's UI, served on port 8080 by default). Note that seeing a single java process per worker is expected: each application gets one executor JVM per worker, and that one JVM runs tasks on multiple cores as parallel threads.
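If you'd rather cross-check from code, here is a minimal sketch, assuming a running spark-shell with its SparkContext bound to sc:

```scala
// Total parallelism the scheduler sees across the cluster; in
// standalone mode this reflects the cores actually granted to the app.
println(s"defaultParallelism = ${sc.defaultParallelism}")

// One entry per executor (plus the driver), keyed by host:port --
// useful for confirming how many executors are running and where.
sc.getExecutorMemoryStatus.keys.foreach(println)
```

If defaultParallelism reports 24, the application was granted all 8 cores on each of the 3 workers even though only one java process shows up per node.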

answered Oct 23 '22 by Dan Osipov