Multiple Spark Workers on Single Windows Machine

I am trying to teach myself Spark through Scala using IntelliJ on Windows. I'm doing this on a single machine, and I would like to start multiple workers on it to simulate a cluster. I read this page, where it says that

"The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."

I don't know what it means to start the master and workers by hand. Could anyone help? Many thanks for any help/suggestions.

asked Feb 07 '16 by Benji Kok

1 Answer

To start the Spark master manually, run the command below from %SPARK_HOME%\bin:

spark-class org.apache.spark.deploy.master.Master

The command above also prints the master URL, in the form spark://ip:port.
The master UI can be accessed at localhost:8080.
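If the defaults conflict with something already running on your machine, the Master class also accepts explicit options; the host, port, and web UI port below are only illustrative values, so adjust them for your setup:

spark-class org.apache.spark.deploy.master.Master --host localhost --port 7077 --webui-port 8080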

To start a Spark worker, run:

spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

If you now refresh the master UI, the new worker appears under the Workers section.
Repeat the command to add multiple workers to the same master, as sketched below.
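As a minimal sketch, assuming the master was started on localhost:7077 as above, two workers can share one machine if each gets its own web UI port and a slice of the cores and memory (the resource sizes here are placeholders, not recommendations); run each command in its own command prompt so both workers stay up:

spark-class org.apache.spark.deploy.worker.Worker --cores 2 --memory 2G --webui-port 8081 spark://localhost:7077
spark-class org.apache.spark.deploy.worker.Worker --cores 2 --memory 2G --webui-port 8082 spark://localhost:7077

The master UI should then list both workers, each with the cores and memory you assigned.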

answered Sep 28 '22 by Naresh Babu