 

Spark Clusters: worker info doesn't show on web UI

Tags:

apache-spark

I have installed Spark standalone on a cluster of machines and tried to launch the workers through the cluster launch scripts. I added the workers' IP addresses to the conf/slaves file, and the master connects to all slaves through password-less SSH. After running the ./bin/start-slaves.sh script, I get the following message:

starting org.apache.spark.deploy.worker.Worker, logging to /root/spark-0.8.0-incubating/bin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-jbosstest2.out
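For reference, conf/slaves simply lists one worker host per line; start-slaves.sh SSHes into each listed host and starts a Worker there, which is why password-less SSH from the master is required. A minimal sketch (the addresses below are placeholders, not the actual setup; jbosstest2 is the hostname from the log line above):

# conf/slaves -- one worker host or IP per line
192.168.0.11
192.168.0.12
jbosstest2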

However, the master's web UI (localhost:8080) does not show any information about the worker. When I add a localhost entry to my conf/slaves file, the worker info for localhost is shown.

There are no error messages; the terminal says the worker has started, but the web UI shows no workers.

Sudo asked Nov 08 '13


2 Answers

I had the same problem. I noticed that I could not telnet to master:port from the slaves. In my /etc/hosts file (on the master) I had a 127.0.0.1 master entry, listed before my 192.168.0.x master entry. When I removed the 127.0.0.1 entry from /etc/hosts, I could telnet, and when I ran start-slaves.sh (from the master) my slaves connected.
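To illustrate the fix (the addresses below are placeholders; 7077 is the default port of the standalone master), the problematic /etc/hosts on the master looked roughly like this:

# /etc/hosts on the master -- before the fix
127.0.0.1      master        # loopback alias: the master advertises an address the slaves cannot reach
192.168.0.10   master        # the LAN address the slaves can actually reach

After deleting the 127.0.0.1 master line, a quick connectivity check from a slave should succeed:

telnet master 7077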

davilj answered Oct 29 '22


When you run the cluster, run the jps command on the worker nodes to check whether the Worker process actually came up, and use the PID that jps reports to find and inspect that Worker's log; see the example below.
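For example (the PIDs and output below are illustrative), on a healthy worker node jps should list a Worker JVM:

$ jps
2345 Worker
6789 Jps

If no Worker line appears, check the log file that start-slaves.sh wrote on that node (the spark-*-org.apache.spark.deploy.worker.Worker-*.out file under logs/) for the reason it exited.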

or

set the following (for example in conf/spark-env.sh), restart the cluster, and check whether the web UIs come up on the configured ports:

export SPARK_MASTER_WEBUI_PORT=5050
export SPARK_WORKER_WEBUI_PORT=4040
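One way to verify that the UIs are actually listening on those ports after the restart (this check is a suggestion, assuming netstat is installed; ss works similarly):

netstat -tlnp | grep -E '5050|4040'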
ramisetty.vijay answered Oct 29 '22