I am trying to set up Apache Spark to run on Mesos, but I get the following message on terminal:
W0309 10:37:10.570291 4017 sched.cpp:700] Ignoring framework registered message because it was sent from '[email protected]:5050' instead of the leading master '[email protected]:5050'
This message keeps appearing on the spark-shell and I am not able to run any command. I started spark-shell using the command:
./bin/spark-shell --master mesos://127.0.0.1:5050 --conf spark.executor.uri=/home/user/spark/spark-1.6.0-bin-hadoop2.6.tgz
When I check the Framework tab on Mesos WebUI, Spark Shell is listed as a framework.
Any idea on why I faced the above message and cannot run commands from spark-shell? Or, any good reference to run Spark on Mesos?
I'll be doing a bit of guesswork here, but I'm assuming you did not specify an --ip parameter when starting mesos-master.sh. In that case, you should change your startup command to:
./bin/spark-shell --master mesos://127.0.1.1:5050 --conf spark.executor.uri=/home/user/spark/spark-1.6.0-bin-hadoop2.6.tgz
I'm guessing you have a 127.0.1.1 entry in your /etc/hosts (or whichever file your system uses for hostname resolution), so the Mesos master resolves to 127.0.1.1 by default, while your spark-shell is pointed at 127.0.0.1. You can use the --ip parameter to make the master bind to 127.0.0.1 instead, if you prefer.
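A sketch of that alternative, keeping spark-shell pointed at 127.0.0.1 and pinning the master to the same address instead (paths are assumptions; adjust them to where your Mesos build lives):

```shell
# Assumed install location; point this at your own Mesos directory.
MESOS_HOME=/home/user/mesos

# Bind the master explicitly so it advertises 127.0.0.1 rather than
# whatever /etc/hosts resolves the hostname to (often 127.0.1.1).
$MESOS_HOME/bin/mesos-master.sh --ip=127.0.0.1 --work_dir=/tmp/mesos

# spark-shell's --master address now matches the leading master:
# ./bin/spark-shell --master mesos://127.0.0.1:5050 ...
```

Either way, the key point is that the address you pass to --master must match the address the Mesos master actually registered under, or the scheduler discards the registration message as coming from the wrong master.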