
SparkDeploySchedulerBackend Error: Application has been killed. All masters are unresponsive

Tags:

apache-spark

While I'm starting Spark shell:

bin>./spark-shell

I get the following error :

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Welcome to Spark version 1.3.0
Using Scala version 2.10.4 (Java HotSpot(TM) Server VM, Java 1.7.0_75)
Type in expressions to have them evaluated.
Type :help for more information.
15/05/10 12:12:21 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
15/05/10 12:12:21 ERROR TaskSchedulerImpl: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.

I installed Spark by following this guide: http://www.philchen.com/2015/02/16/how-to-install-apache-spark-and-cassandra-stack-on-ubuntu

asked Mar 16 '23 by Manish Agrawal


2 Answers

You should supply your Spark cluster's master URL when starting spark-shell.

At least:

bin/spark-shell --master spark://master-ip:7077

The full list of options is long; you can browse it and find the ones you need with:

bin/spark-shell --help
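If no standalone master is actually running, the shell will report all masters as unresponsive no matter which URL you pass. A minimal sketch of bringing up a local standalone cluster first, using Spark's standard standalone scripts (`master-ip` is a placeholder for your master's hostname or IP; the script name and arguments may differ slightly between Spark versions):

```shell
# Start a standalone master; by default it listens on port 7077
# and serves a web UI on port 8080 that shows its spark:// URL.
./sbin/start-master.sh

# Start a worker and register it with that master
# (replace master-ip with the hostname shown in the master's web UI).
./sbin/start-slave.sh spark://master-ip:7077

# Now point the shell at the same URL.
./bin/spark-shell --master spark://master-ip:7077
```

Checking the master's web UI at http://master-ip:8080 before launching the shell is a quick way to confirm the master is up and the worker has registered.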
answered Mar 19 '23 by yjshen


I am assuming that you are running this in standalone/local mode. Run your Spark shell with the following line; `local[*]` tells Spark to use all available cores on your local machine:

bin/spark-shell --master local[*]

http://spark.apache.org/docs/1.2.1/submitting-applications.html#master-urls
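If you do not want to pass `--master` on every invocation, you can set a default master in Spark's configuration file instead. A config-fragment sketch (the path is relative to your Spark installation; the values shown are examples, not a recommendation):

```shell
# conf/spark-defaults.conf — pick ONE value for spark.master:
#   local[*]                 run locally using all available cores
#   local[4]                 run locally using 4 cores
#   spark://master-ip:7077   connect to a standalone cluster master
spark.master  local[*]
```

With this in place, a plain `bin/spark-shell` picks up the default, and `--master` on the command line still overrides it when given.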

answered Mar 19 '23 by user931174