
Error when starting the Spark shell

Tags:

apache-spark

I just downloaded the latest version of Spark, and when I started the Spark shell I got the following errors:

java.net.BindException: Failed to bind to: /192.168.1.254:0: Service 'sparkDriver' failed after 16 retries!
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)

...
...

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:193)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:71)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:9)
...
...
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

Is there something I missed in setting up Spark?

Asked Jul 05 '15 by JRR

People also ask

How do I start the Spark shell?

Go to the Apache Spark installation directory from the command line, type bin/spark-shell, and press Enter. This launches the Spark shell and gives you a Scala prompt for interacting with Spark in the Scala language.
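
For example, assuming Spark was unpacked to ~/spark (a placeholder path):

cd ~/spark
bin/spark-shell   # ends at a scala> prompt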

How do I start Spark in the Python shell?

Go to the Spark installation directory from the command line, type bin/pyspark, and press Enter. This launches the PySpark shell and gives you a prompt for interacting with Spark in Python.
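
For example, with the same placeholder installation path as above:

cd ~/spark
bin/pyspark   # ends at a Python >>> prompt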

How do you initialize Spark?

The first thing a Spark program must do is create a SparkContext object, which tells Spark how to access a cluster. To create a SparkContext you first need to build a SparkConf object that contains information about your application. Only one SparkContext may be active per JVM.
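
A minimal Scala sketch of that setup (the application name and the local master URL are illustrative placeholders, not from the original answer):

import org.apache.spark.{SparkConf, SparkContext}

// Describe the application; "local[*]" runs Spark locally on all available cores.
val conf = new SparkConf().setAppName("MyApp").setMaster("local[*]")
val sc = new SparkContext(conf)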

What is the Spark shell?

Spark shell commands are the command-line interface used to drive Spark processing. They are useful for ETL and analytics workloads, including machine learning on high-volume datasets, with very little turnaround time.
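
For instance, once the shell is up you can run a small transformation directly at the scala> prompt (a toy example):

val nums = sc.parallelize(1 to 100)   // distribute the numbers 1..100
nums.filter(_ % 2 == 0).count()       // res0: Long = 50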


3 Answers

Try setting the Spark environment variable SPARK_LOCAL_IP to a local IP address.

In my case, I was running Spark on an Amazon EC2 Linux instance and spark-shell stopped working with an error message similar to yours. I was able to fix it by adding a setting like the following to the Spark config file conf/spark-env.sh:

export SPARK_LOCAL_IP=172.30.43.105

You could also set it in ~/.profile or ~/.bashrc.

Also check the host settings in /etc/hosts, as sketched below.
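
For reference, an /etc/hosts entry mapping the instance's hostname to its local IP might look like this (the hostname here is just an example following EC2's naming convention):

127.0.0.1       localhost
172.30.43.105   ip-172-30-43-105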

Answered Oct 06 '22 by rake

See SPARK-8162.

It looks like it only affects the 1.4.1 and 1.5.0 development builds, so you're probably best off running the latest stable release (1.4.0 at the time of writing).

Answered Oct 06 '22 by dpeacock

I was experiencing the same issue. First, go to .bashrc and add the following (the IP should be your own machine's local address):

export SPARK_LOCAL_IP=172.30.43.105

Then go to the Hadoop bin directory:

cd $HADOOP_HOME/bin

and run the following command:

hdfs dfsadmin -safemode leave

This turns off the NameNode's safe mode.
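
If you want to confirm the NameNode's state before or after, hdfs dfsadmin also supports a status query:

hdfs dfsadmin -safemode get   # prints "Safe mode is ON" or "Safe mode is OFF"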

Then delete the metastore_db folder; it is usually created in whichever directory you start a Spark session from (often the Spark home directory or its bin folder).
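
For example, from the directory where you start spark-shell:

rm -rf metastore_db   # local Derby metastore; Spark recreates it on the next run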

Then I ran spark-shell like this:

spark-shell --master "spark://localhost:7077"

and voilà, I no longer got the sqlContext.implicits._ error.

Answered Oct 06 '22 by Vijay Krishna