ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed [duplicate]

I have installed the following setup: Hadoop 1.0.3, Java 1.7.0_67, Scala 2.11.7, and Spark 2.1.1.

I am getting the error below; can anyone help me with this?

root@sparkmaster:/home/user# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 01:07:35 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/07/05 01:07:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 01:07:37 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.

17/07/05 01:07:37 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing 


<console>:14: error: not found: value spark
       import spark.implicits._

<console>:14: error: not found: value spark
       import spark.sql


Using Scala version 2.11.8 (Java HotSpot(TM) Client VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 
asked Jul 04 '17 by Pankaj Kumar

4 Answers

There are a few different solutions; a consolidated shell sketch of steps 1-3 follows the list.

  1. Get your hostname:

    $ hostname

    then try to set your hostname to the loopback address:

    $ sudo hostname -s 127.0.0.1

    Then start spark-shell.

  2. Add your hostname to your /etc/hosts file (if not present)

    127.0.0.1      your_hostname
    
  3. Add the environment variable in load-spark-env.sh:

    export SPARK_LOCAL_IP="127.0.0.1"
    
  4. The steps above solved my problem, but you can also try adding

    export SPARK_LOCAL_IP=127.0.0.1 
    

    under the comment for the local IP in the template file spark-env.sh.template (in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/)

    and then

    cp spark-env.sh.template spark-env.sh
    spark-shell
    
  5. If none of the above fixes it, check your firewall and enable it, if it is not already enabled.
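
Putting steps 1-3 together, a minimal shell sketch (assumes a Linux machine where you have sudo; substitute the hostname the first command prints if you prefer to type it explicitly):

    # 1. See what the machine currently reports as its hostname
    hostname

    # 2. Map that hostname to the loopback address (append only if the entry is not already there)
    echo "127.0.0.1   $(hostname)" | sudo tee -a /etc/hosts

    # 3. Tell Spark to bind the driver to the loopback address for this session
    export SPARK_LOCAL_IP="127.0.0.1"

    # 4. Launch the shell again
    spark-shell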

answered by Alper t. Turker

Add SPARK_LOCAL_IP in load-spark-env.sh as

export SPARK_LOCAL_IP="127.0.0.1"

The load-spark-env.sh file is located in the spark/bin directory.
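
For example, one way to persist that setting is to append the export to the script (the $SPARK_HOME path here is an assumption; point it at your actual Spark install directory):

    # append the export to load-spark-env.sh under the Spark bin directory
    echo 'export SPARK_LOCAL_IP="127.0.0.1"' >> "$SPARK_HOME/bin/load-spark-env.sh"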

Or you can add your hostname to the /etc/hosts file as

127.0.0.1   hostname 

You can get your hostname by typing hostname in the terminal.

Hope this solves the issue!

answered by koiralo

  • Had a similar issue in IntelliJ.

    Reason: I was on Cisco AnyConnect VPN.

    Fix: after disconnecting from the VPN, the issue did not appear.

answered by sonu1986

  1. In your terminal, type hostname to see your current hostname.
  2. Run vim /etc/hosts and map the hostname you just saw to your exact IP or to 127.0.0.1; you can then verify the change as shown below.
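
To confirm the mapping took effect, a quick check (getent is an assumption here; it is available on most Linux systems):

    # should print the address you just mapped the hostname to, e.g. 127.0.0.1
    getent hosts "$(hostname)"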
answered by linxx