
How to solve "Can't assign requested address: Service 'sparkDriver' failed after 16 retries" when running Spark code?

I am learning Spark + Scala with IntelliJ and started with the small piece of code below:

import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    //Create a SparkContext to initialize Spark
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("Word Count")
    val sc = new SparkContext(conf)

    val numbersList = sc.parallelize(1.to(10000).toList)

    println(numbersList)
  }

}

When trying to run it, I get the exception below:

Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)

Process finished with exit code 1

Can anyone suggest what to do?

asked Sep 02 '18 by Bravo



2 Answers

It seems like you're using an old version of Spark. In your case, try adding this line:

conf.set("spark.driver.bindAddress", "127.0.0.1")

If you are using Spark 2.0+, the following should do the trick:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder()
  .appName("Word Count")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()
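
With a SparkSession the underlying SparkContext is still available via spark.sparkContext, so the example from the question needs only a small change (a sketch under the setup above):

val sc = spark.sparkContext
val numbersList = sc.parallelize(1.to(10000).toList)

// println(numbersList) would only print the RDD's toString;
// take a few elements to see actual values.
println(numbersList.take(10).mkString(", "))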
answered Sep 17 '22 by Jack

Add SPARK_LOCAL_IP to the load-spark-env.sh file, located in Spark's bin directory:

export SPARK_LOCAL_IP="127.0.0.1"
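
Note that load-spark-env.sh is only sourced by Spark's own launch scripts (spark-submit, spark-shell, etc.). When running directly from IntelliJ as in the question, you can get the same effect by setting SPARK_LOCAL_IP as an environment variable in the run configuration, or by setting spark.driver.bindAddress in code as in the first answer.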

answered Sep 18 '22 by Eldho Shaji