
Connection Refused When Running SparkPi Locally

Tags:

apache-spark

I'm trying to run a simple execution of the SparkPi example.  I started the master and one worker, then executed the job on my local "cluster", but ended up getting a sequence of errors, all ending with

Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: /127.0.0.1:39398

I originally tried running my master and worker without any configuration but ended up with the same error.  I then tried changing everything to 127.0.0.1 to test whether it was just a firewall issue, since the server is locked down from the outside world.

My conf/spark-env.sh contains the following:

export SPARK_MASTER_IP=127.0.0.1

Here are the commands I run, in order:

1) sbin/start-master.sh (to start the master)

2) bin/spark-class org.apache.spark.deploy.worker.Worker spark://127.0.0.1:7077 --ip 127.0.0.1 --port 1111 (in a different session on the same machine to start the slave)

3) bin/run-example org.apache.spark.examples.SparkPi spark://127.0.0.1:7077 (in a different session on the same machine to start the job)

I find it hard to believe that I'm locked down enough that running locally would cause problems.

asked Mar 02 '14 by Benny

1 Answer

It looks like you should not set SPARK_MASTER_IP to the loopback address 127.0.0.1. The worker node will not be able to connect to the master node over a loopback address.

You should set it to a valid local IP address (e.g., 192.168.0.2) in conf/spark-env.sh, and add the worker's IP to the conf/slaves configuration file on both the master and the worker nodes.
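As a minimal sketch of the two files, assuming the master's LAN address is 192.168.0.2 and the worker's is 192.168.0.3 (both addresses are placeholders; substitute your own):

```shell
# conf/spark-env.sh (on both master and worker)
export SPARK_MASTER_IP=192.168.0.2   # the master's LAN address, not 127.0.0.1

# conf/slaves (on both nodes; one worker hostname or IP per line)
192.168.0.3
```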

Then you can use sbin/start-all.sh to start the cluster.

Then run "bin/run-example org.apache.spark.examples.SparkPi" against the master's URL.
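Put together, the sequence might look like this, following the same argument form as in the question (192.168.0.2 is a placeholder for your master's actual LAN address):

```shell
# Start the master and every worker listed in conf/slaves
sbin/start-all.sh

# Submit the example job to the standalone master (non-loopback address)
bin/run-example org.apache.spark.examples.SparkPi spark://192.168.0.2:7077
```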

answered Oct 13 '22 by Qichang Chen