 

Running Spark application in local mode

Tags:

apache-spark

I'm trying to start my Spark application in local mode using spark-submit. I am using Spark 2.0.2, Hadoop 2.6 & Scala 2.11.8 on Windows. The application runs fine from within my IDE (IntelliJ), and I can also start it on a cluster with actual, physical executors.

The command I'm running is

spark-submit --class [MyClassName] --master local[*] target/[MyApp]-jar-with-dependencies.jar [Params]

Spark starts up as usual, but then terminates with

java.io.IOException: Failed to connect to /192.168.88.1:56370

What am I missing here?

asked Oct 29 '22 by DNR


1 Answer

Check which master URL and port you are using. If you are running on a cluster, log in to the master node and pass:

--master spark://XXXX:7077

You can always find the master URL in the Spark UI on port 8080.
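
For reference, a sketch of what the full submit command could look like with an explicit cluster master (the hostname here is a placeholder you would replace with the value shown in the Spark UI):

spark-submit --class [MyClassName] --master spark://XXXX:7077 target/[MyApp]-jar-with-dependencies.jar [Params]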

Also check your SparkSession builder config: if you have already set the master there, it takes priority over the one passed at launch, e.g.:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder
  .appName("myapp")
  .master("local[*]")  // a master set here overrides the --master passed to spark-submit
  .getOrCreate()
answered Nov 15 '22 by elcomendante