 

Connecting to a remote Spark master - Java / Scala

I created a 3-node (1 master, 2 workers) Apache Spark cluster in AWS. I'm able to submit jobs to the cluster from the master, but I cannot get it to work remotely.

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/usr/local/spark/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application").setMaster("spark://ec2-54-245-111-320.compute-1.amazonaws.com:7077")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    sc.stop()
  }
}
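For completeness, here is a minimal build.sbt sketch for compiling this app; the versions are assumptions based on the jar names in the stack trace further down (spark-core_2.10-2.0.2 and scala-library-2.10):

// build.sbt -- versions assumed from the jars shown in the stack trace below
name := "simple-app"

version := "1.0"

scalaVersion := "2.10.6" // any 2.10.x should be binary compatible

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"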

I can see from the master:

Spark Master at spark://ip-171-13-22-125.ec2.internal:7077
URL: spark://ip-171-13-22-125.ec2.internal:7077
REST URL: spark://ip-171-13-22-125.ec2.internal:6066 (cluster mode)

so when I execute SimpleApp.scala from my local machine, it fails to connect to the Spark Master:

2017-02-04 19:59:44,074 INFO  [appclient-register-master-threadpool-0] client.StandaloneAppClient$ClientEndpoint (Logging.scala:54)  [] - Connecting to master spark://ec2-54-245-111-320.compute-1.amazonaws.com:7077...
2017-02-04 19:59:44,166 WARN  [appclient-register-master-threadpool-0] client.StandaloneAppClient$ClientEndpoint (Logging.scala:87)  [] - Failed to connect to spark://ec2-54-245-111-320.compute-1.amazonaws.com:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) ~[spark-core_2.10-2.0.2.jar:2.0.2]
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) ~[spark-core_2.10-2.0.2.jar:2.0.2]
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) ~[scala-library-2.10.0.jar:?]
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) ~[spark-core_2.10-2.0.2.jar:2.0.2]

I know it would work if I set the master to local, because then everything would run locally, but I want my client to connect to this remote master. The Spark configuration looks fine: I can telnet to that public DNS and port, and I configured /etc/hosts with the public DNS and hostname of each EC2 instance. I want to be able to submit jobs to this remote master. What am I missing?
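For reference, submitting from the local machine with spark-submit would look roughly like this (the class name comes from the snippet above; the jar path is an assumption based on a default sbt package layout):

spark-submit \
  --class SimpleApp \
  --master spark://ec2-54-245-111-320.compute-1.amazonaws.com:7077 \
  target/scala-2.10/simple-app_2.10-1.0.jar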

asked Feb 05 '17 by cybertextron


1 Answer

To bind the master to a host name/IP, go to the conf directory of your Spark installation (spark-2.0.2-bin-hadoop2.7/conf) and create the spark-env.sh file using the command below.

cp spark-env.sh.template spark-env.sh

Open spark-env.sh in an editor and add the line below, with the host name/IP of your master:

SPARK_MASTER_HOST=ec2-54-245-111-320.compute-1.amazonaws.com

Stop and restart Spark using sbin/stop-all.sh and sbin/start-all.sh. Now you can connect to the remote master using:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SparkSample")
  .master("spark://ec2-54-245-111-320.compute-1.amazonaws.com:7077")
  .getOrCreate()
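As a quick check, a minimal sketch that runs the same line counts from the question through this session (the README path is an assumption about what exists on your cluster):

val logFile = "/usr/local/spark/README.md" // assumed to be present, as in the question
val logData = spark.sparkContext.textFile(logFile, 2).cache()
val numAs = logData.filter(line => line.contains("a")).count()
val numBs = logData.filter(line => line.contains("b")).count()
println(s"Lines with a: $numAs, Lines with b: $numBs")
spark.stop()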

For more information on setting environment variables, see http://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts

answered Oct 13 '22 by abaghel