I am having problems getting the Spark Cassandra Connector to work in Scala.
I'm using these versions:
I can connect to and query Cassandra (without Spark), and I can talk to Spark (without Cassandra), but the connector gives me:
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /10.0.0.194:9042 (com.datastax.driver.core.TransportException: [/10.0.0.194:9042] Cannot connect))
What am I missing? Cassandra is a default install (port 9042 for CQL according to cassandra.yaml), and I'm trying to connect locally ("local").
My code:
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
val sc = new SparkContext("local", "test", conf)
val rdd = sc.cassandraTable("myks", "users")
val rr = rdd.first
println(s"Result: $rr")
"local" in this context specifies the Spark master (telling Spark to run in local mode), not the Cassandra connection host.
To set the Cassandra connection host you have to set a different property in the SparkConf:
import org.apache.spark._

val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "IP Cassandra Is Listening On")
  .set("spark.cassandra.auth.username", "cassandra") // Optional
  .set("spark.cassandra.auth.password", "cassandra") // Optional

val sc = new SparkContext("spark://Spark Master IP:7077", "test", conf)
https://github.com/datastax/spark-cassandra-connector/blob/master/doc/1_connecting.md
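For the local setup described in the question, a minimal sketch might look like the following. It assumes Cassandra is listening on 127.0.0.1:9042 and uses the keyspace/table names ("myks", "users") from the question; the spark.cassandra.connection.host property comes from the linked documentation.

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

// "local[*]" only sets the Spark master; the connector host is a separate property.
val conf = new SparkConf(true)
  .setAppName("Simple Application")
  .setMaster("local[*]")
  .set("spark.cassandra.connection.host", "127.0.0.1")

val sc = new SparkContext(conf)

// Read the table through the connector and print the first row.
val rdd = sc.cassandraTable("myks", "users")
println(s"Result: ${rdd.first}")

sc.stop()

Note that the connector jar must be on the classpath (for example via an sbt dependency or spark-submit's --packages option) so that the import of com.datastax.spark.connector._ provides the cassandraTable method on the SparkContext.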