I'm trying to save a dataset to a Cassandra database using Spark with Scala, but I am getting an exception while running the code. Link used: http://rustyrazorblade.com/2015/01/introduction-to-spark-cassandra/
error:
could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[FoodToUserIndex]
food_index.saveToCassandra("tutorial", "food_to_user_index")
^
.scala
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

def main(args: Array[String]): Unit = {
  val conf = new SparkConf(true)
    .set("spark.cassandra.connection.host", "localhost")
    .set("spark.executor.memory", "1g")
    .set("spark.cassandra.connection.native.port", "9042")
  val sc = new SparkContext(conf)

  case class FoodToUserIndex(food: String, user: String)

  val user_table = sc.cassandraTable[CassandraRow]("tutorial", "user").select("favorite_food", "name")
  val food_index = user_table.map(r => new FoodToUserIndex(r.getString("favorite_food"), r.getString("name")))
  food_index.saveToCassandra("tutorial", "food_to_user_index")
}
build.sbt
name := "intro_to_spark"
version := "1.0"
scalaVersion := "2.11.2"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0-rc3"
If I change the Scala version to 2.10 and the connector version to 1.1.0, it works, but I need to use Scala 2.11:
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()
Moving case class FoodToUserIndex(food: String, user: String) outside the main function should solve the problem.
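For illustration, a minimal sketch of the restructured file, with the case class defined at the top level so the compiler can derive the implicit RowWriterFactory (the wrapping object name FoodIndexApp is an assumption):

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

// Top-level case class: the compiler can now resolve the implicit
// RowWriterFactory[FoodToUserIndex] that saveToCassandra requires.
case class FoodToUserIndex(food: String, user: String)

object FoodIndexApp {  // hypothetical object name
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "localhost")
      .set("spark.cassandra.connection.native.port", "9042")
    val sc = new SparkContext(conf)

    // Same pipeline as in the question, now mapping into the
    // top-level case class before saving.
    val food_index = sc.cassandraTable[CassandraRow]("tutorial", "user")
      .select("favorite_food", "name")
      .map(r => FoodToUserIndex(r.getString("favorite_food"), r.getString("name")))

    food_index.saveToCassandra("tutorial", "food_to_user_index")
  }
}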
It has to do with the "datastax spark-cassandra-connector" version, not the Scala version.
So far, the 1.2.x versions are missing support for saving from a custom case class.
Try "datastax spark-cassandra-connector" version 1.1.1 with Scala 2.11 and it should work.
Note: Make sure to have Spark compiled against Scala 2.11 too.
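For reference, a build.sbt along the lines this answer suggests (assuming 2.11 artifacts are published for these versions):

scalaVersion := "2.11.2"

// Spark itself must be a Scala 2.11 build, as noted above.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.1"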