I have the following piece of code in Spark:

```scala
rdd
  .map(processFunction(_))
  .saveToCassandra("keyspace", "tableName")
```

where

```scala
def processFunction(src: String): Seq[Any] =
  src match {
    case "a" => List(A("a", 123112, "b"), A("b", 142342, "c"))
    case "b" => List(B("d", 12312, "e", "f"), B("g", 12312, "h", "i"))
  }
```

and:

```scala
case class A(entity: String, time: Long, value: String)
case class B(entity: String, time: Long, value1: String, value2: String)
```

`saveToCassandra` expects a collection of objects, and using `Seq[Any]` as the return type to hold both `Seq[A]` and `Seq[B]` breaks `saveToCassandra` with the exception `scala.ScalaReflectionException: <none> is not a term`. What could be the reason for this behaviour?
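The exception most likely arises because the connector's reflection-based column mapper needs a concrete case-class type from the static type signature, and `Any` provides none. Below is a minimal plain-Scala sketch (no Spark; it repeats the question's case classes) of one common workaround: splitting the mixed `Seq[Any]` back into statically typed collections with `collect`, so each could then be saved to its own table.

```scala
// Hypothetical sketch: recover concrete types from a mixed Seq[Any]
// using `collect`, so each result carries a static case-class type
// that a reflection-based mapper can inspect.
case class A(entity: String, time: Long, value: String)
case class B(entity: String, time: Long, value1: String, value2: String)

object SplitByType {
  def main(args: Array[String]): Unit = {
    val mixed: Seq[Any] = Seq(A("a", 123112, "b"), B("d", 12312, "e", "f"))

    // Each `collect` keeps only values of the matching type; the
    // results are statically typed Seq[A] and Seq[B], not Seq[Any].
    val as: Seq[A] = mixed.collect { case a: A => a }
    val bs: Seq[B] = mixed.collect { case b: B => b }

    println(as.size) // 1
    println(bs.size) // 1
  }
}
```

In Spark the same idea would mean building two typed RDDs (one per case class) and saving each separately, rather than mixing both types under `Any`.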
```scala
def processFunction(src: String): (Any, Any) = {
  src match {
    case "a" => (A("a", 123112, "b"), A("b", 142342, "c"))
    case "b" => (B("d", 12312, "e", "f"), B("g", 12312, "h", "i"))
  }
}
```

Something like that may work. I haven't played around much with saving objects to Cassandra, though, nor with using `Any` with Cassandra. However, the above approach, without the case classes and `Any`s, is how I solved a similar issue recently. For instance, the following would work:

```scala
def processFunction(src: String): (String, Int, String) = {
  src match {
    case "a" => ("a", 123112, "b")
    case "b" => ("d", 12312, "e")
  }
}
```

However, that is not exactly what you want, so take it for what you will.
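One caveat worth noting about the `(Any, Any)` variant (an aside, not from the answer above): the runtime values are still `A` instances, but the compile-time type of each element is `Any`, which is all a reflection-based mapper sees in the type signature. A small runnable demonstration:

```scala
case class A(entity: String, time: Long, value: String)

object StaticVsRuntime {
  // Same shape as the first snippet, restricted to the "a" case.
  def processFunction(src: String): (Any, Any) =
    src match {
      case "a" => (A("a", 123112, "b"), A("b", 142342, "c"))
    }

  def main(args: Array[String]): Unit = {
    val pair = processFunction("a")
    // The runtime class is still A...
    println(pair._1.getClass.getSimpleName) // A
    // ...but the static type of pair._1 is Any, so type-driven
    // machinery (like a column mapper) cannot see A here.
  }
}
```

This suggests the concrete-tuple version, `(String, Int, String)`, is the safer of the two.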
I faced this issue, and applying a case class resolved it. Below is an example:

```scala
case class test_row(col1: String,
                    col2: String,
                    col3: String)
```

Apply this case class to a df/rdd:

```scala
df.map { x =>
  test_row(
    x.get(0).asInstanceOf[String],
    x.get(1).asInstanceOf[String],
    x.get(2).asInstanceOf[String])
}.rdd.saveToCassandra("keyspace", "tableName")
```

This resolved the `<none> is not a term` issue.
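The mapping step above can be sketched without Spark; here the DataFrame rows are simulated as `Seq[Any]` values (the names are illustrative, taken from the answer):

```scala
case class test_row(col1: String, col2: String, col3: String)

object RowMapping {
  def main(args: Array[String]): Unit = {
    // Simulated rows: each is an untyped sequence of column values,
    // much like fields pulled out of a DataFrame Row.
    val rows: Seq[Seq[Any]] = Seq(Seq("x", "y", "z"))

    // Casting each field and wrapping it in the case class gives the
    // collection a concrete static type (Seq[test_row]) that a
    // reflection-based mapper can work with.
    val typed: Seq[test_row] = rows.map { x =>
      test_row(
        x(0).asInstanceOf[String],
        x(1).asInstanceOf[String],
        x(2).asInstanceOf[String])
    }

    println(typed.head) // test_row(x,y,z)
  }
}
```

The key point is that the result is `Seq[test_row]`, not `Seq[Any]`, which is why wrapping rows in a case class avoids the reflection error.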