Take a look at this question: Scala + Spark - Task not serializable: java.io.NotSerializableException when calling function outside closure only on classes, not objects.
Problem:
Suppose my mappers can be functions (def) that internally call other classes, create objects, and do different things. (Or they can even be classes that extend (Foo) => Bar and do the processing in their apply method - but let's ignore this case for now.)
Spark supports only Java serialization for closures. Is there ANY way out of this? Can we use something instead of closures to do what I want to do? We can easily do this sort of thing with Hadoop. This single issue is making Spark almost unusable for me. One cannot expect all third-party libraries to have all their classes extend Serializable!
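To make the failure mode concrete, here is a minimal sketch. LegacyTokenizer is a hypothetical stand-in for any non-serializable third-party class; the class and names are assumptions for illustration, not from the question:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical third-party class that does NOT implement Serializable.
class LegacyTokenizer {
  def tokenize(line: String): Array[String] = line.split("\\s+")
}

object Demo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[*]"))
    val tokenizer = new LegacyTokenizer // lives on the driver

    // The closure captures `tokenizer`, so Spark tries to Java-serialize it
    // when shipping the task, and fails with:
    //   org.apache.spark.SparkException: Task not serializable
    //   Caused by: java.io.NotSerializableException: LegacyTokenizer
    sc.parallelize(Seq("a b", "c d")).flatMap(line => tokenizer.tokenize(line)).collect()
  }
}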
Probable Solutions:
Does something like this seem to be of any use: https://github.com/amplab/shark/blob/master/src/main/scala/shark/execution/serialization/KryoSerializationWrapper.scala
It certainly seems like a wrapper is the answer, but I cannot see exactly how.
I figured out how to do this myself!
You simply need to serialize the objects before passing them through the closure, and de-serialize them afterwards. This approach just works, even if your classes aren't Serializable, because it uses Kryo behind the scenes. All you need is some currying: partially applying the curried function leaves a closure that captures only the Kryo wrapper, and the wrapper itself is Java-serializable. ;)
Here's an example of how I did it:
def genMapper(kryoWrapper: KryoSerializationWrapper[(Foo => Bar)])
             (foo: Foo): Bar = {
  kryoWrapper.value.apply(foo)
}

val mapper = genMapper(KryoSerializationWrapper(new Blah(abc))) _
rdd.flatMap(mapper).collectAsMap()

// Must be a class, not an object: objects cannot take constructor
// parameters, and `new Blah(abc)` above requires one.
class Blah(abc: ABC) extends (Foo => Bar) {
  def apply(foo: Foo): Bar = {
    ??? // the real work goes here
  }
}
Feel free to make Blah as complicated as you want: a class, a companion object, nested classes, references to multiple third-party libraries.
KryoSerializationWrapper refers to: https://github.com/amplab/shark/blob/master/src/main/scala/shark/execution/serialization/KryoSerializationWrapper.scala
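For reference, the idea behind that wrapper looks roughly like this. This is a rough sketch of the technique, not the Shark code verbatim, and it assumes Kryo is on the classpath:

import java.io.{ObjectInputStream, ObjectOutputStream}
import com.esotericsoftware.kryo.Kryo
import com.esotericsoftware.kryo.io.{Input, Output}

// The wrapped value is transient, so Java serialization skips it;
// instead we move it across the wire as a Kryo-encoded byte array.
class KryoSerializationWrapper[T](@transient private var value_ : T) extends Serializable {

  def value: T = value_

  private def writeObject(out: ObjectOutputStream): Unit = {
    out.defaultWriteObject()
    val buffer = new Output(4096, -1) // growable output buffer, no size limit
    new Kryo().writeClassAndObject(buffer, value_)
    val bytes = buffer.toBytes
    out.writeInt(bytes.length)
    out.write(bytes)
  }

  private def readObject(in: ObjectInputStream): Unit = {
    in.defaultReadObject()
    val bytes = new Array[Byte](in.readInt())
    in.readFully(bytes)
    // Depending on your Kryo version, classes without a no-arg constructor
    // may also need an instantiator strategy (e.g. objenesis' StdInstantiatorStrategy).
    value_ = new Kryo().readClassAndObject(new Input(bytes)).asInstanceOf[T]
  }
}

object KryoSerializationWrapper {
  def apply[T](value: T): KryoSerializationWrapper[T] = new KryoSerializationWrapper(value)
}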
If you are using the Java API, you should avoid anonymous classes when passing the function into the mapping closure. Instead of calling map(new Function...) inline, define a named class that extends your function type and pass an instance of it to map(..). See: https://yanago.wordpress.com/2015/03/21/apache-spark/