 

Not able to declare String type accumulator

I am trying to define an accumulator variable of type String in the Scala shell (driver), but I keep getting the following error:

scala> val myacc = sc.accumulator("Test")
<console>:21: error: could not find implicit value for parameter param: org.apache.spark.AccumulatorParam[String]
       val myacc = sc.accumulator("Test")
                                 ^

This is not an issue for Int or Double accumulators.

Thanks

Asked Jul 18 '15 by Dhiraj

1 Answer

That's because Spark by default provides implicit accumulator parameters only for the types Int, Long, Double and Float. If you need something else you have to extend AccumulatorParam.

import org.apache.spark.AccumulatorParam

object StringAccumulatorParam extends AccumulatorParam[String] {

    // The neutral element each task starts from; the initial value
    // passed to sc.accumulator is ignored here.
    def zero(initialValue: String): String = {
        ""
    }

    // How two partial results are merged: space-separated concatenation.
    def addInPlace(s1: String, s2: String): String = {
        s"$s1 $s2"
    }
}

val stringAccum = sc.accumulator("")(StringAccumulatorParam)

val rdd = sc.parallelize("foo" :: "bar" :: Nil, 2)
rdd.foreach(s => stringAccum += s)
stringAccum.value
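As a side note, zero and addInPlace form a monoid (an identity element plus an associative merge), so the merge logic can be sanity-checked locally without a cluster. The sketch below is a Spark-free simulation of what happens across partitions; StringParam is an illustrative name, not a Spark API:

```scala
// Spark-free sketch: StringParam mirrors the answer's StringAccumulatorParam.
object StringParam {
  def zero(initialValue: String): String = ""
  def addInPlace(s1: String, s2: String): String = s"$s1 $s2"
}

// Simulate two partitions, each folded from zero, then merged on the driver.
val partitions = Seq(Seq("foo"), Seq("bar"))
val partials = partitions.map(
  _.foldLeft(StringParam.zero(""))((a, b) => StringParam.addInPlace(a, b)))
val merged = partials.foldLeft(StringParam.zero(""))(
  (a, b) => StringParam.addInPlace(a, b))
// Both input words survive the merge (with extra spaces from addInPlace).
```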

Note:

In general you should avoid using accumulators for tasks where the accumulated data may grow significantly over time. Their behavior becomes similar to a group or collect, and in the worst-case scenario the job can fail due to lack of resources. Accumulators are mostly useful for simple diagnostic tasks, like keeping track of basic statistics.
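To make the note concrete with another Spark-free sketch: a concatenating accumulator's value grows with every record processed, while a numeric counter stays a single fixed-size value, which is why counters suit diagnostics. The folds below simulate the two accumulation styles:

```scala
// Simulated input records (stand-ins for RDD elements).
val records = (1 to 1000).map(i => s"record-$i")

// String-style accumulation: the value grows with every record.
val concatenated = records.foldLeft("")((acc, r) => s"$acc $r")

// Counter-style accumulation: a single Long, however many records arrive.
val count = records.foldLeft(0L)((acc, _) => acc + 1)
```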

Answered Nov 03 '22 by zero323