The example code below is from the book Advanced Analytics with Spark. When I load it into spark-shell (version 1.4.1) with the :load command, it gives the following errors, indicating that it can't find StatCounter:
import org.apache.spark.util.StatCounter
<console>:9: error: not found: type StatCounter
       val stats: StatCounter = new StatCounter()
                  ^
<console>:9: error: not found: type StatCounter
       val stats: StatCounter = new StatCounter()
                                    ^
<console>:23: error: not found: type NAStatCounter
       def apply(x: Double) = new NAStatCounter().add(x)
If I just enter the same statements directly in spark-shell, there is no problem:
scala> import org.apache.spark.util.StatCounter
import org.apache.spark.util.StatCounter
scala> val statsCounter: StatCounter = new StatCounter()
statsCounter: org.apache.spark.util.StatCounter = (count: 0, mean: 0.000000, stdev: NaN, max: -Infinity, min: Infinity)
The problem seems to be with the :load command in spark-shell.
Here's the code:
import org.apache.spark.util.StatCounter

class NAStatCounter extends Serializable {
  val stats: StatCounter = new StatCounter()
  var missing: Long = 0

  def add(x: Double): NAStatCounter = {
    if (java.lang.Double.isNaN(x)) {
      missing += 1
    } else {
      stats.merge(x)
    }
    this
  }

  def merge(other: NAStatCounter): NAStatCounter = {
    stats.merge(other.stats)
    missing += other.missing
    this
  }

  override def toString = {
    "stats: " + stats.toString + " NaN: " + missing
  }
}

object NAStatCounter extends Serializable {
  def apply(x: Double) = new NAStatCounter().add(x)
}
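For reference, here is a minimal usage sketch of my own (not from the book) showing what the class should do once it loads:

// Fold a few sample values, one of them NaN, into a single NAStatCounter.
val values = Seq(1.0, Double.NaN, 3.0)
val counter = values.foldLeft(new NAStatCounter())(_.add(_))
println(counter)  // expected output along the lines of: stats: (count: 2, mean: 2.000000, ...) NaN: 1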
I had exactly the same problem and solved it like this: change

val stats: StatCounter = new StatCounter()

into

val stats: org.apache.spark.util.StatCounter = new org.apache.spark.util.StatCounter()

The reason is perhaps that the code compiled by :load doesn't pick up the import, so the short name StatCounter can't be resolved; fully qualifying the name works around that.
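With that change applied, the start of the file fed to :load would look roughly like this (the rest of the class is unchanged from the question):

import org.apache.spark.util.StatCounter

class NAStatCounter extends Serializable {
  // Fully qualified so the code compiled by :load can resolve the type
  // even if the import above is not in scope:
  val stats: org.apache.spark.util.StatCounter = new org.apache.spark.util.StatCounter()
  var missing: Long = 0
  // ... add, merge, and toString as in the question ...
}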