As I understand it, the method filterKeys of MapLike creates a wrapper over the original map. So if I execute the code below, m will be a chain of 10,000 wrappers around the original map.

var m = Map(1 -> "one", 2 -> "two")
for (_ <- 0 until 10000) { m = m.filterKeys(_ % 2 == 0) }
I expected an invocation of m.contains to cause a stack overflow, but it does not happen. Could you explain what is going on in this case?
I could not reproduce the stack overflow, but here's what happens:
override def filterKeys(p: A => Boolean): Map[A, B] = new DefaultMap[A, B] {
override def foreach[C](f: ((A, B)) => C): Unit = for (kv <- self) if (p(kv._1)) f(kv)
def iterator = self.iterator.filter(kv => p(kv._1))
override def contains(key: A) = self.contains(key) && p(key)
def get(key: A) = if (!p(key)) None else self.get(key)
}
Note that there is no copying of values: the new class simply overrides four methods so that they reflect the filter you added. Since you apply filterKeys repeatedly (10,000 times), you end up with 10,000 of these wrappers, each pointing to the previous one, and the first pointing to your original map.

Calling any of the four methods above (directly or indirectly) on the final wrapper will therefore go through 10,000 nested calls, which can certainly produce a stack overflow if your stack is small enough.
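The delegation chain can be sketched with a minimal standalone model (hypothetical classes, not the real library code): each layer forwards contains to the layer below, so the call depth grows with the number of wrappers. I use 2,000 layers here to stay well within the default stack; with enough layers this is exactly the StackOverflowError described above.

```scala
// Hypothetical sketch of how lazy filter wrappers chain.
sealed trait IntMapView {
  def contains(key: Int): Boolean
}
// The bottom of the chain: the real, materialized map.
final case class Base(m: Map[Int, String]) extends IntMapView {
  def contains(key: Int): Boolean = m.contains(key)
}
// One wrapper per filterKeys call: delegates, then applies its predicate.
final case class Filtered(inner: IntMapView, p: Int => Boolean) extends IntMapView {
  // One extra stack frame per wrapper layer.
  def contains(key: Int): Boolean = inner.contains(key) && p(key)
}

var v: IntMapView = Base(Map(1 -> "one", 2 -> "two"))
for (_ <- 0 until 2000) v = Filtered(v, _ % 2 == 0)
// This single call descends through 2000 Filtered frames
// before reaching the underlying Map.
println(v.contains(2)) // true
```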
I was able to make it overflow the stack like this (Scala 2.9.1):
scala> var m = Map(1 -> "one", 2 -> "two")
scala> for (_ <- 0 until 1000000) { m = m.filterKeys(_ % 2 == 0) }
scala> m.contains(1)
(huge stack trace elided)
You can, of course, avoid the overflow by forcing filterKeys to actually do its work at each step:
scala> var m = Map(1 -> "one", 2 -> "two")
scala> for (_ <- 0 until 1000000) { m = Map() ++ m.filterKeys(_ % 2 == 0) }
scala> m.contains(1)
res1: Boolean = false