I typed the following into the Scala REPL:
scala> List(1, 2, 3).toSet.subsets(2).map(_.toList)
res0: Iterator[List[Int]] = non-empty iterator
scala> List(1, 2, 3).toSet.subsets.map(_.toList)
<console>:8: error: missing parameter type for expanded function ((x$1) => x$1.toList)
List(1, 2, 3).toSet.subsets.map(_.toList)
Why am I getting an error for the second line? Is this a bug in the compiler or am I missing something?
Paradoxically, the first version works because subsets in the application subsets(2) is, as it were, more ambiguous than without parens. Because the method is overloaded, in the application the compiler pauses to solve for the B result of toSet, and decides that B is Int. So it knows what parameter type is expected for the map.
In the version without parens, the method with a parameter list is not a candidate, because eta-expansion is not triggered. So when it types the map application, it has not drawn any conclusions about B, which is the input type to the mapping function.
The easy fix is to tell it to deduce B:
trait Test {
def f1 = List(1, 2, 3).to[Set].subsets.map(_.toList) // instead of .toSet
def f2 = List(1, 2, 3).toSet.subsets(2).map(_.toList)
}
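Note that the no-parameter-list to[Set] syntax above is from the Scala 2.11-era collections; on 2.13 and later the same fix takes a value argument, to(Set), which returns Set[Int] outright and leaves no type parameter to solve. A sketch, assuming Scala 2.13+:

```scala
// to(Set) builds a Set[Int] directly, so subsets() sees a fully
// determined element type and the placeholder function type-checks:
val subsetLists = List(1, 2, 3).to(Set).subsets().map(_.toList).toList
println(subsetLists.length) // 8
```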
The output of -Ytyper-debug on the original code shows how overload resolution gooses type inference:
| | | | | | \-> => Iterator[scala.collection.immutable.Set[B]] <and> (len: Int)Iterator[scala.collection.immutable.Set[B]]
| | | | | solving for (B: ?B)
| | | | | |-- 2 : pt=Int BYVALmode-EXPRmode-POLYmode (silent: method f2 in Test)
| | | | | | \-> Int(2)
| | | | | solving for (B: ?B)
| | | | | \-> Iterator[scala.collection.immutable.Set[Int]]
Another workaround is to go through an extension method:
scala> implicit class ss[A](val s: Set[A]) { def ss(n: Int) = s subsets n ; def ss = s.subsets }
defined class ss
scala> List(1, 2, 3).toSet.ss.map(_.toList)
res1: Iterator[List[Int]] = non-empty iterator
Let's see if they'll take the library change:
https://github.com/scala/scala/pull/4270