For my specific case I want to use functional composition in a reduction; for example:
BiFunction<ImmutableSet<Integer>, ImmutableSet<Integer>, Sets.SetView<Integer>> f = Sets::intersection;
Function<Sets.SetView<Integer>, ImmutableSet<Integer>> g = Sets.SetView::immutableCopy;
BiFunction<ImmutableSet<Integer>, ImmutableSet<Integer>, ImmutableSet<Integer>> biFunction = f.andThen(g);
ImmutableSet<Integer> intersection = Stream.of(ImmutableSet.of(1, 2, 3), ImmutableSet.of(1, 2), ImmutableSet.of(4))
        .reduce(biFunction)
        .orElse(ImmutableSet.of());
This has a compilation error:
argument mismatch BiFunction cannot be converted to BinaryOperator
Instead, I need to do:
ImmutableSet<Integer> intersection = Stream.of(ImmutableSet.of(1, 2, 3), ImmutableSet.of(1, 2), ImmutableSet.of(4))
        .reduce((a, b) -> Sets.intersection(a, b).immutableCopy())
        .orElse(ImmutableSet.of());
However, this loses the point-free style that composition provides.
Why is the Stream API designed like this? A BinaryOperator is a BiFunction, so wouldn't it make more sense to declare the reduce method's parameter with the supertype?
The reduce operation must take two arguments of the same type and return a result of that same type; otherwise the accumulated value could not be fed back in as an argument, and there would be a type mismatch. That is exactly what BinaryOperator expresses: BinaryOperator<T> extends BiFunction<T,T,T>.
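For reference, the relevant reduce overload on java.util.stream.Stream is declared in terms of that stricter type:
Optional<T> reduce(BinaryOperator<T> accumulator);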
Instead of using a lambda, you can still create your composed BiFunction and then adapt it to a BinaryOperator via a method reference:
import java.util.function.BinaryOperator;
import java.util.function.BiFunction;
import java.util.function.Function;
import java.util.stream.Stream;
import com.google.common.collect.ImmutableSet;
import com.google.common.collect.Sets;
public class StackOverflowTest {
    public static void main(String[] args) {
        BiFunction<ImmutableSet<Integer>, ImmutableSet<Integer>, Sets.SetView<Integer>> f = Sets::intersection;
        Function<Sets.SetView<Integer>, ImmutableSet<Integer>> g = Sets.SetView::immutableCopy;
        BiFunction<ImmutableSet<Integer>, ImmutableSet<Integer>, ImmutableSet<Integer>> biFunction = f.andThen(g);
        BinaryOperator<ImmutableSet<Integer>> biOperator = biFunction::apply;

        ImmutableSet<Integer> intersection =
                Stream.of(ImmutableSet.of(1, 2, 3),
                          ImmutableSet.of(1, 2),
                          ImmutableSet.of(1, 4)) // added a 1
                      .reduce(biOperator)
                      .orElse(ImmutableSet.of());

        System.out.println(intersection);
        // prints: [1]
    }
}
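If you want to avoid even the named biOperator variable, the adapted composition can be passed to reduce inline; a minimal sketch, reusing the f and g declared above:
ImmutableSet<Integer> intersection =
        Stream.of(ImmutableSet.of(1, 2, 3), ImmutableSet.of(1, 2), ImmutableSet.of(1, 4))
              .reduce(f.andThen(g)::apply) // the method reference adapts the BiFunction to a BinaryOperator
              .orElse(ImmutableSet.of());
This works because reduce only needs a target type of BinaryOperator<ImmutableSet<Integer>>, and a method reference to the composed function's apply method is compatible with that target.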
I don't think there is likely to be a satisfying answer to your actual question; the only benefit I can think of for taking the stricter type BinaryOperator<T> as an argument is readability, but that may or may not have been on the minds of the API designers. We will probably never know the rationale for the decision unless one of the people who made it writes an answer.
That said, in your particular case the Sets.SetView::immutableCopy function seems unnecessary inside the reduction, because Sets::intersection doesn't require its arguments to be of type ImmutableSet<E>, as long as we specify that the stream values have the weaker type Set<E>. So the following should be logically equivalent:
ImmutableSet<Integer> intersection =
        Stream.<Set<Integer>>of(/* the sets */)
              .reduce(Sets::intersection)
              .map(ImmutableSet::copyOf)
              .orElse(ImmutableSet.of());
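For example, with the sets from the question (where ImmutableSet.of(4) shares no elements with the others, so the result is empty):
ImmutableSet<Integer> intersection =
        Stream.<Set<Integer>>of(ImmutableSet.of(1, 2, 3), ImmutableSet.of(1, 2), ImmutableSet.of(4))
              .reduce(Sets::intersection)  // each step yields a lazy SetView
              .map(ImmutableSet::copyOf)   // copy once, at the end
              .orElse(ImmutableSet.of());
// intersection is []
Note the explicit Stream.<Set<Integer>>of type witness, which is what lets Sets::intersection be used directly as the accumulator; this version also needs import java.util.Set.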
There may be performance differences, because Sets::intersection returns a view without doing any copying. If the intersections are likely to be large relative to the original sets, and the number of sets is not large, then this version should be more efficient, since it does less memory allocation and copying. Otherwise, if the intersections are likely to be small, or the number of sets is large, then the copying could be beneficial, since it is faster to iterate over a small copied set than over a view of the intersection of two large sets.
In that second case, though, I would recommend writing this in the imperative style with a for loop, so you can stop early if the accumulator is already empty.
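A minimal sketch of that imperative version (the helper name intersectAll is illustrative, not part of the original answer; it assumes import java.util.List and java.util.Set in addition to the Guava imports above):
static ImmutableSet<Integer> intersectAll(List<? extends Set<Integer>> sets) {
    if (sets.isEmpty()) {
        return ImmutableSet.of();
    }
    ImmutableSet<Integer> acc = ImmutableSet.copyOf(sets.get(0));
    for (int i = 1; i < sets.size(); i++) {
        if (acc.isEmpty()) {
            break; // stop early: once the accumulator is empty, it stays empty
        }
        acc = Sets.intersection(acc, sets.get(i)).immutableCopy();
    }
    return acc;
}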