
Why doesn't Stream#reduce implicitly accept an accumulative function handling super type elements?

Considering these classes and accumulation function, which represent a simplification of my original context (yet reproducing the same problem):

abstract static class Foo {
    abstract int getK();
}
static class Bar extends Foo {
    int k;
    Bar(int k) { this.k = k; }
    int getK() { return this.k; }
}

private static Foo combined(Foo a1, Foo a2) {
    return new Bar(a1.getK() + a2.getK());
}

I have attempted to perform an accumulation of items (originally data indexing reports) by relying on a separate function, combined, which deals directly with elements of type Foo.

Foo outcome = Stream.of(1,2,3,4,5)
        .map(Bar::new)
        .reduce((a,b) -> combined(a, b))
        .get();

It turns out that this code results in a compilation error (OpenJDK "1.8.0_92"): "Bad return type in lambda expression: Foo cannot be converted to Bar". The compiler insists on attempting to reduce the stream using Bar as the accumulative element, even when there is Foo as a common type for both the arguments to the cumulative function and its return type.

I also find it peculiar that I can still take this approach as long as I explicitly map the stream into a stream of Foos:

Foo outcome = Stream.of(1,2,3,4,5)
        .<Foo>map(Bar::new)
        .reduce((a,b) -> combined(a, b))
        .get();

Is this a limitation of Java 8's generic type inference, a small issue with this particular overload of Stream#reduce, or an intentional behaviour that is backed by the Java specification? I have read a few other questions on SO where type inference has "failed", but this particular case is still a bit hard for me to grasp.

asked May 14 '16 by E_net4 stands with Ukraine


2 Answers

The problem is that you're definitely creating a BinaryOperator<Foo> - you have to be, as you're returning a Foo. If you change combined() to be declared to return Bar (while still accepting Foo) then you'd be fine. It's the fact that the return type is tied to the input type that's the problem - it can't be either covariant or contravariant, because it's used for input and output.

To put it another way - you're expecting reduce((a, b) -> combined(a, b)) to return an Optional<Foo>, right? So that suggests that you're expecting the T of the reduce() call to be Foo - which means that it should be operating on a Stream<Foo>. A Stream<Bar> only has a single-parameter reduce method that takes a BinaryOperator<Bar>, and your lambda expression using combined simply isn't a BinaryOperator<Bar>.
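A minimal sketch of that fix (class and method names are illustrative, adapted from the question's code): with combined declared to return Bar while still accepting Foo arguments, the lambda becomes a valid BinaryOperator<Bar> and the original reduce compiles as-is.

```java
import java.util.stream.Stream;

public class BarCombinedDemo {
    static abstract class Foo {
        abstract int getK();
    }

    static class Bar extends Foo {
        final int k;
        Bar(int k) { this.k = k; }
        @Override int getK() { return this.k; }
    }

    // Accepts Foo arguments but is declared to return Bar, so the lambda
    // below is a valid BinaryOperator<Bar>.
    static Bar combined(Foo a1, Foo a2) {
        return new Bar(a1.getK() + a2.getK());
    }

    public static void main(String[] args) {
        Foo outcome = Stream.of(1, 2, 3, 4, 5)
                .map(Bar::new)
                .reduce((a, b) -> combined(a, b)) // BinaryOperator<Bar> now
                .get();
        System.out.println(outcome.getK()); // 15
    }
}
```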

Another alternative is to add a cast to the lambda expression:

Foo outcome = Stream.of(1,2,3,4,5)
        .map(Bar::new)
        .reduce((a,b) -> (Bar)combined(a, b))
        .get();
answered Sep 22 '22 by Jon Skeet


I think this is related to the reason why a list of Derived is not a list of Base. By doing .map(Bar::new) you create a stream of Bar. It is obviously not trivially convertible to stream of Foo according to the general rule “If B is an A, then X<B> is not X<A>”.

Then you're trying to reduce it, but reduce must produce a value of exactly the same type as the stream's elements. What you want is a reduce that behaves like both reduce and map, in the sense that you want it to both reduce the stream to a single instance and change its type. This is more like a job for collect (except that collect works with mutable types).
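For illustration, here is a hedged sketch of the collect route (the one-element int array is just an ad-hoc mutable container, not anything from the question): collect takes a supplier for a fresh container, an accumulator that folds one element in, and a combiner that merges containers from parallel substreams.

```java
import java.util.stream.Stream;

public class CollectDemo {
    static class Bar {
        final int k;
        Bar(int k) { this.k = k; }
        int getK() { return this.k; }
    }

    static int sumOfKs() {
        int[] total = Stream.of(1, 2, 3, 4, 5)
                .map(Bar::new)
                .collect(() -> new int[1],               // supplier: fresh mutable container
                         (acc, b) -> acc[0] += b.getK(), // accumulator: fold one Bar in
                         (a1, a2) -> a1[0] += a2[0]);    // combiner: merge partial sums
        return total[0];
    }

    public static void main(String[] args) {
        System.out.println(sumOfKs()); // 15
    }
}
```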

But there is a variant of reduce that can change the type. And its signature actually gives us a hint of why the simple overload can't change the type of the stream:

<U> U reduce(U identity,
             BiFunction<U,? super T,U> accumulator,
             BinaryOperator<U> combiner)

It takes two functions! Why? Because Java streams are parallelizable. A stream can perform reduction by parts and then combine the parts into a single value. Here that is possible because we supply an additional combiner. In your case one function is supposed to act as both accumulator and combiner, which creates all the confusion about what exactly its signature should be.
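Using the question's Foo/Bar/combined, the three-argument overload can reduce a Stream<Bar> to a Foo directly; a sketch (the explicit <Foo> type witness is added here to keep inference unambiguous):

```java
import java.util.stream.Stream;

public class ThreeArgReduceDemo {
    static abstract class Foo {
        abstract int getK();
    }

    static class Bar extends Foo {
        final int k;
        Bar(int k) { this.k = k; }
        @Override int getK() { return this.k; }
    }

    static Foo combined(Foo a1, Foo a2) {
        return new Bar(a1.getK() + a2.getK());
    }

    public static void main(String[] args) {
        // The <Foo> witness fixes U: the identity is a Foo, the accumulator
        // folds a Bar into a Foo, and the combiner merges the partial Foo
        // results produced by parallel substreams.
        Foo outcome = Stream.of(1, 2, 3, 4, 5)
                .map(Bar::new)
                .<Foo>reduce(new Bar(0),                    // identity
                             (acc, b) -> combined(acc, b),  // BiFunction<Foo, ? super Bar, Foo>
                             (f1, f2) -> combined(f1, f2)); // BinaryOperator<Foo>
        System.out.println(outcome.getK()); // 15
    }
}
```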

So the reason it doesn't work is that this overload lacks a combiner that could combine partial results. Of course it isn't a "hard" reason, in the sense that since Foo is a superclass of Bar, it is technically possible to use the same function as both accumulator and combiner, as you do in your example with the explicit <Foo>.

So it looks like a design decision intended to avoid possible confusion over a stream randomly changing types after reduction. If you really want that, there is the other overload, but its signature is ugly enough to make the type change obvious just from looking at the code.

answered Sep 26 '22 by Sergei Tachenov