
Java 'reduceLeft' signature / Lower-bounded Type Arguments

The following signature is valid and commonly used in Scala:

trait Collection[A] {
    def reduceLeft [B >: A] (f: (B, A) => B): B
}

However, since >: is the Scala equivalent of super in Java, my first idea for converting this signature (replacing the function type with BiFunction and making use of use-site variance annotations, a.k.a. bounded wildcards) would be

interface Collection<A> {
    <B super A> B reduceLeft(BiFunction<? super B, ? super A, ? extends B> mapper);
}

But oh no! The compiler complains about the super token in <B super A> because you can't have lower-bounded type variables! Now how would I write this method in Java code without having to time-travel back to when generics didn't exist in the Java world?


Yes, I know you might suggest B extends A, but that is not the same thing, as my implementation shows:

public <R extends E> R reduceLeft(BiFunction<? super R, ? super E, ? extends R> mapper)
{
    if (this.isEmpty())
    {
        return null;
    }

    Iterator<E> iterator = this.iterator();
    R first = iterator.next(); // doesn't work, but would if R was a super-type of E (R super E)
    while (iterator.hasNext())
    {
        first = mapper.apply(first, iterator.next());
    }

    return first;
}

Instead, I had to use this slightly more restricted version:

public E reduceLeft(BiFunction<? super E, ? super E, ? extends E> mapper)
{
    if (this.isEmpty())
    {
        return null;
    }

    Iterator<E> iterator = this.iterator();
    E first = iterator.next();
    while (iterator.hasNext())
    {
        first = mapper.apply(first, iterator.next());
    }

    return first;
}
asked Jul 14 '15 by Clashsoft


3 Answers

The B >: A constraint in the Scala method definition is necessary because:

  1. Scala uses declaration-site variance and the immutable collections are covariant in the type of the elements they contain.
  2. reduceLeft conceptually needs to consume and return values of type A, but writing the reducing function as (A, A) => A would place A in a contravariant position (as the result type of the function argument f), which conflicts with the covariance declared in point 1.

The trick to work around this variance conflict is to introduce the extra type parameter B, with A as its lower bound.

Now, as you've mentioned, Java employs use-site variance, so any collection written in Java will be invariant in its element type. This also means there is no variance restriction on where A may appear, so it can simply be used as the return type. The definition below should therefore be enough; no need for a B type:

interface Collection<A> {
  A reduceLeft(BiFunction<? super A, ? super A, ? extends A> reducer);
}

However, as you can see, the net effect of A appearing once as a lower bound (? super A) and once as an upper bound (? extends A) is that A is effectively invariant: it's impossible to benefit from the wildcard bounds without downcasting. That means we can simplify the signature (which is quite similar to Stream.reduce):

interface Collection<A> {
  A reduceLeft(BiFunction<A, A, A> reducer);
}

Also, the type BiFunction<A, A, A> already has a specialization in Java 8: BinaryOperator<A>.
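
For illustration, here is a minimal sketch (not part of the original answer) of how that simplified signature could be implemented, assuming the custom Collection interface extends Iterable and keeps the question's behaviour of returning null for an empty collection:

import java.util.Iterator;
import java.util.function.BinaryOperator;

// Sketch only: the simplified, invariant reduceLeft from above as a default method.
interface Collection<A> extends Iterable<A> {

    default A reduceLeft(BinaryOperator<A> reducer) {
        Iterator<A> it = iterator();
        if (!it.hasNext()) {
            return null; // empty collection, mirroring the question's implementation
        }
        A acc = it.next();
        while (it.hasNext()) {
            acc = reducer.apply(acc, it.next());
        }
        return acc;
    }
}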

answered by Ionuț G. Stan


You can't; Java doesn't consider this functionality useful enough. Maybe that will change now that Java is making greater use of higher-order functions.

I would emulate this functionality by accepting a function that "witnesses" that A <: B:

import java.util.function.BinaryOperator;
import java.util.function.Function;

interface Collection<A> {
  static class Witness {
    static <B, A extends B> Function<A, B> witness() {
      return new Function<A, B>() {
        public B apply(A value) {
          return value;
        }
      };
    }
  }

  //Could take a custom type instead of Function if we want to enforce
  //that arbitrary functions aren't passed.
  //Could also just use foldLeft rather than reduceLeft
  <B> B reduceLeft(BinaryOperator<B> mapper, Function<A, B> witness);
}

Collection<Double> collectionOfDoubles = ...
BinaryOperator<Number> numberFunction = ...
//I haven't tested whether we need to pass explicit type arguments
collectionOfDoubles.reduceLeft(numberFunction, Collection.Witness.witness());

But we're starting down the road of greenspunning here.

Note that there is real value in the variance; with @Ionuț's solution, for example, the collectionOfDoubles.reduceLeft call above is not possible, since Double and Number are not the same type.
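
To make that concrete, here is a small illustrative sketch (the method name is an assumption, not from the answer), assuming the Collection<Double> type declares the invariant signature A reduceLeft(BinaryOperator<A> reducer) from the first answer: the only way to apply a BinaryOperator<Number> to it is an adapting lambda with a downcast, which can fail at runtime if the operator returns a Number that isn't a Double:

import java.util.function.BinaryOperator;

class VarianceDemo {
    // Adapts a BinaryOperator<Number> to the invariant Double-based signature;
    // the cast may throw ClassCastException at runtime.
    static Double reduceWithDowncast(Collection<Double> collectionOfDoubles,
                                     BinaryOperator<Number> numberFunction) {
        return collectionOfDoubles.reduceLeft(
                (a, b) -> (Double) numberFunction.apply(a, b));
    }
}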

answered by lmm


A simple solution is to resort to static methods. Then you can declare both the element type of the collection and the return type of the reduction, and thus easily express the relationship E extends R:

public static <R, E extends R> R reduceLeft(
    Collection<? extends E> c, BiFunction<? super R, ? super E, ? extends R> mapper) {
    if(c.isEmpty()) return null;
    Iterator<? extends E> iterator = c.iterator();
    R value = iterator.next();
    while(iterator.hasNext()) value=mapper.apply(value, iterator.next());
    return value;
}

An immediate advantage of this approach is that it works with all collections, not only with the type in which you declare the method.
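
As a usage sketch (the names are assumptions; the static method is taken to live in a hypothetical utility class called Reductions), here is a reduction of a List<Integer> into the broader type Number, i.e. exactly the E extends R case:

import java.util.List;
import java.util.function.BiFunction;

class StaticReduceDemo {
    public static void main(String[] args) {
        List<Integer> ints = List.of(1, 2, 3, 4);
        // E = Integer, R = Number, so E extends R holds.
        BiFunction<Number, Integer, Number> sum = (acc, i) -> acc.doubleValue() + i;
        // Explicit type arguments for clarity; inference may also work without them.
        // Reductions is a hypothetical utility class holding the static reduceLeft above.
        Number total = Reductions.<Number, Integer>reduceLeft(ints, sum);
        System.out.println(total); // 10.0
    }
}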

Note that you can also remove the relationship between the element type and the result type completely if you request an appropriate identity value for the function, as the Stream API also supports. This also fixes the flaw in your implementation of returning null for an empty collection:

public static <R, E> R reduceLeft(Collection<? extends E> c,
                      R identity, BiFunction<? super R, ? super E, ? extends R> mapper) {
    if(c.isEmpty()) return identity;
    R value=identity;
    for(E e: c) value=mapper.apply(value, e);
    return value;
}
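
A quick usage sketch for this variant (again assuming a hypothetical Reductions utility class): summing string lengths, where the result type R (Integer) and the element type E (String) have no subtype relationship at all:

import java.util.List;

class IdentityReduceDemo {
    public static void main(String[] args) {
        List<String> words = List.of("lower", "bounded", "types");
        // R = Integer, E = String; the identity value 0 seeds the reduction.
        Integer totalLength =
                Reductions.reduceLeft(words, 0, (len, word) -> len + word.length());
        System.out.println(totalLength); // 17
    }
}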

Now that it doesn’t require a declaration of a relationship between E and R, this version could also be an instance method of your collection type:

public <R> R reduceLeft(R identity, BiFunction<? super R, ? super E, ? extends R> mapper) {
    if(isEmpty()) return identity;
    R value=identity;
    for(E e: this) value=mapper.apply(value, e);
    return value;
}

But if you don’t like the requirement to provide an identity value, you could also provide a function for the initial conversion of the first element, which is trivial for the case that R is a super-type of E:

public <R> R reduceLeft(Function<? super E, ? extends R> cast,
                        BiFunction<? super R, ? super E, ? extends R> mapper) {
    if(isEmpty()) return null;
    Iterator<E> it=iterator();
    R value=cast.apply(it.next());
    while(it.hasNext()) value=mapper.apply(value, it.next());
    return value;
}

So if R is a supertype of E, passing t -> t or Function.identity() as the cast parameter is sufficient. But it also allows more use cases where no relationship between R and E exists.
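
As a hedged illustration (the class and method names are assumptions, not from the answer), a call site could look like this when the question's Collection<E> type declares the method above and R is a supertype of E:

import java.util.function.Function;

class CastReduceDemo {
    // Reduces a Collection<Integer> (the question's Collection type, not java.util)
    // to a Number; since every Integer already is a Number, the identity function
    // serves as the "cast".
    static Number sum(Collection<Integer> numbers) {
        return numbers.reduceLeft(
                Function.<Number>identity(),
                (acc, i) -> acc.doubleValue() + i);
    }
}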

answered by Holger