 

Puzzled with Java8 Stream performance

I can't understand why iterating multiple times over the same array with the Stream API results in such different performance!

See the code below.

import java.util.stream.IntStream;

public class WhyIsDifferent {

    public static void main(String[] args) {
        int[] values = getArray();
        Iterate(values, 598, 600); // 70 ms
        Iterate(values, 200, 202); // 0 ms
        Iterate(values, 700, 702); // 0 ms
        Iterate(values, 300, 310); // 1 ms
    }

    public static void Iterate(int[] values, int from, int to) {
        long start = System.currentTimeMillis();
        IntStream.of(values).filter(i -> i < to && i > from)
                 .forEach(i ->
                     System.out.println(i) // do something
                 );
        System.out.println("Time:" + (System.currentTimeMillis() - start));
    }

    public static int[] getArray() {
        int[] values = new int[1000];
        for (int i = 0; i < 1000; i++) {
            values[i] = i;
        }
        return values;
    }
}

Surely the JVM optimizes the code somehow, but I don't understand how this is happening. It is amazing! Do you have any clue why this happens?
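If it is really some one-time cost rather than something about the particular range, I would expect calling the same range twice in a row to show the slowdown only once. A rough check along those lines (not a proper benchmark, just the same Iterate method called twice) would be:

    int[] values = getArray();
    Iterate(values, 598, 600); // presumably slow only here, if the cost is one-time
    Iterate(values, 598, 600); // same range again
    Iterate(values, 200, 202);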

--

I'm testing on Ubuntu 14.04 / Oracle JDK / Intel CPU.

Morteza Adi asked Nov 24 '14 16:11


1 Answer

It is not the JIT compiler. Most of those 70 milliseconds are spent on the initialization of the whole lambda subsystem (the entry point to that logic is probably the LambdaMetaFactory class) and a good bit is also spent on the lambda bootstrap call (the linkage phase, as mentioned by user fge). Check out this method, the same as yours but with all the steps separately measured (and I use nanoTime):

// Requires: import java.util.function.IntConsumer; import java.util.function.IntPredicate;
// import java.util.stream.IntStream; and import static java.util.concurrent.TimeUnit.NANOSECONDS;
public static void Iterate(int[] values, int from, int to) {
  long start = System.nanoTime();
  final IntPredicate predicate = i -> i < to && i > from;
  System.out.println("Predicate lambda creation time:" + NANOSECONDS.toMillis(System.nanoTime() - start));
  start = System.nanoTime();
  final IntConsumer action = System.out::println;
  System.out.println("Action lambda creation time:" + NANOSECONDS.toMillis(System.nanoTime() - start));
  start = System.nanoTime();
  final IntStream stream = IntStream.of(values).filter(predicate);
  System.out.println("Stream creation time:" + NANOSECONDS.toMillis(System.nanoTime() - start));
  start = System.nanoTime();
  stream.forEach(action);
  System.out.println("Stream consumption time:" + NANOSECONDS.toMillis(System.nanoTime() - start));
}

This is what was printed on my machine:

Predicate lambda creation time:53
Action lambda creation time:2
Stream creation time:2
599
Stream consumption time:1
Predicate lambda creation time:0
Action lambda creation time:0
Stream creation time:0
201
... all timings zero from here on...

You can see that the entire overhead of the first call is in the lambda creation part (which, on the first run only, includes general initialization and linkage) and stream creation also takes some time. Actual stream consumption takes zero time in all cases.
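As a rough illustration (a sketch, not a rigorous benchmark), creating any throwaway lambda before the measured calls should pay most of that one-time initialization up front, so the first measured Iterate call no longer shows the big spike:

public static void main(String[] args) {
    // Warm-up: the first lambda created in the process triggers the general
    // lambda-subsystem initialization (LambdaMetaFactory etc.). Doing it here,
    // outside any measurement, should absorb most of that ~70 ms.
    Runnable warmUp = () -> { };
    warmUp.run();

    int[] values = getArray();
    Iterate(values, 598, 600); // should now be much closer to the other calls
    Iterate(values, 200, 202);
    Iterate(values, 700, 702);
    Iterate(values, 300, 310);
}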

This effect is definitely something to keep in mind with the current version of HotSpot: lambda bootstrap is an expensive thing.

Final note: if you reorder the lambda creation statements, you'll see that the majority of the time stays with the first lambda to be created. This shows that it is really only the very first lambda creation in the process that bears most of the initialization cost.
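For example, a reordered variant of the instrumented method above (same measurements, only the method-reference consumer is now created first) should show the bulk of the time charged to the action lambda instead of the predicate:

public static void Iterate(int[] values, int from, int to) {
  long start = System.nanoTime();
  final IntConsumer action = System.out::println;          // now created first
  System.out.println("Action lambda creation time:" + NANOSECONDS.toMillis(System.nanoTime() - start));
  start = System.nanoTime();
  final IntPredicate predicate = i -> i < to && i > from;  // now created second
  System.out.println("Predicate lambda creation time:" + NANOSECONDS.toMillis(System.nanoTime() - start));
  start = System.nanoTime();
  IntStream.of(values).filter(predicate).forEach(action);
  System.out.println("Filter + consumption time:" + NANOSECONDS.toMillis(System.nanoTime() - start));
}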

Marko Topolnik answered Oct 11 '22 13:10