
Java 8 streams - stackoverflow exception

Running the following code sample ends with:
"Exception in thread "main" java.lang.StackOverflowError"

import java.util.stream.IntStream;
import java.util.stream.Stream;

public class TestStream {

    public static void main(String[] args) {
        Stream<String> reducedStream = IntStream.range(0, 15000)
            .mapToObj(Abc::new)
            .reduce(
                Stream.of("Test")
                , (str , abc) -> abc.process(str)
                , (a , b) -> {throw new IllegalStateException();}
        );
        System.out.println(reducedStream.findFirst().get());
    }

    private static class Abc { 
        public Abc(int id) {
        }

        public Stream<String> process(Stream<String> batch) {
            return batch.map(this::doNothing);
        }

        private String doNothing(String test) {
            return test;
        }
    }
}

What exactly is causing that issue? Which part of this code is recursive and why?

asked Apr 15 '16 by slowikps

1 Answer

Your code isn't recursively looping. You can verify this by using smaller numbers for the IntStream range (e.g. 1 or 100). In your case it's the actual stack size limit that causes the problem. As pointed out in some of the comments, it's the way the streams are processed.

Each invocation on the stream creates a new wrapper stream around the original one. The findFirst() method asks the previous stream for elements, which in turn asks its previous stream for elements, and so on, because streams are not real containers but only pointers to the elements of the result.
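To make this concrete, here is a minimal, self-contained sketch (not taken from the question; the class name is just for illustration) that shows the same effect without any reduce at all: every map() call only adds one more wrapper around the previous stream, and the terminal operation has to walk the whole chain in one nested call sequence.

import java.util.stream.Stream;

public class DeepWrapping {

    public static void main(String[] args) {
        Stream<String> s = Stream.of("Test");
        for (int i = 0; i < 15000; i++) {
            s = s.map(x -> x); // each call wraps the previous stream, nothing is evaluated yet
        }
        // the terminal operation pushes the element through all 15000 wrapper stages
        // as nested calls, which overflows the stack for a sufficiently deep chain
        System.out.println(s.findFirst().get());
    }
}

Whether it actually overflows depends on the depth of the chain and the configured stack size, which is the same reason the original code works with a smaller IntStream range.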

The wrapper explosion happens in the reduce method's accumulator (str, abc) -> abc.process(str). The implementation of the method creates a new stream wrapper around the result (str) of the previous operation, which feeds into the next iteration, creating a new wrapper around result(result(str)), and so on. So the accumulation mechanism is that of a wrapper (recursion) and not of an appender (iteration). Creating a new stream from the actual (flattened) result, instead of from a reference to the potential result, stops the explosion, i.e.

public Stream<String> process(Stream<String> batch) {
    // requires java.util.stream.Collectors to be imported
    return Stream.of(batch.map(this::doNothing).collect(Collectors.joining()));
}

This method is just an example; like your original code it does nothing meaningful and is purely an illustration. It flattens the elements of the stream returned by the map method into a single string and creates a new stream from that concrete string rather than from another stream, and that is the difference to your original code.
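Put together, a self-contained version of your class with that change applied (note the additional java.util.stream.Collectors import) would look like this and should no longer overflow the stack:

import java.util.stream.Collectors;
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class TestStream {

    public static void main(String[] args) {
        Stream<String> reducedStream = IntStream.range(0, 15000)
            .mapToObj(Abc::new)
            .reduce(
                Stream.of("Test")
                , (str, abc) -> abc.process(str)
                , (a, b) -> { throw new IllegalStateException(); }
            );
        System.out.println(reducedStream.findFirst().get());
    }

    private static class Abc {
        public Abc(int id) {
        }

        public Stream<String> process(Stream<String> batch) {
            // collect the incoming stream into a plain String and wrap it in a fresh,
            // one-element stream, so no wrapper chain builds up across iterations
            return Stream.of(batch.map(this::doNothing).collect(Collectors.joining()));
        }

        private String doNothing(String test) {
            return test;
        }
    }
}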

You could tune the stack size using the -Xss parameter, which defines the size of the stack per thread. The default value depends on the platform; see also the question 'What is the maximum depth of the java call stack?'. But take care when increasing it, as this setting applies to all threads.
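For example (the value is only illustrative; pick a size that fits your application):

java -Xss8m TestStream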

answered Sep 30 '22 by Gerald Mücke