I have a simple method I've written for a class homework assignment that uses recursion (yes, it must use recursion) to calculate the number of triangles in a fractal pattern:
public static BigInteger triangleFract(int layer) {
    if (layer < 0) {
        throw new IllegalArgumentException("Input must be >= 0");
    } else if (layer == 0) {
        return new BigInteger("0");
    } else if (layer == 1) {
        return new BigInteger("1");
    } else {
        return triangleFract(layer - 1)
                .multiply(new BigInteger("3"))
                .add(new BigInteger("2"));
    }
}
I've been trying to work out how large the int layer can be, so I can limit user input accordingly. In my tests I get a stack overflow at around 6700+, which is fine.
What is troubling me is that when layer is in the thousands, the method usually runs, but it can still randomly encounter a StackOverflowError.
For instance, I chose to limit layer to 4444, and it seems to be able to handle that almost always, but every once in a while it still seems to overflow.
Why does it do this? And is there anything that I can do about it?
The most common cause of a stack overflow is excessively deep or infinite recursion, in which a function calls itself so many times that the space needed to store the variables and bookkeeping information for each call exceeds what fits on the stack.
To prevent stack overflow bugs, you must have a base case where the function stops making new recursive calls. If there is no base case, the calls never stop and a stack overflow eventually occurs. Here is an example of a recursive function with a base case.
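(A minimal sketch; factorial is my own illustrative choice rather than something from the original post.)

import java.math.BigInteger;

// Computes n!; the n == 0 branch is the base case that stops the recursion.
public static BigInteger factorial(int n) {
    if (n < 0) {
        throw new IllegalArgumentException("Input must be >= 0");
    } else if (n == 0) {
        return BigInteger.ONE;                                    // base case: no further recursive call
    } else {
        return factorial(n - 1).multiply(BigInteger.valueOf(n));  // recursive case
    }
}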
To prevent infinite recursion, you need at least one branch (i.e. of an if/else statement) that does not make a recursive call. Branches without recursive calls are called base cases; branches with recursive calls are called recursive cases. Functions can also be mutually recursive.
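As a small illustration of mutual recursion (isEven/isOdd are names I'm introducing purely for this sketch, not from the answer above), two methods can call each other as long as one branch eventually returns without recursing:

// isEven calls isOdd and vice versa; n == 0 is the base case that ends the chain (assumes n >= 0).
public static boolean isEven(int n) {
    if (n == 0) return true;      // base case
    return isOdd(n - 1);          // recursive case: calls the other method
}

public static boolean isOdd(int n) {
    if (n == 0) return false;     // base case
    return isEven(n - 1);         // recursive case
}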
One method to prevent stack overflow is to track stack usage with test and measurement techniques: use timer interrupts that periodically sample the stack pointer, record the deepest point it reaches, and verify that this worst case stays safely below the stack's limit.
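Java doesn't expose the stack pointer directly, so a rough analogue (a sketch of my own, not part of this answer) is to thread a depth counter through a variant of the method and record the deepest call observed:

private static int maxObservedDepth = 0;   // deepest recursion level observed so far

public static BigInteger triangleFractTracked(int layer, int depth) {
    maxObservedDepth = Math.max(maxObservedDepth, depth);   // record how deep this call chain went
    if (layer < 0) {
        throw new IllegalArgumentException("Input must be >= 0");
    } else if (layer == 0) {
        return BigInteger.ZERO;
    } else if (layer == 1) {
        return BigInteger.ONE;
    } else {
        return triangleFractTracked(layer - 1, depth + 1)
                .multiply(BigInteger.valueOf(3))
                .add(BigInteger.valueOf(2));
    }
}

// Call as triangleFractTracked(n, 1); afterwards maxObservedDepth shows how deep the recursion actually went.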
Perhaps the JVM has determined (through escape analysis) that the BigInteger can be allocated on the stack rather than the heap. Depending on when it applies this optimization, the stack space needed per call would vary.
That said, there could be many other causes, and the behaviour is likely to depend on the JVM you use.
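One practical workaround, offered as a sketch of my own rather than part of either answer: run the recursion on a thread created with an explicitly requested stack size (the four-argument Thread constructor accepts a stackSize hint), or raise the default stack size with the -Xss JVM flag. The requested size is only a hint, so the exact safe depth remains platform-dependent.

import java.math.BigInteger;

public class TriangleFractRunner {
    public static void main(String[] args) throws InterruptedException {
        BigInteger[] result = new BigInteger[1];
        // Request roughly 32 MB of stack for this thread; the JVM treats the value as a hint.
        Thread worker = new Thread(null, () -> result[0] = triangleFract(4444),
                "triangleFract-worker", 32L * 1024 * 1024);
        worker.start();
        worker.join();
        System.out.println(result[0]);
    }

    // Same recursive method as in the question.
    public static BigInteger triangleFract(int layer) {
        if (layer < 0) {
            throw new IllegalArgumentException("Input must be >= 0");
        } else if (layer == 0) {
            return BigInteger.ZERO;
        } else if (layer == 1) {
            return BigInteger.ONE;
        } else {
            return triangleFract(layer - 1)
                    .multiply(BigInteger.valueOf(3))
                    .add(BigInteger.valueOf(2));
        }
    }
}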