 

Java: Simple BigDecimal logical error

I've got a simple piece of code that isn't behaving as it should.

This piece of code attempts to add up an array of BigDecimals and then divide by the array length to find the average. However, the first phase of the algorithm fails to accumulate the values correctly (in the variable "sum").

public BigDecimal getAverageHeight()
{
    BigDecimal sum = new BigDecimal(0);
    BigDecimal[] heights = getAllHeights();

    for (int a = 0; a < heights.length; a++)
    {
        sum.add(heights[a]);
        System.out.println("Height[" + a + "] = " + heights[a]);
        System.out.println("Sum = " + sum.setScale(2, BigDecimal.ROUND_HALF_UP));
    }        

    return sum.divide(new BigDecimal(heights.length));
}

The output is as follows:

Height[0] = 24  
Sum = 0.00  
Height[1] = 24  
Sum = 0.00  
Height[2] = 24  
Sum = 0.00  
Height[3] = 26  
Sum = 0.00  
Height[4] = 26  
Sum = 0.00  
Height[5] = 26  
Sum = 0.00

I'm sure it's a simple error, but I'm getting tired of staring at the problem. Thanks in advance.

asked Jan 30 '26 by cworner1

1 Answer

BigDecimal is immutable: add() returns a new BigDecimal holding the sum and does not modify the object it is called on. Do this instead:

sum = sum.add(heights[a]);
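
For completeness, a corrected version of the method could look something like this (keeping the scale of 2 used in the question; passing a scale and rounding mode to divide() also avoids an ArithmeticException when the average has a non-terminating decimal expansion):

public BigDecimal getAverageHeight()
{
    BigDecimal sum = BigDecimal.ZERO;
    BigDecimal[] heights = getAllHeights();

    for (int a = 0; a < heights.length; a++)
    {
        // add() returns a new BigDecimal, so reassign the result to sum
        sum = sum.add(heights[a]);
    }

    // Divide with an explicit scale and rounding mode
    return sum.divide(new BigDecimal(heights.length), 2, BigDecimal.ROUND_HALF_UP);
}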
