
Decimal order of addition affects results

Tags:

c#

math

decimal

I have a system that performs lots of calculations using decimals. Occasionally it adds up the same set of numbers but returns different results, off by ±0.000000000000000000000000001.

Here is a short example:

decimal a = 2.016879990455473621256359079m;
decimal b = 0.8401819425625631128956517177m;
decimal c = 0.4507062854741283043456903406m;
decimal d = 6.7922317815078349615022988627m;

decimal result1 = a + b + c + d;
decimal result2 = a + d + c + b;

Console.WriteLine((result1 == result2) ? "Same" : "DIFFERENT");
Console.WriteLine(result1);
Console.WriteLine(result2);

That outputs:

DIFFERENT
10.100000000000000000000000000
10.100000000000000000000000001

The differences are so small that there is no practical effect, but has anyone seen something like this before? I expected that adding up the same numbers would always give the same result.

asked Jun 02 '11 23:06 by BrandonAGr


2 Answers

The entire field of numerical analysis is devoted to studying these kinds of effects and how to avoid them.

To produce the best result when summing a list of floating point numbers, first sort the list from smallest to largest, and add them up in that order.
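A minimal C# sketch of that approach (the `SortedSum` helper name is my own, not from the answer): sorting by magnitude first also makes the result independent of the order the values arrive in.

```csharp
using System;
using System.Linq;

class SortedSumDemo
{
    // Sum values in ascending order of magnitude, so small terms are
    // accumulated before they can be swamped by large partial sums.
    static decimal SortedSum(decimal[] values) =>
        values.OrderBy(v => Math.Abs(v))
              .Aggregate(0m, (acc, v) => acc + v);

    static void Main()
    {
        decimal a = 2.016879990455473621256359079m;
        decimal b = 0.8401819425625631128956517177m;
        decimal c = 0.4507062854741283043456903406m;
        decimal d = 6.7922317815078349615022988627m;

        // Because both calls sort into the same order internally,
        // they add in the same sequence and agree exactly.
        Console.WriteLine(SortedSum(new[] { a, b, c, d }));
        Console.WriteLine(SortedSum(new[] { a, d, c, b }));
    }
}
```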

answered Sep 25 '22 17:09 by Greg Hewgill


You might expect the decimal type to be immune to the bane of double users everywhere.

But decimal has only 28-29 significant digits of precision, and your inputs each use 28 digits, so you are right at the edge of what the type can represent exactly. Intermediate sums that need more digits than fit must be rounded, and which roundings occur depends on the order of the additions.
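To see that edge directly, here is a small sketch (mine, not the answerer's). decimal stores a 96-bit integer mantissa (maximum about 7.92e28), so a 1 in the 28th decimal place survives when added to 7 but is rounded away when added to 8:

```csharp
using System;

class PrecisionEdge
{
    static void Main()
    {
        decimal tiny = 0.0000000000000000000000000001m; // 1e-28

        // 7.0000000000000000000000000001 needs a mantissa of about 7e28,
        // which still fits in 96 bits, so the tiny term is preserved.
        Console.WriteLine(7m + tiny);

        // 8.0000000000000000000000000001 would need about 8e28, which
        // exceeds the mantissa range, so the sum is rounded and the
        // tiny term is lost.
        Console.WriteLine(8m + tiny == 8m); // True
    }
}
```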

answered Sep 23 '22 17:09 by sarnold