Decimal rounding errors upon division (C#)

I basically have four numbers (say 100, 200, 300, 400), and I need to calculate the probabilities as 100/(100+200+300+400), 200/(100+200+300+400), and so on.

When I use the decimal data type to store these probabilities, they don't add up to one due to rounding issues. What's the best way to get past this without making the probabilities too inaccurate? Basically I do this calculation many, many times, so I don't want to have to change all the divisions into Math.Round stuff. :|
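For example, here's a minimal sketch of what happens (the weights below are made up so that each quotient is a repeating decimal; with the 100/200/300/400 example above the quotients happen to be exact tenths):

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // Hypothetical weights chosen so that each quotient is a repeating
        // decimal (a multiple of 1/7) and therefore gets rounded.
        decimal[] weights = { 100m, 100m, 100m, 400m };
        decimal total = weights.Sum();                        // 700

        decimal[] probabilities = weights.Select(w => w / total).ToArray();
        decimal sum = probabilities.Sum();

        Console.WriteLine(sum);        // slightly off from 1: each quotient was rounded to decimal's ~28-digit precision
        Console.WriteLine(sum == 1m);  // False for these weights
    }
}
```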

Matt asked May 06 '11 11:05


1 Answer

The solution is straightforward: if it hurts when you do that then don't do that.

If you have rational probabilities, that is, probabilities that are ratios of whole numbers, and you want them to add to exactly one, then don't convert them to decimal or double in the first place. Use an arbitrary-precision rational type to represent your arbitrary-precision rationals.

There's an arbitrary-precision rational type included with Microsoft Solver Foundation; you could download and use that. Or it is easy to write your own: simply make an immutable struct that has two BigIntegers for the numerator and denominator, and then write implementations of the operators you need.
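A minimal sketch of the do-it-yourself approach might look like this (the type name, the normalization strategy, and the handful of operators shown are illustrative; a real implementation would add the remaining arithmetic and comparison operators):

```csharp
using System;
using System.Numerics;

// Immutable arbitrary-precision rational: numerator/denominator as BigIntegers,
// always stored in lowest terms with a positive denominator.
public readonly struct Rational
{
    public BigInteger Numerator { get; }
    public BigInteger Denominator { get; }

    public Rational(BigInteger numerator, BigInteger denominator)
    {
        if (denominator.IsZero)
            throw new DivideByZeroException();

        // Normalize: positive denominator, fraction in lowest terms.
        if (denominator.Sign < 0)
        {
            numerator = -numerator;
            denominator = -denominator;
        }
        BigInteger gcd = BigInteger.GreatestCommonDivisor(BigInteger.Abs(numerator), denominator);
        if (!gcd.IsOne)
        {
            numerator /= gcd;
            denominator /= gcd;
        }
        Numerator = numerator;
        Denominator = denominator;
    }

    public static Rational operator +(Rational a, Rational b) =>
        new Rational(a.Numerator * b.Denominator + b.Numerator * a.Denominator,
                     a.Denominator * b.Denominator);

    public static Rational operator /(Rational a, Rational b) =>
        new Rational(a.Numerator * b.Denominator, a.Denominator * b.Numerator);

    public static bool operator ==(Rational a, Rational b) =>
        a.Numerator == b.Numerator && a.Denominator == b.Denominator;

    public static bool operator !=(Rational a, Rational b) => !(a == b);

    public override bool Equals(object obj) => obj is Rational r && this == r;
    public override int GetHashCode() => (Numerator, Denominator).GetHashCode();
    public override string ToString() => $"{Numerator}/{Denominator}";
}
```

With your four weights, each probability is then an exact fraction (100/1000 reduces to 1/10, and so on), and adding them all up gives exactly 1/1; convert to decimal or double only at the very end, if you need to display the result.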

Eric Lippert answered Oct 14 '22 15:10