Can anyone shed light on why I might be seeing very small (10^-08) differences when using exactly the same numbers in Excel vs. C#?
I have a formula and use the same inputs. In Excel I get one number; in C# I get another. The difference is tiny.
I am using doubles in C# and doing division. I have tried using decimals, which did not make much of a difference.
EDIT: This is driving me NUTS. I have spent all morning on this. Any ideas?
The difference is because of different midpoint rounding rules in C# and Excel: C#'s Math.Round defaults to banker's rounding (MidpointRounding.ToEven), while Excel's ROUND rounds midpoints away from zero.
Try Math.Round(someNumber, precision, MidpointRounding.AwayFromZero);
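For example, here is a minimal sketch (the values are hypothetical, not your actual inputs) showing how C#'s default banker's rounding differs from Excel-style away-from-zero rounding:

    using System;

    class RoundingDemo
    {
        static void Main()
        {
            // 0.125 is exactly representable in binary, so the midpoint case is unambiguous.
            Console.WriteLine(Math.Round(0.125, 2));                                // 0.12 (ToEven / banker's rounding)
            Console.WriteLine(Math.Round(0.125, 2, MidpointRounding.AwayFromZero)); // 0.13 (matches Excel's ROUND)

            // The same effect with a whole-number midpoint:
            Console.WriteLine(Math.Round(2.5));                                     // 2
            Console.WriteLine(Math.Round(2.5, MidpointRounding.AwayFromZero));      // 3
        }
    }

Note that this only matters if you are explicitly rounding somewhere in the calculation; if not, the other answers about intermediate precision are more likely the cause.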
With such small magnitudes of difference (10^-08, you state), I suspect that intermediate calculations are causing the problem. Note that double-precision values are 64 bits, but the x87 floating-point registers can work with 80 bits of precision. So, if you have a code sequence where the compiler keeps all your intermediate calculations in registers, you will actually get better precision than if the same calculations are made across different points in your code, forcing the intermediate results to be held in 64-bit storage locations.
If you store values within Excel cells as part of your calculations, this, too, will result in truncating your intermediate calculations to 64 bits of precision.
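As a minimal, generic illustration of that point (this is not your formula; the values are chosen only to show the mechanism), the point at which intermediate results get rounded to 64-bit doubles changes the final answer:

    using System;

    class IntermediateRoundingDemo
    {
        static void Main()
        {
            double a = 0.1, b = 0.2, c = 0.3;

            // Floating-point addition is not associative: each intermediate sum
            // is rounded to the nearest representable double, so grouping matters.
            double left  = (a + b) + c;   // rounds up in the last bit
            double right = a + (b + c);   // the double closest to 0.6

            Console.WriteLine(left == right);   // False
            Console.WriteLine(left - right);    // ~1.1E-16
        }
    }

A single operation only differs in the last bit or so, but across a longer chain of calculations (especially when some intermediates pass through Excel cells and others do not), such differences can accumulate to the magnitude you are seeing.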
You really need to show your code: show both (a) the Excel calculations (is it a worksheet formula or are you making programmatic assignments to the cell values?) and (b) the C# calculations that you are making. If you do that, then we should be able to help you more precisely. But with the information you've given so far, we can only make broad guesses.
-- Mike