 

Excel Vs C# Number Differences

Tags:

c#

excel

Can anyone shed light on why I might be seeing very small (10^-08) number differences when using exactly the same numbers in Excel and in C#?

I have a formula and use the same inputs. In Excel I get one number; in C# I get another. The difference is tiny.

I am using doubles in C# and doing division. I have tried using decimals, which did not make much of a difference.

EDIT: This is driving me NUTS. I have spent all morning on this. Any ideas?

Jack Kada asked Feb 11 '10 at 11:02



2 Answers

The difference is because of the different midpoint rounding rules in C# and Excel: .NET's Math.Round defaults to banker's rounding (MidpointRounding.ToEven), while Excel's ROUND rounds midpoints away from zero.

Try Math.Round(someNumber, precision, MidpointRounding.AwayFromZero);
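A minimal sketch of the difference (someNumber and precision are just placeholder values, not anything from the question):

using System;

class RoundingDemo
{
    static void Main()
    {
        // .NET's default is banker's rounding (MidpointRounding.ToEven):
        // exact midpoints go to the nearest even digit.
        Console.WriteLine(Math.Round(2.5));   // 2
        Console.WriteLine(Math.Round(3.5));   // 4

        // Excel's ROUND rounds midpoints away from zero:
        Console.WriteLine(Math.Round(2.5, MidpointRounding.AwayFromZero));  // 3
        Console.WriteLine(Math.Round(3.5, MidpointRounding.AwayFromZero));  // 4

        // With a digits argument, as suggested above. Note that 1.005 has no
        // exact binary representation, so the result may still surprise you.
        double someNumber = 1.005;   // placeholder value
        int precision = 2;
        Console.WriteLine(Math.Round(someNumber, precision, MidpointRounding.AwayFromZero));
    }
}

If your Excel formula uses ROUND (or the cell display rounds the result), matching the midpoint mode this way usually removes the discrepancy.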

Manish Basantani answered Sep 30 '22 at 01:09



With such small magnitudes of difference (10^-08, you state), I suspect that intermediate calculations are causing the problem. Note that double-precision values are 64 bits, but the x87 floating-point registers can work with 80 bits of precision. So, if you have a code sequence where the compiler keeps all your intermediate calculations in registers, you will actually get better precision than if the same calculations are performed at different points in your code, forcing the intermediate results to be held in 64-bit storage locations.

If you store values within Excel cells as part of your calculations, this, too, will result in truncating your intermediate calculations to 64 bits of precision.
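The register-precision effect is hard to reproduce portably, but here is a minimal sketch (the constants are only illustrative, not your formula) showing that mathematically equivalent double expressions need not be bit-for-bit equal, because each 64-bit operation rounds:

using System;

class PrecisionDemo
{
    static void Main()
    {
        // The same three terms, grouped differently: double arithmetic
        // rounds after every operation, so the grouping matters.
        double x = (0.1 + 0.2) + 0.3;   // 0.6000000000000001
        double y = 0.1 + (0.2 + 0.3);   // 0.6
        Console.WriteLine(x == y);            // False
        Console.WriteLine(Math.Abs(x - y));   // ~1.1E-16

        // Classic representation error: 0.1, 0.2 and 0.3 have no exact binary form.
        Console.WriteLine(0.1 + 0.2 == 0.3);  // False
        Console.WriteLine((0.1 + 0.2) - 0.3); // ~5.5E-17

        // Cancellation or a long chain of such operations can grow errors of
        // this size toward the 1E-08 level described in the question.
    }
}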

You really need to show your code: show both (a) the Excel calculations (is it a worksheet formula or are you making programmatic assignments to the cell values?) and (b) the C# calculations that you are making. If you do that, then we should be able to help you more precisely. But with the information you've given so far, we can only make broad guesses.

-- Mike

Mike Rosenblum answered Sep 30 '22 at 01:09
