I have the following code:
using System;
using System.Globalization;

var d = double.Parse("4796.400000000001", CultureInfo.InvariantCulture);
Console.WriteLine(d.ToString("G17", CultureInfo.InvariantCulture));
If I compile and run this using an x86 configuration in Visual Studio, then I get the following output:
4796.4000000000005
If I instead compile as x64 I get this:
4796.4000000000015
I realize that compiling for 32-bit versus 64-bit can affect how double values are handled by the system. But given that C# defines double as a 64-bit IEEE 754 value, shouldn't the result of this operation be the same regardless of the compilation configuration?
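As a sanity check, the two printed values can be compared as raw IEEE 754 bit patterns. This minimal sketch (not part of the original repro) suggests the two results are adjacent doubles, exactly one unit in the last place (ULP) apart:

using System;

// The two values printed by the x86 and x64 builds, as compile-time literals.
long bitsX86 = BitConverter.DoubleToInt64Bits(4796.4000000000005);
long bitsX64 = BitConverter.DoubleToInt64Bits(4796.4000000000015);
Console.WriteLine(bitsX64 - bitsX86); // should print 1: the results differ by one ULP

So whatever is going on, it is a last-bit rounding difference in the conversion, not a gross error.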
Additional observation
Based on a comment regarding double.Parse, I wrote this code:
using System;
using System.Globalization;

var d0 = double.Parse("4796.400000000001", CultureInfo.InvariantCulture);
double d1 = 4796.400000000001;
Console.WriteLine("d0: " + d0.ToString("G17", CultureInfo.InvariantCulture));
Console.WriteLine("d1: " + d1.ToString("G17", CultureInfo.InvariantCulture));
I get the following output when compiling as x86:
d0: 4796.4000000000005
d1: 4796.4000000000005
But I get this when I compile as x64:
d0: 4796.4000000000015
d1: 4796.4000000000005
Notice how the values differ in the x64 version, but not in the x86 version.
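This makes sense if you consider where each conversion happens: the literal d1 is converted from text to a double once, at compile time, by the C# compiler, so it is baked into the binary identically for both builds; only d0 goes through the runtime's parser. A small sketch (my addition, not part of the original repro) that compares the two bit for bit isolates the divergence to double.Parse:

using System;
using System.Globalization;

double parsed = double.Parse("4796.400000000001", CultureInfo.InvariantCulture); // converted at runtime
double literal = 4796.400000000001;                                              // converted at compile time
Console.WriteLine(BitConverter.DoubleToInt64Bits(parsed) == BitConverter.DoubleToInt64Bits(literal));
// Prints False on x64 .NET Framework but True on x86, pointing at the runtime parser.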
Answer

I think the simple answer is that this is a bug in the .NET Framework double parser. I filed the following ticket on the issue:
https://developercommunity.visualstudio.com/content/problem/488302/issue-with-double-parser.html
The issue has been closed as "Won't fix" with the following rationale:
The change taken in .NET Core to enable stability in these calculations was large and carried more risk than we typically take in .NET Framework.
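In other words, the fix exists but only in .NET Core. From .NET Core 3.0 onward, floating-point parsing and formatting are IEEE 754-compliant (correctly rounded), so the original repro should behave identically on both architectures. Assuming a .NET Core 3.0+ runtime:

using System;
using System.Globalization;

var d = double.Parse("4796.400000000001", CultureInfo.InvariantCulture);
Console.WriteLine(d.ToString("G17", CultureInfo.InvariantCulture));
// Should print 4796.4000000000005 on both x86 and x64.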