I have a class with a decimal property, and I'm serializing and deserializing it using JSON.NET. The problem I'm having is that if the decimal value is, say, 100000000000023063.0, then when I deserialize it the value gets converted to 100000000000023000. I have checked the JSON, and the value is definitely recorded as 100000000000023063.0.
I've looked into it, and:

decimal.Parse("100000000000023063.0")   // 100000000000023063.0

but

var d = (decimal)100000000000023063.0;  // d == 100000000000023000
I can get around this problem by storing the value as a string and exposing a property that calls decimal.Parse(), but does anyone know why this is happening?
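The string-backed workaround mentioned above might look like this; a minimal sketch, assuming Newtonsoft.Json is referenced (the Record class and the "value" property name are invented for illustration):

```csharp
using System;
using System.Globalization;
using Newtonsoft.Json;

public class Record
{
    // Serialize/deserialize the raw string so Json.NET never routes the
    // number through a double.
    [JsonProperty("value")]
    public string RawValue { get; set; }

    // Convenience property that parses the string on demand.
    [JsonIgnore]
    public decimal Value => decimal.Parse(RawValue, CultureInfo.InvariantCulture);
}
```

With this shape, the JSON carries `"value": "100000000000023063.0"` as a string, and every digit survives the round trip.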
It's because 100000000000023063.0 is a double constant, which you're then converting to decimal, and precision is lost in that conversion.

Write it like this instead:

var d = 100000000000023063.0M;

The M suffix tells the C# compiler that the number is a decimal constant.

In fact you don't even need the decimal separator for this - i.e. 1M == 1.0M, and both are decimals.
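To see the difference side by side, here is a small self-contained demo; it assumes the .NET behavior that a double-to-decimal conversion is rounded to about 15 significant digits, which matches what the question observes:

```csharp
using System;

class Demo
{
    static void Main()
    {
        // The literal is a double first, then cast: the double-to-decimal
        // conversion rounds to ~15 significant digits, losing the "63".
        decimal viaDouble = (decimal)100000000000023063.0;

        // The M suffix makes the literal a decimal from the start: no loss.
        decimal direct = 100000000000023063.0M;

        Console.WriteLine(viaDouble); // 100000000000023000
        Console.WriteLine(direct);    // 100000000000023063.0
    }
}
```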
It is because you are casting a double to a decimal, whereas you should be defining the value as a decimal in the first place, i.e.

var q = 100000000000023063.0M;

The M suffix indicates that the numeric constant is of type decimal.
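On the deserialization side of the original problem: Json.NET can be told to parse JSON floating-point numbers as decimal rather than double, so the lossy double round trip never happens. A sketch, assuming Newtonsoft.Json is referenced (the Payment class and JSON shape are invented for illustration):

```csharp
using Newtonsoft.Json;

public class Payment
{
    public decimal Amount { get; set; }
}

public static class Example
{
    public static Payment Read(string json)
    {
        var settings = new JsonSerializerSettings
        {
            // Parse JSON floats directly as System.Decimal
            // instead of going through System.Double.
            FloatParseHandling = FloatParseHandling.Decimal
        };
        return JsonConvert.DeserializeObject<Payment>(json, settings);
    }
}
```

With this setting, reading `{"Amount": 100000000000023063.0}` preserves all the digits without resorting to a string property.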