In .NET, when I add two SqlDecimals, like so:
SqlDecimal s1 = new SqlDecimal(1);
SqlDecimal s2 = new SqlDecimal(1);
SqlDecimal s3 = s1 + s2;
then s3 has precision 2, whereas both s1 and s2 have precision 1.
This seems odd, especially as the documentation states that the return value of the addition operator is "A new SqlDecimal structure whose Value property contains the sum." That is, going by the documentation, addition should not change the precision.
Am I missing something here? Is this intended behaviour?
Cheers,
Tilman
This article (http://msdn.microsoft.com/en-us/library/ms190476.aspx) explains the behavior for the SQL types, and I assume the .NET Sql data types reflect that in their behavior.
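For addition specifically, that page gives the result type as precision = max(s1, s2) + max(p1 - s1, p2 - s2) + 1 and scale = max(s1, s2). Here is a minimal sketch of that rule; the class and method names are mine, just for illustration, and are not part of any .NET API:

using System;

static class SqlTypeRules
{
    // Result precision for e1 + e2 according to the linked article,
    // where p1/p2 are the operand precisions and s1/s2 the operand scales.
    public static int AdditionResultPrecision(int p1, int s1, int p2, int s2)
    {
        return Math.Max(s1, s2) + Math.Max(p1 - s1, p2 - s2) + 1;
    }
}

// For two operands with precision 1 and scale 0:
// SqlTypeRules.AdditionResultPrecision(1, 0, 1, 0) == 0 + 1 + 1 == 2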
I find this behavior normal. Precision is defined as the maximum number of digits used to represent the Value property, not the number of digits actually used. Each SqlDecimal with Precision 1 (and scale 0) holds a value in [-9, 9], so the sum of two of them can lie anywhere in [-18, 18]. The maximum Precision required for the result is therefore 2, regardless of what the actual sum happens to be.
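You can check this directly on the values from the question; the snippet below just prints the reported precision and scale of each value (the result for s3 is what I would expect under the rule above):

using System;
using System.Data.SqlTypes;

class SqlDecimalPrecisionDemo
{
    static void Main()
    {
        SqlDecimal s1 = new SqlDecimal(1);
        SqlDecimal s2 = new SqlDecimal(1);
        SqlDecimal s3 = s1 + s2;

        // As noted in the question, both operands report precision 1 (scale 0).
        Console.WriteLine("s1: precision={0}, scale={1}", s1.Precision, s1.Scale);
        Console.WriteLine("s2: precision={0}, scale={1}", s2.Precision, s2.Scale);

        // The result follows the T-SQL rule: max(0,0) + max(1,1) + 1 = 2,
        // even though the sum (2) still fits in a single digit.
        Console.WriteLine("s3: precision={0}, scale={1}, value={2}",
            s3.Precision, s3.Scale, s3.Value);
    }
}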