 

Adding two .NET SqlDecimals increases precision?

In .NET, when I add two SqlDecimals, like so:

using System.Data.SqlTypes;

SqlDecimal s1 = new SqlDecimal(1);
SqlDecimal s2 = new SqlDecimal(1);
SqlDecimal s3 = s1 + s2;

then s3 has precision 2, whereas both s1 and s2 have precision 1.

This seems odd, especially since the documentation states that the return value of the addition operator is "A new SqlDecimal structure whose Value property contains the sum." That is, according to the documentation, addition should not change the precision.

Am I missing something here? Is this intended behaviour?

Cheers,

Tilman

asked Jun 23 '11 by user812775

2 Answers

This article (http://msdn.microsoft.com/en-us/library/ms190476.aspx) explains the behavior for the SQL types, and I assume the .NET Sql data types reflect that in their behavior.
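Per that article, the result precision for e1 + e2 is max(s1, s2) + max(p1 - s1, p2 - s2) + 1 and the result scale is max(s1, s2). A minimal sketch of that rule, applied to the question's case (the helper name is mine, not part of any API):

```csharp
using System;

class AdditionPrecision
{
    // Result precision for e1 + e2 per the linked article's rule:
    //   precision = max(s1, s2) + max(p1 - s1, p2 - s2) + 1
    //   scale     = max(s1, s2)
    static int ResultPrecision(int p1, int s1, int p2, int s2) =>
        Math.Max(s1, s2) + Math.Max(p1 - s1, p2 - s2) + 1;

    static void Main()
    {
        // The question's case: both operands have precision 1, scale 0,
        // so the result precision is max(0,0) + max(1,1) + 1 = 2.
        Console.WriteLine(ResultPrecision(1, 0, 1, 0)); // prints 2
    }
}
```

The "+ 1" accounts for a possible carry digit, which is exactly the growth the question observed.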

answered Oct 04 '22 by Rick Liddle


I find this behavior normal.

Precision is defined as the maximum number of digits used to represent the Value property.

Two SqlDecimal values whose Precision is 1 (and whose Scale is 0) each lie in [-9, 9], so their sum lies in [-18, 18]. Representing that sum can require up to 2 digits, so the maximum Precision required is 2.
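A small sketch of that carry case, assuming (as the question's own example shows) that the SqlDecimal(int) constructor assigns the smallest precision that holds the value:

```csharp
using System;
using System.Data.SqlTypes;

class CarryDemo
{
    static void Main()
    {
        // Largest single-digit operands: precision 1, scale 0.
        SqlDecimal a = new SqlDecimal(9);
        SqlDecimal b = new SqlDecimal(9);

        // 9 + 9 = 18 no longer fits in one digit, so the result
        // type must allow for the carry: precision 2.
        SqlDecimal sum = a + b;

        Console.WriteLine(sum.Precision); // prints 2
        Console.WriteLine(sum.Value);     // prints 18
    }
}
```

The operator sizes the result for the worst case of the operand types, not for the particular values, which is why even 1 + 1 comes back with precision 2.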

answered Oct 04 '22 by InBetween