 

Why can't Double be implicitly cast to Decimal

Tags: c#, types, casting

I don't understand the casting rules when it comes to decimal and double.

It is legal to do this

decimal dec = 10; double doub = (double) dec; 

What confuses me, however, is that decimal is a 16-byte datatype while double is only 8 bytes. Isn't casting a double to a decimal therefore a widening conversion, which should be allowed implicitly, with the example above being the one that is disallowed?

double doub = 3.2; decimal dec = doub; // CS0029: Cannot implicitly convert type 'double' to 'decimal' 
Maxim Gershkovich asked Oct 19 '11 07:10

2 Answers

If you convert from double to decimal, you can lose information - the number may be completely out of range, as the range of a double is much larger than the range of a decimal.
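A minimal sketch of that out-of-range case (the variable names and values here are illustrative, not from the answer): a double can easily hold a value beyond decimal's range of roughly ±7.9e28, so the explicit conversion has to be able to fail at runtime.

using System;

// Illustrative sketch: a double value that no decimal can represent.
double huge = 1e30;                      // comfortably inside double's range, outside decimal's (~±7.9e28)
try
{
    decimal d = (decimal)huge;           // explicit cast required; overflows here
    Console.WriteLine(d);
}
catch (OverflowException)
{
    Console.WriteLine("1e30 does not fit in a decimal.");
}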

If you convert from decimal to double, you can lose information - for example, 0.1 is exactly representable in decimal but not in double, and decimal actually uses a lot more bits for precision than double does.
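The 0.1 case can be shown in a short sketch (illustrative code, not part of the answer): the decimal stores 0.1 exactly, while the cast to double lands on the nearest representable double value.

using System;

// Illustrative sketch: 0.1 is exact as a decimal, approximate as a double.
decimal exact = 0.1m;                      // exactly 0.1
double approx = (double)exact;             // explicit cast; rounds to the nearest double

Console.WriteLine(exact);                  // 0.1
Console.WriteLine(approx.ToString("G17")); // 0.10000000000000001 - only an approximation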

Implicit conversions shouldn't lose information (the conversion from long to double might, but that's a different argument). If you're going to lose information, you should have to tell the compiler that you're aware of that, via an explicit cast.
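The long-to-double aside can also be seen in a sketch (again illustrative, not from the answer): the conversion is implicit even though a long can carry more significant digits than a double's 53-bit mantissa preserves.

using System;

// Illustrative sketch: an implicit conversion that silently loses precision.
long big = 123456789012345679;           // 18 significant digits
double d = big;                          // implicit conversion - the language allows it
Console.WriteLine(big);                  // 123456789012345679
Console.WriteLine((long)d);              // 123456789012345680 - precision silently lost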

That's why there aren't implicit conversions either way.

Jon Skeet answered Oct 13 '22 02:10


Decimal is more precise, so you would lose information when converting. That's why you can only do it explicitly; the cast is there to protect you from losing information unknowingly. See MSDN:

http://msdn.microsoft.com/en-us/library/678hzkk9%28v=VS.100%29.aspx

http://msdn.microsoft.com/en-us/library/364x0z75.aspx
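In other words, both directions compile once the cast is written out. A minimal sketch (illustrative, not taken from the linked pages):

using System;

// Illustrative sketch: explicit casts in both directions compile.
double doub = 3.2;
decimal dec = (decimal)doub;             // compiles: the cast acknowledges the possible information loss
double back = (double)dec;               // compiles: same in the other direction
Console.WriteLine($"{dec} {back}");      // 3.2 3.2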

Pieter answered Oct 13 '22 02:10