
Why does an explicit cast to ‘decimal’ call an explicit operator to ‘long’?

Consider the following code:

using System;

class Program
{
    public static explicit operator long(Program x) { return 47; }

    static void Main(string[] args)
    {
        var x = new Program();
        Console.WriteLine((decimal) x);
    }
}

To my surprise, this outputs 47; in other words, the explicit operator to long is called even though the cast is to decimal.

Is there something in the C# spec that explicitly says that this should happen (if so, where exactly) or is this the result of some other rule(s) I’m missing?

Asked Nov 02 '11 by Timwi



1 Answer

The only explanation I can think of is that the compiler is smart enough to realize there is an implicit conversion from long to decimal, which it can compose with your user-defined operator to satisfy the explicit conversion from Program to decimal, since Program only defines a conversion to long.

EDIT: Here we are; conversions between numeric types are built into the language spec:

6.1.2 Implicit numeric conversions

The implicit numeric conversions are:

· From sbyte to short, int, long, float, double, or decimal.

· From byte to short, ushort, int, uint, long, ulong, float, double, or decimal.

· From short to int, long, float, double, or decimal.

· From ushort to int, uint, long, ulong, float, double, or decimal.

· From int to long, float, double, or decimal.

· From uint to long, ulong, float, double, or decimal.

· From long to float, double, or decimal.

· From ulong to float, double, or decimal.

· From char to ushort, int, uint, long, ulong, float, double, or decimal.

· From float to double.

Conversions from int, uint, long, or ulong to float and from long or ulong to double may cause a loss of precision, but will never cause a loss of magnitude. The other implicit numeric conversions never lose any information.

There are no implicit conversions to the char type, so values of the other integral types do not automatically convert to the char type.

So, when converting from Program to decimal, C# knows that it can implicitly convert from any numeric type to decimal. When performing this explicit conversion, it therefore looks for any user-defined operator that can get Program to a numeric type, and then applies the built-in implicit numeric conversion from that type to decimal.
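In other words, the cast in the question is effectively compiled as a two-step conversion. Here is a minimal sketch (the Program type mirrors the one from the question; the spelled-out form is my own illustration of what the compiler composes):

```csharp
using System;

class Program
{
    public static explicit operator long(Program x) { return 47; }

    static void Main()
    {
        var x = new Program();

        // The cast to decimal is composed of two steps:
        // 1. the user-defined explicit operator Program -> long,
        // 2. the built-in implicit numeric conversion long -> decimal.
        decimal viaCast = (decimal) x;
        decimal spelledOut = (decimal)(long) x; // equivalent spelled-out form

        Console.WriteLine(viaCast);               // 47
        Console.WriteLine(viaCast == spelledOut); // True
    }
}
```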

What would be interesting to see is what happens if you also added an explicit conversion to, say, uint that returned 48. Which one would the compiler pick?
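For what it's worth, here is a sketch of that experiment (the Widget type and the 48 return value are made up for illustration). If I'm reading the spec's "most specific target type" rule for user-defined explicit conversions (§6.4.5) correctly, the long operator should still win, because long encompasses uint (there is an implicit uint-to-long conversion), making it the most encompassing of the two candidate target types:

```csharp
using System;

class Widget
{
    public static explicit operator long(Widget x) { return 47; }
    public static explicit operator uint(Widget x) { return 48; }

    static void Main()
    {
        // Both long and uint convert implicitly to decimal, so both
        // operators are candidates for the cast. Per §6.4.5, the most
        // encompassing candidate target type is chosen, and long
        // encompasses uint; so the long operator should be selected,
        // printing 47 rather than 48.
        Console.WriteLine((decimal) new Widget());
    }
}
```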

Answered Oct 04 '22 by KeithS