 

Why shouldn't this compile?

Tags: c#, enums

I stumbled upon this weird thing today:

http://www.yoda.arachsys.com/csharp/teasers.html

Question #5.

The code:

using System;

class Test
{
    enum Foo
    {
        Bar,
        Baz
    };

    const int One = 1;
    const int Une = 1;

    static void Main()
    {
        Foo f = One - Une;
        Console.WriteLine(f);
    }
}

Now, according to the answers at http://www.yoda.arachsys.com/csharp/teasers-answers.html for question #5:

... It's a known bug due to some optimisation being done too early, collecting constants of 0 and thinking that any known 0 constant should be convertible to the 0 value of any enum. It's with us now, and unlikely to ever be fixed as it could break some code which is technically illegal but working perfectly well. It's possible that the spec will change instead, of course.

But why?

One and Une are both const, i.e. they can be evaluated at compile time, so the expression becomes Foo f = 0. And since 0 is a valid value for any enum, why shouldn't this compile?
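
To see that the folding really does happen, here is a minimal sketch (the class name ConstantFolding and the constant Zero are illustrative additions, not part of the original code): because One - Une is a constant expression, it can even initialize another const, which only accepts compile-time constants.

using System;

class ConstantFolding
{
    const int One = 1;
    const int Une = 1;

    // One - Une is itself a constant expression, so it can initialize
    // another const; the compiler folds it to 0 at compile time.
    const int Zero = One - Une;

    static void Main()
    {
        Console.WriteLine(Zero); // prints 0
    }
}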

asked Mar 25 '16 by Snake

1 Answer

The problem is not whether the compiler can make this program work. The problem is: what does the language spec demand?

This behavior is a deviation from the spec, so it is a compiler bug.

6.1.3 Implicit enumeration conversions

An implicit enumeration conversion permits the decimal-integer-literal 0 to be converted to any enum-type and to any nullable-type whose underlying type is an enum-type.

So it must be a literal. One - Une is a constant expression that evaluates to zero, but it is not a literal zero; 0 is a literal zero.

I wonder why the spec says "decimal". This means a hexadecimal-integer-literal is not included, so 0x0 should not work either.
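
To make the distinction concrete, here is a sketch of the cases (variable names are illustrative; the "accepted in practice" remarks describe the Microsoft compiler's historically observed behavior, and the hexadecimal case in particular is an observation rather than a guarantee):

using System;

class Cases
{
    enum Foo { Bar, Baz }

    const int One = 1;
    const int Une = 1;

    static void Main()
    {
        Foo a = 0;          // legal per the spec: decimal-integer-literal 0
        Foo? b = 0;         // also legal: nullable-type whose underlying type is an enum-type

        // Neither of these is a decimal-integer-literal, so the spec does not
        // allow them, yet the compiler has historically accepted both (the bug).
        Foo c = One - Une;  // constant expression with value 0, not a literal
        Foo d = 0x0;        // hexadecimal-integer-literal, not a decimal one

        Console.WriteLine($"{a} {b} {c} {d}"); // all four print Bar
    }
}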

answered Oct 11 '22 by usr