
Why can't I declare an enum inheriting from Byte but I can from byte?

If I declare an enum like this ...

public enum MyEnum : byte {
    Val1,
    Val2
}

... it works.

If I declare an enum like this ...

public enum MyEnum : System.Byte {
    Val1,
    Val2
}

... it doesn't work. The compiler throws:

error CS1008: Type byte, sbyte, short, ushort, int, uint, long, or ulong expected

As byte is an alias for the actual type, System.Byte, why can't I use the second declaration?
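To be clear about the premise: at runtime the keyword and the CLR type really are the same type; only the compile-time grammar distinguishes them. A small sketch (using the MyEnum declaration from above) that demonstrates this:

```csharp
using System;

public enum MyEnum : byte { Val1, Val2 }

public static class Program
{
    public static void Main()
    {
        // byte is merely a keyword alias for System.Byte...
        Console.WriteLine(typeof(byte) == typeof(Byte));            // True
        // ...and the enum's underlying type really is System.Byte.
        Console.WriteLine(Enum.GetUnderlyingType(typeof(MyEnum)));  // System.Byte
    }
}
```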

Steve B asked Feb 15 '11 15:02

People also ask

How do you inherit an enum?

Enums cannot inherit from other enums. In fact, all enums must inherit from System.Enum. C# allows syntax to change the underlying representation of the enum values, which looks like inheritance, but in actuality they still inherit from System.Enum.

Is an enum a byte?

The enum can be of any numeric data type such as byte, sbyte, short, ushort, int, uint, long, or ulong.

Can enum inherit enum?

Inheritance is not allowed for enums.

Is it possible to inherit enum in C#?

No, it is not possible. An enum cannot be inherited from, because every enum type is implicitly sealed.
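The claims above can be verified with reflection. A minimal sketch (the enum name Color is made up for illustration):

```csharp
using System;

public enum Color : byte { Red, Green, Blue }

public static class Program
{
    public static void Main()
    {
        // Every enum implicitly derives from System.Enum...
        Console.WriteLine(typeof(Color).BaseType);                 // System.Enum
        // ...every enum type is sealed, so nothing can derive from it in turn...
        Console.WriteLine(typeof(Color).IsSealed);                 // True
        // ...and its underlying type is whichever integral type was declared.
        Console.WriteLine(Enum.GetUnderlyingType(typeof(Color)));  // System.Byte
    }
}
```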


2 Answers

There are a number of questions raised here.

Why can't I declare an enum inheriting from Byte but I can from byte?

As others have noted, that is what the specification says.

The language design committee notes do not justify this decision. The notes from June of 2000 say


The keywords for the predefined types (e.g., int) and their corresponding type names (e.g., Int32) can mostly but not entirely be used interchangeably. We discussed whether we wanted to make any changes in this area. We didn’t.

Here is the list of places where they are not interchangeable.

  • "int" can be used as an underlying type for enums; Int32 can't.
  • Alias declarations can't use keywords.

Some musings of mine on the subject:

The first thing that comes to mind is that every time you give a user a choice, you give them an opportunity to write a bug, and every time you give them an opportunity to write a bug you have to make an error message for it. If we allowed "enum X : Byte" then we have to give a sensible error message when the user has accidentally erased the "using System;". We can avoid that potential confusion and all the costs of developing and testing an error-reporting heuristic by simply not allowing the choice in the first place.

The second thing that comes to mind is that the underlying type of an enum is fundamentally about the mechanism of the enum, not its meaning. It therefore seems plausible that the underlying type clause should be limited to things that require no semantic analysis; we know that the underlying type is one of eight possible types, so let's just have the user mechanically choose from one of those eight types unambiguously.

The third thing that comes to mind is that the error analysis can be performed during syntactic analysis rather than semantic analysis. The earlier the error is caught, the better.

UPDATE: I just asked one of the people who was in the room that day whether there was anything that I missed in my musings and he said that yeah, they thought about it and decided that doing the full type analysis was work for the development, test and maintenance team, work which bought the user nothing of value. (He also noted that they had similar arguments over System.Void; should it be legal to say "public static System.Void Main(string[] args)" for example? Again, they decided that this added no value for users but did add potential ambiguity and work for the team.)


Why is "char" not a legal underlying type?

Again, that's what the spec says. Again, the language design notes from October 1999 are no help in determining why:


The unsigned integral types may be used as the underlying types for enums. The only integral type that cannot be used is char.


Again, we can guess. My guess would be that enums are intended to be fancy numbers. Chars are in practice integers as an implementation detail, but logically they are not numbers, they are characters. We want to be able to do operations on enums like addition, "or-ing" and "and-ing" flags, and so on; the spec is clear that these operations are done as though they were being done on the underlying type. The char type, not logically being a number, does not define all the operators you might want. And finally, if you want a two-byte enum value then you already have short and ushort at your disposal.
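A sketch of the "fancy numbers" point (the enum names Access and Wide are hypothetical): flag operations are carried out on the underlying integral value, and ushort already covers the two-byte case that char might otherwise have served:

```csharp
using System;

[Flags]
public enum Access : byte { None = 0, Read = 1, Write = 2 }

// If you want a two-byte enum, ushort is available even though char is not.
public enum Wide : ushort { A, B }

public static class Program
{
    public static void Main()
    {
        // Or-ing flags is performed as though on the underlying byte values.
        var rw = Access.Read | Access.Write;
        Console.WriteLine(rw);                                    // Read, Write
        Console.WriteLine(rw.HasFlag(Access.Write));              // True
        Console.WriteLine(Enum.GetUnderlyingType(typeof(Wide)));  // System.UInt16
    }
}
```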


A related question from an email about this question:

A careful reading of the grammar specification says that 'char' is grammatically a legal underlying type, but the paragraph of explanatory text after the grammar says that 'char' is not a legal underlying type. Is the spec inconsistent?

Yes. I'm not losing sleep over it. If it makes you feel better, imagine that the grammar line that says

enum-base : integral-type

instead reads

enum-base : integral-type (but not char)

Eric Lippert answered Oct 16 '22 18:10


Well, it's according to the specification (§14.1). The grammar specifies that the production for enum-declaration is

enum-declaration:
    attributes_opt   enum-modifiers_opt   enum   identifier   enum-base_opt   enum-body   ;_opt

where

enum-base:
    :   integral-type

and

integral-type: one of
    sbyte   byte   short   ushort   int   uint   long   ulong   char

As for why the specification is this way, it is not clear.

Note that char is listed as a terminal for integral-type but the specification explicitly states that

Note that char cannot be used as an underlying type.

By the way, I really think best practice here is to use the aliases. I only use the .NET name instead of the C# keyword for these primitive types (and string) when I want to invoke a static method. So

Int32.TryParse

instead of

int.TryParse.

Otherwise, I say, e.g., typeof(int) and not typeof(Int32).
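Since the keyword and the type name denote the very same type, both spellings bind to the same method; a quick sketch of the equivalence:

```csharp
using System;

public static class Program
{
    public static void Main()
    {
        // The keyword and the type name are interchangeable here...
        Console.WriteLine(typeof(int) == typeof(Int32));  // True
        // ...so these two calls resolve to the same TryParse method.
        Int32.TryParse("42", out int a);
        int.TryParse("42", out int b);
        Console.WriteLine(a == b && a == 42);             // True
    }
}
```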

jason answered Oct 16 '22 17:10