Why no non-integral enums?

Why is it that non-integral enums cannot be created? I want to know if this is a language design decision, or if there are issues with implementing this in the compiler.

In other words, is it feasible to add non-integral enums to the language, but there just isn't a justifiable need? Or, if it is justifiable but not feasible, what impediments stand in the way?

Someone give me the skinny on the rationale for not having this available in C#, pretty please.

asked Mar 23 '09 by core



2 Answers

There's no technical reason why it couldn't be done, but then we'd really be talking about a set of constants (perhaps grouped in a common class or namespace, if they're related).
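If you really want named non-integral constants in C#, a static class of const fields covers that ground. A minimal sketch (the MathConstants type and its members are made-up names for illustration):

// A sketch: named floating-point constants without an enum.
static class MathConstants {
  public const double GoldenRatio = 1.6180339887;
  public const double Sqrt2 = 1.4142135624;
}

// Usage: double r = MathConstants.GoldenRatio;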

In an enum, the numerical value is generally secondary.

In the following:

enum Fruit {
  Apple,
  Orange,
  Pear,
  Peach
}

we are only describing a set of named constants. We are saying that a variable of type Fruit can take one of these four values. The integer value of each one is really not relevant. As long as I refer to them only by name, it doesn't matter whether the value representing Apple is 0, 1, 32, -53 or 0.002534f.

Most languages do allow you to specify the value that should represent each, but that's really secondary. It's sometimes useful, but it's not part of the core purpose of enums. They are simply there to give you a set of related named constants without having to specify an integer ID for each.
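For example, here's a sketch of what C# already allows when specific values happen to be useful (the explicit values are arbitrary, chosen just for illustration):

enum Fruit {
  Apple = 1,   // explicitly assigned
  Orange = 5,
  Pear,        // implicitly 6 (previous member + 1)
  Peach        // implicitly 7
}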

Further, enums are often used to specify optional flags that can be combined with bitwise operations. That is trivial to implement if each value is represented by an integer (you just pick an integer with the bit pattern you want). Floating-point values wouldn't be useful in this context, since bitwise AND/OR operations don't make much sense on floating-point values.
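A sketch of that flag pattern (the Permissions type and its member names are hypothetical):

[Flags]
enum Permissions {
  None    = 0,
  Read    = 1,   // bit 0
  Write   = 2,   // bit 1
  Execute = 4    // bit 2
}

// Combining and testing flags:
// Permissions p = Permissions.Read | Permissions.Write;
// bool canRead = (p & Permissions.Read) != 0;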

answered by jalf

I think enums were made this way because, as opposed to what you want to do, enumerations weren't meant to hold any meaningful numerical representation.

For example, enums are perfect for describing the following:

enum Color
{
   Black,
   White,
   Red,
   Yellow,
   Blue,
   //everything else in between
}

(admittedly, color can be represented numerically by more complex types, but indulge me for the moment). How about mood?

enum Mood
{
    Happy,
    Giddy,
    Angry,
    Depressed,
    Sad
}

or taste?

enum Taste
{
    Bitter,
    Salty,
    Sweet,
    Spicy
}

I think you get the point. Bottom line: enumerations were meant to represent objects or object characteristics that are difficult to represent numerically, or that have no meaningful or practical numerical representation -- hence the arbitrary assignment to an integer, which is the most convenient data type for the job.

This is as opposed to things like dates and holidays, which do have meaningful numerical representations.
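For example, a sketch of an enum whose underlying values do carry meaning (mirroring the pattern of the built-in System.DayOfWeek; this Month type is hypothetical):

enum Month
{
    January = 1,   // the value matches the calendar month number,
    February = 2,  // so the integer itself is meaningful
    March = 3
    // ... and so on through December = 12
}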

answered by Jon Limjap