Given the following enum:
public enum Operations_PerHourType : byte { Holes = 1, Pieces = 2, Sheets = 3, Strips = 4, Studs = 5 }
When I run the Microsoft code analysis tool, it tells me:
CA1028 : Microsoft.Design : If possible, make the underlying type of 'Enums.Operations_PerHourType' System.Int32 instead of 'byte'.
It will never have more than a couple of possible values, so I declared it as a byte. Why would they recommend using Int32? Is it for future scalability (more values), or is there a performance improvement?
Each enum type has a corresponding integral type called the underlying type of the enum type. This underlying type shall be able to represent all the enumerator values defined in the enumeration. If the enum_base is present, it explicitly declares the underlying type.
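As a minimal illustration of that enum_base clause, based on the enum from the question (the Operations_PerHourType_Default name is only there for contrast):

// No enum_base: the underlying type defaults to int (System.Int32).
public enum Operations_PerHourType_Default { Holes = 1, Pieces = 2, Sheets = 3, Strips = 4, Studs = 5 }

// enum_base explicitly declared as byte, as in the question.
public enum Operations_PerHourType : byte { Holes = 1, Pieces = 2, Sheets = 3, Strips = 4, Studs = 5 }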
Enums are lists of constants. When you need a predefined list of values that represent some kind of numeric or textual data, you should use an enum. You should always use an enum when a variable (especially a method parameter) can only take one out of a small set of possible values.
Advantages of an enum: it improves type safety through compile-time checking, which helps avoid errors at run time; it can be used directly in a switch; its values can be traversed; and in some languages (such as Java) an enum can even have fields, constructors and methods.
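A quick sketch of the switch and traversal points, assuming the enum from the question is in scope (the PrintRate method name is just an example):

using System;

class EnumUsageDemo
{
    // Switching on the enum: the compiler checks that each case is a valid member.
    static void PrintRate(Operations_PerHourType type)
    {
        switch (type)
        {
            case Operations_PerHourType.Holes:
                Console.WriteLine("Counting holes per hour");
                break;
            case Operations_PerHourType.Pieces:
                Console.WriteLine("Counting pieces per hour");
                break;
            default:
                Console.WriteLine("Counting " + type + " per hour");
                break;
        }
    }

    static void Main()
    {
        // Traversing all defined values of the enum.
        foreach (Operations_PerHourType value in Enum.GetValues(typeof(Operations_PerHourType)))
        {
            PrintRate(value);
        }
    }
}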
Have a look at MSDN for the reason.
Here is an excerpt:
An enumeration is a value type that defines a set of related named constants. By default, the System.Int32 data type is used to store the constant value. Even though you can change this underlying type, it is not necessary or recommended for most scenarios. Note that no significant performance gain is achieved by using a data type that is smaller than Int32. If you cannot use the default data type, you should use one of the Common Language System (CLS)-compliant integral types, Byte, Int16, Int32, or Int64 to make sure that all values of the enumeration can be represented in CLS-compliant programming languages.
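You can confirm the underlying type at run time with Enum.GetUnderlyingType (a small check against the enum as declared in the question):

using System;

class UnderlyingTypeCheck
{
    static void Main()
    {
        // Prints System.Byte for the byte-backed enum from the question;
        // it would print System.Int32 if the ": byte" were removed.
        Console.WriteLine(Enum.GetUnderlyingType(typeof(Operations_PerHourType)));
    }
}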
There are specific situations where narrowing the underlying type brings some advantages, for example for performance, or to force a particular memory layout when interfacing to unmanaged code.
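The interop case can look like this (a hedged sketch; the NativeRecord struct and its fields are made up purely for illustration of a layout where the operation type must occupy exactly one byte):

using System.Runtime.InteropServices;

// Hypothetical native record whose first field is a single byte.
[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct NativeRecord
{
    public Operations_PerHourType Type; // byte-backed enum keeps this field exactly 1 byte wide
    public ushort Count;
}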
For the memory side, consider this sample:
using System;

public enum Operations_PerHourType // : byte
{
    Holes = 1,
    Pieces = 2,
    Sheets = 3,
    Strips = 4,
    Studs = 5
}

class Program
{
    static void Main()
    {
        long before = GC.GetTotalMemory(false);
        var enums = new Operations_PerHourType[10000];
        long after = GC.GetTotalMemory(false);
        Console.WriteLine(after - before);
        // output (byte): 12218 (I'm using Mono 2.8)
        // output (Int32): 40960
    }
}
This code consumes roughly 40 KB of the heap. Now specify (uncomment) the underlying type as byte and recompile. Wow. Suddenly we only need roughly 12 KB.
Compacting memory like this may sometimes make a program slower, not faster, depending on the particular access patterns and data sizes. There is no way to know for sure other than to make some measurements and attempt to generalize to other possible circumstances. Sequential traversal of smaller data is usually faster.
However, developing a habit of specifying narrow types just because it is usually possible and sometimes crucial is not a good idea. Memory savings rarely materialize, because of the memory alignment of surrounding wider data types. Performance is then either the same or slightly worse, due to the additional instructions needed to mask away padding bytes.
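To see why the savings often do not materialize, consider a byte-backed enum sitting next to an int in a struct. The rough illustration below uses Marshal.SizeOf, which reports the marshaled layout; the managed layout is decided by the CLR and may differ, but the alignment effect is the same idea:

using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct WithByteEnum
{
    public Operations_PerHourType Type; // 1 byte, but padded so the following int stays aligned
    public int Quantity;
}

class PaddingDemo
{
    static void Main()
    {
        // Typically prints 8: the 1-byte enum is padded to a 4-byte boundary,
        // so nothing is saved here compared to an int-backed enum.
        Console.WriteLine(Marshal.SizeOf(typeof(WithByteEnum)));
    }
}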
As another answer has already put it well, follow the Int32 crowd that the runtime is optimized for, until you have to start profiling and addressing real memory hogs in your application.