I'm trying to convert an enum from C++ code over to C# code, and I'm having trouble wrapping my head around it. The C++ code is:
enum FOO {
FOO_1 = 0,
FOO_2,
// etc
};
#define MASK(x) ((1 << 16) | (x))
enum DISP
{
DISP_1 = MASK(FOO_1),
DISP_2 = MASK(FOO_2),
// etc
};
What I don't understand is what MASK is doing, and how I can either emulate the functionality in C#, or understand what it's doing and set the enum DISP manually without it.
I'm not sure what I'm saying is making sense, but that's to be expected when I'm not entirely certain what I'm looking at.
When you bit-shift, all of the 1s and 0s move left or right by some number of positions. In your case, 1 << 16
creates 10000000000000000 in binary (yes, that is a 1 followed by 16 zeros), which is 65,536 in decimal.
Then it takes that number and applies |,
the bitwise-OR operator: whatever the integer value of the enum member is gets bitwise-ORed into the shifted number.
If, for example, you use MASK(FOO_4)
(which has the literal value 3), 3 is 11 in binary, so the result is 10000000000000011, i.e. 65,539. Since every FOO value here is smaller than 65,536, this is functionally the same as adding 65,536 to each value.
So when the second enum is declared, each of its members is set to the result of this masking expression applied to the corresponding FOO member.
To do the same thing in C#, try this:
enum Foo { // this may not be needed any more
    FOO_1 = 0, FOO_2 // etc
}
enum Disp { // DISP_2 gets the next value, i.e. 65536 + 1, and so forth
    DISP_1 = 65536, DISP_2, DISP_3 // etc
}
C# also allows constant expressions in enum initializers, so you could keep the shift explicit and write DISP_1 = 1 << 16 instead of the literal 65536.