I know what this means:
#define M(B) (1U << ((sizeof(x) * CHAR_BIT) - B)) // CHAR_BIT = bits per byte
but I don't understand this one well:
#undef M
What happens after this? Is M cleared, deleted, or something else?
After the #undef, it's as if the #define M... line never existed.
int a = M(123); // error, M is undefined
#define M(B) (1U << ((sizeof(x) * CHAR_BIT) - B))
int b = M(123); // no error, M is defined
#undef M
int c = M(123); // error, M is undefined
Here is the MSDN article about it: http://msdn.microsoft.com/en-us/library/ts4w8783(VS.80).aspx
My understanding is that it removes the definition of M so that it may be used to define something else.
E.g.:
#define M(X) 2*(X)
int a = M(2);
ASSERT(a == 4);
#undef M
#define M(X) 3*(X)
int b = M(2);
ASSERT(b == 6);
It seems like a confusing thing to use but may come up in practice if you need to work with someone else's macros.