I have a lot of #defines in my code. Now a weird problem has crept up.
I have this:
#define _ImmSign 010100
(I'm trying to simulate a binary number)
Obviously, I expect the number to become 10100. But when I use the number it has changed into 4160.
What is happening here? And how do I stop it?
ADDITIONAL
Okay, so this is due to the language interpreting the literal as octal. Is there some smart way, however, to force the language to interpret the number as decimal? A leading 0 means octal, and 0x means hexadecimal, now that I think of it...
Integer literals starting with a 0 are interpreted as octal, not decimal, in the same way that integer literals starting with 0x are interpreted as hexadecimal.
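To see this for yourself, here is a minimal C snippet (the variable names are just for illustration) that prints all three interpretations of the same digits:

#include <stdio.h>

int main(void) {
    int octal_value   = 010100;  /* leading 0: octal, equals 4160 in decimal   */
    int hex_value     = 0x10100; /* leading 0x: hexadecimal, equals 65792      */
    int decimal_value = 10100;   /* no prefix: plain decimal                    */

    printf("010100  -> %d\n", octal_value);   /* prints 4160  */
    printf("0x10100 -> %d\n", hex_value);     /* prints 65792 */
    printf("10100   -> %d\n", decimal_value); /* prints 10100 */
    return 0;
}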
Remove the leading zero and you should be good to go.
Note also that identifiers beginning with an underscore followed by a capital letter or another underscore are reserved for the implementation, so you shouldn't define them in your code.
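Putting both points together, the define could look like the sketch below; ImmSign is just one possible rename that avoids the reserved leading underscore:

/* Leading zero removed so the literal is read as decimal 10100,
 * and the reserved leading-underscore name replaced. */
#define ImmSign 10100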