
Value of C define changes unexpectedly

I have a lot of #defines in my code, and now a weird problem has cropped up.

I have this:

#define _ImmSign    010100

(I'm trying to simulate a binary number)

Obviously, I expect the number to be 10100. But when I use it, the value has changed to 4160.

What is happening here? And how do I stop it?

ADDITIONAL

Okay, so this is due to the language interpreting the number as octal. Is there some smart way, however, to force the language to interpret the numbers as plain decimal? Now that I think of it, if a leading 0 defines octal and 0x defines hexadecimal...

NomeN asked Apr 18 '10 21:04

1 Answer

Integer literals starting with a 0 are interpreted as octal, not decimal, in the same way that integer literals starting with 0x are interpreted as hexadecimal.
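To see the effect concretely, here is a minimal sketch (the macro name IMM_SIGN is illustrative, not the asker's original):

    #include <stdio.h>

    #define IMM_SIGN 010100   /* leading 0 => octal: 1*4096 + 1*64 = 4160 */

    int main(void)
    {
        printf("%d\n", IMM_SIGN);   /* prints 4160, not 10100 */
        printf("%d\n", 10100);      /* decimal literal: prints 10100 */
        printf("%d\n", 0x10100);    /* hexadecimal literal: prints 65792 */
        return 0;
    }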

Remove the leading zero and you should be good to go.
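If the intent is a constant that reads like binary, a few alternatives are sketched below, assuming the value wanted is either the decimal digits 10100 or the actual binary value 10100 (= 20). Note the 0b form is standard only in C23 and C++14, though GCC and Clang have long accepted it as an extension:

    #define IMM_SIGN_DEC   10100                    /* decimal that merely looks binary */
    #define IMM_SIGN_HEX   0x14                     /* binary 10100 written in hex: value 20 */
    #define IMM_SIGN_SHIFT ((1 << 4) | (1 << 2))    /* set bits 4 and 2: also 20 */
    #define IMM_SIGN_BIN   0b10100                  /* binary literal: C23 / C++14 / extension */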

Note also that identifiers beginning with an underscore followed by a capital letter or another underscore are reserved for the implementation, so you shouldn't define them in your code.
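For example, _ImmSign (underscore followed by a capital letter) falls into that reserved set; a rename along these lines avoids it (the replacement name is just an illustration):

    #define IMM_SIGN 10100   /* no leading underscore: not reserved */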

James McNellis answered Oct 21 '22 08:10