This program returns 0 on my machine:
#include <stdbool.h>

union U {
    _Bool b;
    char c;
};

int main(void) {
    union U u;
    u.c = 3;
    _Bool b = u.b;
    if (b == true) {
        return 0;
    } else {
        return 1;
    }
}
AFAICT, _Bool is an integer type that can store at least 0 and 1, and true is the integer constant 1. On my machine, sizeof(_Bool) == 1 and CHAR_BIT == 8, which means that a _Bool object has 256 possible representations.
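For reference, a minimal check of those numbers on a given implementation (nothing assumed beyond the standard headers):

#include <limits.h>
#include <stdio.h>

int main(void) {
    /* Report how many bits a _Bool object occupies on this implementation. */
    printf("sizeof(_Bool) = %zu, CHAR_BIT = %d, object bits = %zu\n",
           sizeof(_Bool), (int)CHAR_BIT, sizeof(_Bool) * CHAR_BIT);
    return 0;
}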
I can't find much in the C standard about the trap representations of _Bool, and I can't find whether creating a _Bool with a representation different from 0 or 1 (on implementations that support more than two representations) is OK, and if it is OK, whether those representations denote true or false.
What I can find in the standard is what happens when a _Bool is compared with an integer: the integer is converted to the 0 representation if it has the value 0, and to the 1 representation if it has a value different from zero, such that the snippet above ends up comparing two _Bools with different representations: _Bool[3] == _Bool[1].
I can't find much in the C standard about what the result of such a comparison is. Since _Bool is an integer type, I'd expect the rules for integers to apply, such that the equality comparison only returns true if the representations are equal, which is not the case here.
Since on my platform this program returns 0, it would appear that this rule does not apply here.
Why does this code behave like this? (i.e. what am I missing? Which representations of _Bool are trap representations and which ones aren't? How many representations can represent true and false? What role do padding bits play in this? etc.)
What can portable C programs assume about the representation of _Bool?
Historically, C did not have a dedicated boolean data type and normally uses integers for boolean testing: 0 represents false, and 1 (in fact, any nonzero integer) represents true.
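A trivial sketch of that convention with a plain int:

#include <stdio.h>

int main(void) {
    int flag = 5;               /* any nonzero value counts as true */
    if (flag) {
        printf("true\n");
    } else {
        printf("false\n");
    }
    return 0;
}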
Footnote 122 in the C11 standard says:
While the number of bits in a _Bool object is at least CHAR_BIT, the width (number of sign and value bits) of a _Bool may be just 1 bit.
So on a compiler where _Bool has only one value bit, only one of the bits of the char will have an effect when you read it from memory as a _Bool. The other bits are padding bits, which are ignored.
When I test your code with GCC, the _Bool member gets a value of 1 when assigning an odd number to u.c and 0 when assigning an even number, suggesting that it only looks at the lowest bit.
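A sketch of that experiment (the printed values are what was observed with GCC; the standard does not guarantee them, since the padding bits could in principle produce a trap representation):

#include <stdio.h>

union U {
    _Bool b;
    char c;
};

int main(void) {
    union U u;
    for (int i = 0; i < 8; i++) {
        u.c = (char)i;
        /* With the GCC build described above, u.b follows the lowest bit of u.c. */
        printf("u.c = %d -> u.b = %d\n", i, (int)u.b);
    }
    return 0;
}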
Note that the above is true only for type punning. If you instead convert (implicitly or with an explicit cast) a char to a _Bool, the value will be 1 if the char was nonzero.
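For example, a conversion (as opposed to the type punning above) is guaranteed to yield 0 or 1:

#include <stdio.h>

int main(void) {
    char c = 2;
    _Bool b = c;                /* conversion, not type punning: any nonzero value becomes 1 */
    printf("b = %d\n", (int)b); /* prints 1 */
    return 0;
}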