#include <iostream>
using namespace std;

int main()
{
    char MCU = 0b00000000;
    char al_av = 0b10100000;

    // Before bit operation
    cout << "MCU = " << int(MCU) << endl;

    MCU = MCU | al_av;

    // After the bit operation
    cout << "MCU = " << int(MCU) << endl; // Expected 160, got -96

    char temp = 160;
    cout << temp; // got the 'a with apostrophe' character

    return 0;
}
I expected the output of char temp to be a negative number (or a warning / error) because 160 exceeds the [-127, 127] interval, but instead the result was the one in the ASCII table (a with apostrophe).
On cppreference:
char - type for character representation which can be most efficiently processed on the target system (has the same representation and alignment as either signed char or unsigned char, but is always a distinct type)
I don't understand what is written in italic (also, I'm not sure it helps a lot for this question). Is there any implicit conversion?
Why can signed char hold values bigger than 127?
It cannot.
char x = 231;

Here, there is an (implicit) integer conversion: 231 is a prvalue of type int, and it takes the value -25 when it is converted to char (which is signed on your system). You can ask your compiler to warn you about it with -Wconstant-conversion.
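For example, a minimal sketch of the same conversion (assuming char is 8 bits wide and signed on the target platform, as it is on the asker's system):

#include <iostream>

int main()
{
    char x = 231;                // the int 231 is converted to char; with a signed
                                 // 8-bit char this typically becomes 231 - 256 = -25
    std::cout << int(x) << '\n'; // prints -25 on such a platform
    return 0;
}

Compiling this with clang's -Wconstant-conversion (or GCC's -Wconversion) should produce a warning that the constant changes value when converted to char.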
char - type for character representation which can be most efficiently processed on the target system (has the same representation and alignment as either signed char or unsigned char, but is always a distinct type)
I don't understand what is written in italic
This isn't related to what the type can hold; it only ensures that the three types char, signed char and unsigned char have common properties.
From C++14, char, if signed, must be a 2's complement type. That means it has a range of at least -128 to +127. It's important to know that the range could be larger than this, so it's incorrect to assume that a number greater than 127 cannot be stored in a char if it is signed. Use std::numeric_limits<char>::max() to get the real upper limit on your platform.
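For example, a short check (a sketch; the printed values depend on your platform and compiler):

#include <iostream>
#include <limits>

int main()
{
    // Query the actual properties of plain char on this platform.
    std::cout << "char min: " << int(std::numeric_limits<char>::min()) << '\n';
    std::cout << "char max: " << int(std::numeric_limits<char>::max()) << '\n';
    std::cout << "char is signed: " << std::boolalpha
              << std::numeric_limits<char>::is_signed << '\n';
    return 0;
}

On a typical x86/x86-64 system with GCC or Clang this prints -128, 127 and true; on ARM Linux, plain char is usually unsigned, so you would get 0, 255 and false instead.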
If you do assign a value larger than this to a char, and char is signed, then the behaviour of your code is implementation-defined. Typically that means wrap-around to a negative value, which is practically universal behaviour for a signed char type.
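Applied to the code in the question (a sketch assuming an 8-bit signed char, which matches the output the asker saw): the wrap-around already happens when 0b10100000 (the int 160) is stored into al_av, and the bitwise OR then simply propagates that value. If you want to see the unsigned value of the bit pattern, one common way is to go through unsigned char:

#include <iostream>

int main()
{
    char MCU = 0b00000000;
    char al_av = 0b10100000;  // the int 160 is converted to char here: typically wraps to -96

    // Both operands are promoted to int (0 and -96) before the OR;
    // the result, -96, fits back into the char unchanged.
    MCU = MCU | al_av;

    std::cout << int(MCU) << '\n';                // -96
    std::cout << int((unsigned char)MCU) << '\n'; // 160: the same bit pattern read as unsigned
    return 0;
}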
Note also that ASCII is a 7-bit encoding, so it's wrong to say that a character outside the range 0-127 is ASCII. ASCII is also not the only encoding supported by C++; there are others.
Finally, the distinct types: even if char is signed, it is a different type from signed char. This means that the code
#include <utility> // std::swap

int main() {
    char c;
    signed char d;
    std::swap(c, d); // no matching overload: char and signed char are distinct types
}
will always result in a compile error.