Why does char overflow when multiplied by an integer literal while unsigned char doesn't, given that both are promoted to int?
#include <iostream>
using namespace std;

int main()
{
    char MAX1 = 200;
    unsigned char MAX2 = 200;
    cout << MAX1 * 3 << endl;
    cout << MAX2 * 3 << endl;
    return 0;
}
The above code outputs:
-168
600
The difference is not in the multiplication, as you assumed, but in the assignment at the beginning. The important thing to note here is that it is implementation-defined whether char is signed or unsigned.
On your platform, char is apparently signed and thus cannot represent 200. That leaves us with a variable MAX1 holding an implementation-defined value, in this case -56 (due to two's complement on common platforms).
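You can check both facts directly. The sketch below assumes an 8-bit, two's-complement char, so 200 wraps to 200 - 256 = -56; the exact value stored is implementation-defined, and whether char is signed at all varies by platform:

#include <iostream>
#include <limits>

int main()
{
    // Is plain char signed on this platform? (implementation-defined)
    std::cout << std::boolalpha
              << std::numeric_limits<char>::is_signed << '\n';

    // 200 does not fit in a signed 8-bit char; on a two's-complement
    // platform the stored value is 200 - 256 = -56.
    char c = 200;
    std::cout << static_cast<int>(c) << '\n';
    return 0;
}

On a typical platform where char is signed, this prints true and -56.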
Then, the multiplication part is the same for both variables: the operands are promoted to int, because the conversion rank of (unsigned) char is less than the conversion rank of int, and we end up with the equivalent of
cout << int(-56) * 3 << endl;
cout << int(200) * 3 << endl;
which prints
-168
600
as one would expect.
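If the goal is to get 600 regardless of char's signedness, one portable option is to widen the byte through unsigned char before it is promoted (a sketch; storing the value in an int in the first place would of course avoid the issue entirely):

#include <iostream>

int main()
{
    char max = 200;  // implementation-defined value if char is signed

    // Reinterpret the byte as unsigned before promotion to int,
    // recovering 200 on an 8-bit platform.
    int value = static_cast<unsigned char>(max);
    std::cout << value * 3 << '\n';  // prints 600
    return 0;
}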