While playing around I came across something that seems strange to me:
The following isn't valid Java code:
char x = 'A';
x = x + 1; //possible loss of precision
because one of the operands is an int, so the other operand is promoted to int as well. The resulting int cannot be assigned back to a char variable without an explicit cast ... while
char x = 'A';
x += 1;
is valid, because the resulting int is automatically cast back to char.
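The difference can be seen in a minimal sketch: the Java Language Specification defines `x += 1` as equivalent to `x = (char) (x + 1)`, with the narrowing cast inserted implicitly. The class name `CompoundAssign` here is just illustrative.

```java
public class CompoundAssign {
    public static void main(String[] args) {
        char x = 'A';
        x += 1;                 // compiles: treated as x = (char) (x + 1)
        System.out.println(x);  // prints B, since 'A' is 65 and 66 is 'B'
        // x = x + 1;           // would NOT compile: int result cannot be
        //                      // assigned to char without an explicit cast
    }
}
```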
So far so good. This seems clear to me but ... why is the following valid Java code?
char x;
x = 'A' + 1;
Because
'A' + 1
is a constant expression. It is known at compile time that the result will fit in a char.
Whereas
'A' + 787282
will not fit in a char and will therefore cause a compilation error.
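A quick sketch of both cases (the class name `ConstantNarrowing` is illustrative): the compiler evaluates the constant expression, checks whether the value fits in the char range 0..65535, and only then permits the implicit narrowing.

```java
public class ConstantNarrowing {
    public static void main(String[] args) {
        char x = 'A' + 1;          // compiles: constant 66 fits in char
        System.out.println(x);     // prints B
        // char y = 'A' + 787282;  // does NOT compile: constant 787347
        //                         // is outside the char range 0..65535
    }
}
```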
It is valid because it is a compile time constant expression. Had it been
char x;
char y = 'A';
x = y + 1;
The compiler will give you a compile-time error, because now it is not a compile-time constant expression. But if you make the variable y final, the expression again becomes a compile-time constant, so the code below will compile.
char x;
final char y = 'A';
x = y + 1;
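Both versions side by side, as a runnable sketch (the class name `FinalConstant` is illustrative): a final char initialized with a constant is itself a constant variable, so `y + 1` is evaluated at compile time.

```java
public class FinalConstant {
    public static void main(String[] args) {
        final char y = 'A';     // constant variable: final + constant initializer
        char x = y + 1;         // compiles: y + 1 is a constant expression (66)
        System.out.println(x);  // prints B

        char z = 'A';           // not final, so not a constant variable
        // char w = z + 1;      // does NOT compile: z + 1 is an int expression
    }
}
```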
The moral of the story is that when you assign an int to a char, the compiler will allow it as long as it is a compile-time constant expression and its value fits in the range of char.
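The same rule applies to plain int literals, not just char arithmetic. A minimal sketch (class name `IntToChar` is illustrative):

```java
public class IntToChar {
    public static void main(String[] args) {
        char a = 65;            // compiles: constant 65 fits in char (it is 'A')
        System.out.println(a);  // prints A
        // char b = 70000;      // does NOT compile: constant exceeds 65535
        int n = 65;
        // char c = n;          // does NOT compile: n is not a compile-time constant
    }
}
```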