Why does the following code not generate an error?
System.out.println((char) 2147483647);
According to Oracle's data types documentation, the maximum value for a char is 65,535:
- char: The char data type is a single 16-bit Unicode character. It has a minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or 65,535 inclusive).
Why can I create a char beyond the maximum value?
2147483647 is not a char but an int.
You're not assigning an invalid value to a char; you're casting a valid int to char, so the Narrowing Primitive Conversion rules apply (see the Java Language Specification, §5.1.3). In short, you keep the 16 lowest-order bits of the original integer: "A narrowing conversion of a signed integer to an integral type T simply discards all but the n lowest order bits, where n is the number of bits used to represent type T."
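The bit-discarding described above can be sketched directly in Java; the class name NarrowingDemo is mine, chosen for illustration. Casting the int to char and masking with 0xFFFF produce the same 16-bit result:

```java
public class NarrowingDemo {
    public static void main(String[] args) {
        int i = 2147483647;           // 0x7FFF_FFFF, Integer.MAX_VALUE

        // The cast discards all but the 16 lowest-order bits: 0xFFFF.
        char c = (char) i;
        System.out.println((int) c);  // prints 65535

        // Equivalent bit-level view: mask off everything above bit 15.
        int low16 = i & 0xFFFF;
        System.out.println(low16);    // prints 65535
    }
}
```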
Why does the following code not generate an error?
Because it's not an error, it's a well-defined behavior.
You do a narrowing conversion from int to char, which is allowed; see the Java Language Specification, 5.1.3. Narrowing Primitive Conversion:

A narrowing conversion of a signed integer to an integral type T simply discards all but the n lowest order bits, where n is the number of bits used to represent type T. In addition to a possible loss of information about the magnitude of the numeric value, this may cause the sign of the resulting value to differ from the sign of the input value.
The resulting char isn't larger than Character.MAX_VALUE. The compiler converts (char) 2147483647 to 65535.
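The possible sign change mentioned in the quoted spec text can be demonstrated too; SignDemo is a name of my choosing. Because char is an unsigned 16-bit type, casting a negative int to char and then widening it back does not recover the original value:

```java
public class SignDemo {
    public static void main(String[] args) {
        // -1 is 0xFFFF_FFFF in two's complement; the 16 lowest-order bits
        // are 0xFFFF, so the resulting char is '\uffff' -- the sign flips.
        char c = (char) -1;
        System.out.println((int) c);  // prints 65535

        // Widening char back to int never sign-extends (char is unsigned),
        // so the original -1 is lost.
        int back = c;
        System.out.println(back);     // prints 65535
    }
}
```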