You cannot convert from int to char, so this would be illegal:
int i = 88;
char c = i;
However, this is allowed:
char c = 88;
Isn't a plain number an int literal? How is this allowed?
In Java, a char can be converted to an int value in the following ways: implicit type casting (which gives the character's ASCII/Unicode value) and Character.getNumericValue().
We can also use the getNumericValue() method of the Character class to convert a char variable into an int. The getNumericValue() method returns the numeric value of the character: the character '5' is converted into the integer 5, and the character '9' into the integer 9.
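A minimal sketch showing both approaches (variable names are illustrative):
char digit = '5';
int codeUnit = digit;                           // implicit widening: 53, the character's code
int numeric = Character.getNumericValue(digit); // 5, the digit's numeric value
System.out.println(codeUnit + " " + numeric);   // prints "53 5"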
In Java, char and int are compatible types, so you can simply add them with the + operator. Given char c = 'c'; and int x = 10;, the expression c + x results in an int, so you need an explicit cast to assign it back to your char variable.
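A short sketch of that cast (the values here are just an example):
char c = 'c';
int x = 10;
char shifted = (char) (c + x); // c + x is an int (109); the cast narrows it back to char
System.out.println(shifted);   // prints 'm'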
char is effectively an unsigned 16-bit integer type in Java.
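A quick illustration of that range (an assumed snippet, not from the original answer):
char max = Character.MAX_VALUE; // '\uFFFF'
System.out.println((int) max);  // 65535, i.e. 2^16 - 1
// char negative = -1;          // compile-time error: -1 is outside the char range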
As with other integral types, an assignment conversion from an integer constant is allowed so long as the value fits in the target type's range. That's why
byte b = 10;
works too.
From the JLS, section 5.2:
In addition, if the expression is a constant expression (§15.28) of type byte, short, char, or int:
- A narrowing primitive conversion may be used if the type of the variable is byte, short, or char, and the value of the constant expression is representable in the type of the variable.
- A narrowing primitive conversion followed by a boxing conversion may be used if the type of the variable is:
- Byte and the value of the constant expression is representable in the type byte.
- Short and the value of the constant expression is representable in the type short.
- Character and the value of the constant expression is representable in the type char.
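The following sketch illustrates those rules (values chosen arbitrarily):
// Constant expressions that fit: implicit narrowing is allowed
byte b = 10;          // the int literal 10 is representable in byte
char c = 88;          // the int literal 88 is representable in char ('X')
Character boxed = 65; // narrowing to char, then boxing to Character ('A')

// Not a constant expression, or out of range: compile-time error without a cast
int i = 88;
// char d = i;        // error: i is a variable, not a constant expression
// byte big = 200;    // error: 200 is not representable in byte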
Actually, converting from int to char is legal; it just requires an explicit cast because it can potentially lose data:
int i = 88;
char c = (char) i;
However, with the literal, the compiler knows whether it will fit into a char without losing data and only complains when you use a literal that is too big to fit into a char:
char c = 70000; // compiler error
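For completeness, a small sketch of the data loss an explicit cast can cause (70000 is just an example value above the char range):
int i = 70000;               // does not fit in char (maximum is 65535)
char c = (char) i;           // explicit cast keeps only the low 16 bits
System.out.println((int) c); // prints 4464 (70000 - 65536)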