
Integer arithmetic in Java with char and integer literal

Can someone explain to me why the following code compiles OK in Java?

char c = 'a' + 10; 

Why is this not equivalent to the following, which does not compile?

int i = 10; char c = 'a' + i; 

The Java Language Specification (section 3.10.1) states "An integer literal is of type long if it is suffixed with an ASCII letter L or l (ell); otherwise it is of type int (§4.2.1)." Section 4.2.2 refers to "The numerical operators, which result in a value of type int or long." So the result of the addition should, in my understanding, be an int, which cannot be assigned to the char variable c.

However, it compiles fine (at least in Sun JDK 1.6.0 release 17 and in Eclipse Helios).

Rather an artificial example perhaps, but it is used in an introductory Java course I have been teaching, and it now occurs to me that I don't really understand why it works.
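To make the contrast concrete, here is a minimal sketch (the class name is mine); the commented-out line is the one that fails, and an explicit cast is the usual workaround:

```java
public class CharPlusInt {
    public static void main(String[] args) {
        char c1 = 'a' + 10;          // compiles: a constant expression that fits in char
        int i = 10;
        // char c2 = 'a' + i;        // error: possible lossy conversion from int to char
        char c2 = (char) ('a' + i);  // the explicit cast makes it compile
        System.out.println(c1);      // prints k
        System.out.println(c2);      // prints k
    }
}
```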

asked Sep 09 '10 by Ben

People also ask

What is integer arithmetic in Java?

The Java virtual machine offers bytecodes that perform integer arithmetic operations on ints and longs. Values of type byte, short, and char are converted to int before they take part in arithmetic operations.
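A short sketch of that promotion (the class name is mine): even when both operands are char, the result of + is an int:

```java
public class Promotion {
    public static void main(String[] args) {
        char a = 'a';            // 97
        char b = 'b';            // 98
        int sum = a + b;         // both chars are promoted to int before the addition
        // char bad = a + b;     // error: the int result cannot be assigned to char
        System.out.println(sum); // prints 195
    }
}
```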

Can we add char and int in Java?

In Java, char and int are compatible types, so you can simply add them with the + operator. Given char c = 'c'; int x = 10;, the expression c + x results in an int, so you need an explicit cast to assign it back to your char variable.
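For example (the class and variable names are mine):

```java
public class CharIntAdd {
    public static void main(String[] args) {
        char c = 'c';                // 'c' is code point 99
        int x = 10;
        char back = (char) (c + x);  // c + x is an int (109); the cast narrows it back to char
        System.out.println(back);    // prints m
    }
}
```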

Can a char represent an integer?

char in Java is defined to be a UTF-16 code unit, which means it is a 2-byte value between 0 and 65535. This can easily be interpreted as an integer (the math concept, not the int type). char in C is defined to be a 1-byte character between 0 and 255.

Can you perform arithmetic operations on char?

Because you can treat them as numeric values as well as characters, you can perform arithmetic operations on them. For example: char number = 40;. The initializing value must be within the range of values the variable can store. Note that this answer describes C, where char is one byte and may be signed, in which case the value must be between -128 and 127; in Java, char is 2 bytes and unsigned.


2 Answers

It is because the compiler can check that 'a' + 10 is within the bounds of a char, whereas it cannot (in general) check that 'a' + <an integer> is within those bounds. A constant expression whose value fits the target type may be narrowed implicitly on assignment (JLS §5.2).
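One way to see this rule in action (the class and variable names are mine): declaring the int final with a constant initializer makes the whole expression a compile-time constant again, and it compiles:

```java
public class ConstantNarrowing {
    public static void main(String[] args) {
        final int ten = 10;     // a compile-time constant
        char c = 'a' + ten;     // compiles: the whole expression is constant and fits in char
        int i = 10;             // not a constant
        // char d = 'a' + i;    // error: the compiler cannot prove the result fits
        System.out.println(c);  // prints k
    }
}
```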

answered Oct 12 '22 by einarmagnus


'a' + 10 is a compile-time constant expression with the value 'k', which can initialise a variable of type char. This is the same rule that lets you assign an integer literal in [-128, 127] to a byte variable. A value in [128, 255] is more annoying: it does not fit in Java's signed byte, so it requires an explicit cast.
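The byte analogy can be sketched like this (the class and variable names are mine):

```java
public class ByteNarrowing {
    public static void main(String[] args) {
        byte ok = 100;              // constant in [-128, 127]: implicit narrowing is allowed
        // byte bad = 200;          // error: 200 is outside the range of byte
        byte forced = (byte) 200;   // the cast reinterprets the bits: 200 - 256 = -56
        System.out.println(ok);     // prints 100
        System.out.println(forced); // prints -56
    }
}
```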

answered Oct 12 '22 by Tom Hawtin - tackline