Consider this code:
public class ShortDivision {
    public static void main(String[] args) {
        short i = 2;
        short j = 1;
        short k = i/j;
    }
}
Compiling this produces the error:
ShortDivision.java:5: possible loss of precision
found : int
required: short
short k = i/j;
because the type of the expression i/j is apparently int and hence must be cast to short. Why is the type of i/j not short?
When dividing two integers, Java uses integer division. In integer division the result is also an integer: it is truncated toward zero (the fractional part is thrown away), not rounded to the closest integer.
Java's integer division is essentially ordinary real division with the remainder (the fractional part) thrown away. Thus 7 / 3 is 2 with a remainder of 1; throw away the remainder and the result is 2. Integer division can come in very handy.
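A minimal sketch illustrating the truncation (the class name and the println calls are illustrative, not part of the original question):

public class IntegerDivisionDemo {
    public static void main(String[] args) {
        System.out.println(7 / 3);   // prints 2: the remainder of 1 is discarded
        System.out.println(-7 / 3);  // prints -2: truncation is toward zero, not flooring
        System.out.println(7 % 3);   // prints 1: the % operator recovers the discarded remainder
    }
}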
What should the result be when adding an int and a short? When multiplying two shorts? The result will in general fit into an int, but if the language truncated it back to short, most multiplications would overflow silently, as the sketch below shows.
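A minimal sketch of that silent failure, with hypothetical values chosen so the product exceeds the short range:

public class ShortOverflowDemo {
    public static void main(String[] args) {
        short a = 300;
        short b = 300;
        int product = a * b;               // operands promoted to int: 90000, correct
        short truncated = (short) (a * b); // forced into 16 bits: 90000 becomes 24464
        System.out.println(product);       // 90000
        System.out.println(truncated);     // 24464, with no warning at runtime
    }
}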
There are two types of division in Java: integer division and floating-point division. Both use the forward slash (/) as the operator, in the form dividend / divisor.
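A short sketch of the contrast (the class name is illustrative): which kind of division applies is decided entirely by the operand types, not by the operator.

public class DivisionKinds {
    public static void main(String[] args) {
        System.out.println(7 / 2);    // 3   : both operands are int, so integer division
        System.out.println(7.0 / 2);  // 3.5 : one double operand forces floating-point division
        System.out.println(7 / 2.0);  // 3.5 : promotion works on either side
    }
}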
From the Java spec:
5.6.2 Binary Numeric Promotion
When an operator applies binary numeric promotion to a pair of operands, each of which must denote a value of a numeric type, the following rules apply, in order, using widening conversion (§5.1.2) to convert operands as necessary:
If either operand is of type double, the other is converted to double.
Otherwise, if either operand is of type float, the other is converted to float.
Otherwise, if either operand is of type long, the other is converted to long.
Otherwise, both operands are converted to type int.
For binary operations, small integer types are promoted to int, and the result of the operation is int.
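So the usual fix for the original program is an explicit cast back down to short. A minimal sketch (only the cast and the println are added to the question's code):

public class ShortDivision {
    public static void main(String[] args) {
        short i = 2;
        short j = 1;
        // i and j are promoted to int before the division, so the
        // type of i/j is int and must be narrowed explicitly.
        short k = (short) (i / j);
        System.out.println(k); // 2
    }
}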
EDIT: Why is it like that? The short answer is that Java copied this behavior from C. A longer answer might have to do with the fact that all modern machines do at least 32-bit native computations, and it might actually be harder for some machines to do 8-bit and 16-bit operations.
See also: OR-ing bytes in C# gives int