This feels like a stupid question, but I can't find the answer anywhere in the Java documentation. If I declare two ints and then divide them, what exactly is happening? Are they converted to floats/doubles first, divided, then cast back to an integer, or is the division "done" as integers?
Also, purely from experimentation, integer division seems to round the answer towards zero (i.e. 3 / 2 = 1 and -3 / 2 = -1). Am I right in believing this?
When one of the operands to a division is a double and the other is an int, Java implicitly (i.e. behind your back) casts the int operand to a double. So an expression such as 7 / 3.0 is performed as the real division 7.0 / 3.0.
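A minimal sketch (the class name MixedDivision is just for illustration) showing the difference between mixed and all-int division:

```java
public class MixedDivision {
    public static void main(String[] args) {
        int a = 7;
        double b = 3.0;

        System.out.println(a / b);          // 2.3333333333333335 (int promoted to double, real division)
        System.out.println(7 / 3);          // 2 (both ints: integer division, fraction discarded)
        System.out.println((double) 7 / 3); // 2.3333333333333335 (explicit cast forces promotion)
    }
}
```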
When dividing two integers, Java uses integer division. In integer division, the result is also an integer. The result is truncated (the fractional part is thrown away) and not rounded to the closest integer.
Note that in Java, dividing two integers truncates toward zero; it is not floor division, which rounds toward negative infinity and therefore gives a different result for negative quotients.
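As a quick illustration (the class name is arbitrary; Math.floorDiv has been part of the standard library since Java 8), truncation and floor division agree for positive results but differ for negative ones:

```java
public class DivisionRounding {
    public static void main(String[] args) {
        System.out.println(3 / 2);                // 1
        System.out.println(-3 / 2);               // -1 (truncated toward zero)
        System.out.println(Math.floorDiv(-3, 2)); // -2 (floor division, rounds toward negative infinity)
    }
}
```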
They are divided using integer arithmetic. Dividing integer a by integer b tells you how many times b fits into a, and a % b gives you the remainder of that division, so (a / b) * b + a % b == a.
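A small sketch (class name chosen purely for illustration) checking that identity for a few sign combinations:

```java
public class RemainderIdentity {
    public static void main(String[] args) {
        int[][] pairs = { {7, 3}, {-7, 3}, {7, -3}, {-7, -3} };
        for (int[] p : pairs) {
            int a = p[0], b = p[1];
            // (a / b) * b + a % b always reconstructs a, whatever the signs
            System.out.printf("a=%d b=%d  a/b=%d  a%%b=%d  identity holds: %b%n",
                    a, b, a / b, a % b, (a / b) * b + a % b == a);
        }
    }
}
```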