 

Why 0.1 + 0.1 == 0.2?

This question concerns Java. From what I've understood, 0.1 cannot be represented exactly in Java because of its binary floating-point representation. That makes

0.1 + 0.1 + 0.1 == 0.3

false. However, why does

0.1 + 0.1 == 0.2

give true?

Kezer asked Nov 25 '17


1 Answer

0.1 cannot be represented exactly in Java because of its binary floating-point representation. That makes

0.1 + 0.1 + 0.1 == 0.3

false.

That is not the entire reason the equality is false, although it is part of it. The double 0.3 is not exactly 3/10 either. It so happens that the double 0.2 is exactly twice the double 0.1 (although they are not respectively 2/10 and 1/10), so adding 0.1 to itself produces exactly the value you get when you write the constant 0.2. On the other hand, the result of the operations 0.1 + 0.1 + 0.1 is slightly different from the approximation of 3/10 that you get when you write the constant 0.3.
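You can see this directly from Java by printing the exact decimal expansion of each double with the BigDecimal(double) constructor, which converts the argument without any rounding. A minimal sketch (the class name is mine):

    import java.math.BigDecimal;

    public class ExactValues {
        public static void main(String[] args) {
            // new BigDecimal(double) shows the exact value stored in the double
            System.out.println(new BigDecimal(0.1));
            // prints 0.1000000000000000055511151231257827021181583404541015625

            // These two print the same digits: the sum lands exactly on the
            // double denoted by the constant 0.2
            System.out.println(new BigDecimal(0.2));
            System.out.println(new BigDecimal(0.1 + 0.1));

            // These two differ in the last few digits, so == is false
            System.out.println(new BigDecimal(0.3));
            System.out.println(new BigDecimal(0.1 + 0.1 + 0.1));
        }
    }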

If we were using decimal with 5 significant digits, you might be surprised that 1 / 3 * 3 == 1 does not hold (1 / 3 would compute as 0.33333 and that times 3 would compute as 0.99999, which is different from 1), whereas 1 / 4 * 4 == 1 does hold (1 / 4 would compute as 0.25, and that times 4 would compute as 1).
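You can reproduce that 5-significant-digit decimal arithmetic in Java with BigDecimal and a MathContext. A sketch (the class and variable names are mine):

    import java.math.BigDecimal;
    import java.math.MathContext;

    public class FiveDigits {
        public static void main(String[] args) {
            MathContext mc = new MathContext(5); // 5 significant decimal digits

            BigDecimal third = BigDecimal.ONE.divide(new BigDecimal(3), mc);
            System.out.println(third);                                  // 0.33333
            System.out.println(third.multiply(new BigDecimal(3), mc));  // 0.99999, not 1

            BigDecimal quarter = BigDecimal.ONE.divide(new BigDecimal(4), mc);
            System.out.println(quarter);                                 // 0.25
            System.out.println(quarter.multiply(new BigDecimal(4), mc)); // 1.00, numerically 1
        }
    }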

Your question is somewhat similar to this decimal example, but for base-2 computations. Every constant and every operation is an opportunity for an approximation. Sometimes the approximations do not happen, and sometimes they happen but cancel out, so that the end result is more accurate than you had a right to expect. In the case of 0.1 + 0.1, the result is not 2/10, but it is the same approximation of 2/10 that you get when you write 0.2, so the equality holds. With 0.1 + 0.1 + 0.1 we happen not to be so lucky.
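Finally, you can confirm the equality tests and inspect the underlying bit patterns with Double.toHexString. A sketch (the class name is mine):

    public class BitPatterns {
        public static void main(String[] args) {
            System.out.println(0.1 + 0.1 == 0.2);        // true
            System.out.println(0.1 + 0.1 + 0.1 == 0.3);  // false

            // Identical bit patterns: the sum is exactly the double 0.2
            System.out.println(Double.toHexString(0.2));        // 0x1.999999999999ap-3
            System.out.println(Double.toHexString(0.1 + 0.1));  // 0x1.999999999999ap-3

            // The triple sum is one unit in the last place above the double 0.3
            System.out.println(Double.toHexString(0.3));              // 0x1.3333333333333p-2
            System.out.println(Double.toHexString(0.1 + 0.1 + 0.1)); // 0x1.3333333333334p-2
        }
    }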

Pascal Cuoq answered Oct 03 '22