Consider the following in C# and Java:
double d = 3 / 2 * 3.2;
Java:

System.out.println(d); // 3.2

C#:

Console.WriteLine(d); // 3.2
It seems to skip the 3 / 2; we know the correct answer should be 4.8.
If I change it to

double d = 3.00 / 2 * 3.2;

I get 4.8.
So I want to ask: if (3 / 2 * 3.2) is illegal, why do Eclipse and VS2008 show no error? And how can I prevent this problem in both C# and Java?
3 / 2 is integer division, whose result is 1. 1 * 3.2 equals 3.2, which is the result you receive.

This is a proper, well-defined expression with a well-defined result, hence no error from the compiler. Use 3.00; it's the most straightforward way to force the compiler to use floating-point math.

From Thorbjørn's answer: another option is to standardize all your floating-point calculations by starting them with 1.0 *; in your example that would be 1.0 * 3 / 2 * 3.2.
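For concreteness, here is a minimal Java sketch of both fixes (the class name DivisionDemo is only illustrative; the same expressions work unchanged in C# with Console.WriteLine):

public class DivisionDemo {
    public static void main(String[] args) {
        double broken      = 3 / 2 * 3.2;       // 3 / 2 is integer division (1), so this is 3.2
        double withLiteral = 3.00 / 2 * 3.2;    // 3.00 is a double literal, forcing floating-point division
        double withPrefix  = 1.0 * 3 / 2 * 3.2; // 1.0 * 3 is 3.0, so every later step is floating point
        System.out.println(broken);      // 3.2
        System.out.println(withLiteral); // ~4.8 (may print 4.800000000000001 due to binary rounding)
        System.out.println(withPrefix);  // same value as withLiteral
    }
}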
3 / 2 is treated as integer division, so the result comes out to be 1. Then, multiplying 1 by 3.2 causes the integer 1 to be promoted to the floating-point value 1.0, resulting in 3.2.
The idea is:
// Both "3" and "2" are integers, so integer division is performed.
3 / 2 == 1
// "3.2" is a floating point value, while "1" is an integer, so "1" is
// promoted to an floating point value.
1 * 3.2 --> 1.0 * 3.2 == 3.2
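The same steps can be made concrete in runnable Java; these lines are only illustrative and can be dropped into any main method:

int quotient = 3 / 2;            // integer division: quotient == 1
double product = quotient * 3.2; // quotient is promoted to 1.0 before the multiplication
System.out.println(quotient);    // 1
System.out.println(product);     // 3.2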
When you write 2.0, the decimal point tells Java to treat the literal as a floating-point value (in this case, a double), so the result of the calculation is 4.8 as expected. Without the decimal point, the value is an integer literal. It is not an error, but an issue with how the literals are handled.
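Beyond changing the literal, an explicit cast also works; this is an extra option not mentioned in the answers above. The snippet is Java, and the same expressions compile unchanged in C# (with Console.WriteLine for output):

double viaLiteral = 3.0 / 2 * 3.2;        // the decimal point makes 3.0 a double literal
double viaCast    = (double) 3 / 2 * 3.2; // the cast applies to 3 first, so the division is 3.0 / 2 == 1.5
System.out.println(viaLiteral); // ~4.8
System.out.println(viaCast);    // same value as viaLiteral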