Why is Java inconsistent in comparing -0.0 and +0.0? What is the Java standard method for comparing numbers to account for -0/+0?
I have encountered this particular bugaboo:
public class ZeroCompare {
    public static void main(String[] args) {
        if ( 0.0 == -0.0 ) {
            System.out.println("== --> same");
        } else {
            System.out.println("== --> different");
        }
        if ( new Double(0.0).equals( -0.0 ) ) {
            System.out.println("equals --> same");
        } else {
            System.out.println("equals --> different");
        }
    }
}
It prints the following:
== --> same
equals --> different
I strongly dislike that the way you compare these two values changes the outcome, and I'd love an explanation.
This behaviour is actually documented, in the Javadoc for Double.equals:
If d1 represents +0.0 while d2 represents -0.0, or vice versa, the equal test has the value false, even though +0.0==-0.0 has the value true. This definition allows hash tables to operate properly.
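In practice, the "standard" way to distinguish (or deliberately ignore) the two zeros is: use the primitive == when you want IEEE 754 numeric equality, and use Double.compare (or Double.compareTo) when you want the same bit-level view that equals and hashCode use, which orders -0.0 below +0.0. A minimal sketch illustrating the difference (class name ZeroDetails is just for illustration):

public class ZeroDetails {
    public static void main(String[] args) {
        // Primitive comparison follows IEEE 754: the two zeros are numerically equal.
        System.out.println(0.0 == -0.0);                     // true

        // Double.compare distinguishes them: positive result, since
        // +0.0 is considered greater than -0.0.
        System.out.println(Double.compare(0.0, -0.0));       // 1

        // The underlying reason: the raw bit patterns differ, and
        // equals()/hashCode() are defined in terms of these bits.
        System.out.println(Double.doubleToLongBits(0.0));    // 0
        System.out.println(Double.doubleToLongBits(-0.0));   // -9223372036854775808
    }
}

That bit-level difference is also what the "hash tables" remark in the documentation is about: Double's hash code is derived from doubleToLongBits, so +0.0 and -0.0 hash differently, and equals has to be consistent with that for HashMap and HashSet to behave correctly.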