There is this JavaScript function that I'm trying to rewrite in Java:
function normalizeHash(encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000;
    }
    return encondindRound2 % 1E6;
}
My Java adaptation:
public long normalizeHash(long encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (((int) encondindRound2) & 0x7fffffff) + 0x80000000;
    }
    return (((int) encondindRound2) % 1_000_000);
}
When I pass -1954896768, the JavaScript version returns 70528, while Java returns -896768. I'm not sure why. The difference seems to start inside the if block: in the JavaScript function, after the if, encondindRound2 = 2340070528, while in Java encondindRound2 = -1954896768.
I made these repls to show it online:
Javascript: https://repl.it/repls/NumbGuiltyHack
Java: https://repl.it/repls/ClumsyQualifiedProblem
EDIT: Changing the Java function to this
public long normalizeHash(long encondindRound2) {
    if (encondindRound2 < 0) {
        encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000;
    }
    return (encondindRound2 % 1_000_000);
}
doesn't seem to affect the result; it's still -896768.
In Java, the literal 0x80000000 is an int, and that value is outside the range of a signed 32-bit int, so it wraps around to -2147483648. In JavaScript, 0x80000000 is well inside the range of a 64-bit double, so it remains 2147483648. Obviously, adding -2147483648 vs. adding 2147483648 results in a very large discrepancy.
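You can see the wrap directly by printing the two literals; this is just an illustrative snippet (the class name is made up):

public class LiteralDemo {
    public static void main(String[] args) {
        // The int literal 0x80000000 already wraps to Integer.MIN_VALUE.
        System.out.println(0x80000000);   // -2147483648
        // The long literal 0x80000000L keeps the positive value.
        System.out.println(0x80000000L);  // 2147483648
    }
}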
You can either use a long literal, 0x80000000L, in Java, or coerce your JS number into a 32-bit int with (0x80000000|0), depending on which behavior you want.
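For example, a minimal sketch of your Java method with only the literal changed (everything else kept as in your EDIT):

public long normalizeHash(long encondindRound2) {
    if (encondindRound2 < 0) {
        // 0x80000000L is a long literal, so it stays 2147483648 instead of wrapping to -2147483648.
        encondindRound2 = (encondindRound2 & 0x7fffffff) + 0x80000000L;
    }
    return encondindRound2 % 1_000_000;
}

With this change, normalizeHash(-1954896768L) returns 70528, matching the JavaScript result.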