Is there a way to correctly multiply two 32-bit integers in JavaScript?

When I try this in C using long long, I get the expected 64-bit result:

printf("0x%llx * %d = %llx\n", 0x4d98ee96ULL, 1812433253,
      0x4d98ee96ULL * 1812433253);
==> 0x4d98ee96 * 1812433253 = 20becd7b431e672e

But in JavaScript the result is different:

x = 0x4d98ee97 * 1812433253;
print("0x4d98ee97 * 1812433253 = " + x.toString(16));
==> 0x4d98ee97 * 1812433253 = 20becd7baf25f000

The trailing zeros lead me to suspect that JavaScript's integer precision is limited to somewhere between 32 and 64 bits.
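
Indeed, JavaScript numbers are IEEE-754 doubles, so integers stay exact only up to 2^53. A quick check in the same shell shows both the limit and the rounding above:

print(Math.pow(2, 53) == Math.pow(2, 53) + 1);  // true: 2^53 + 1 collides with 2^53
print((0x20becd7baf25f093).toString(16));       // 20becd7baf25f000, the rounded product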

Is there a way to get a correct answer? (I'm using Mozilla js-1.8.5 on x86_64 Fedora 15 in case that matters.)

asked Jun 03 '11 by antimeme

1 Answer

This seems to do what I wanted without an external dependency:

function multiply_uint32(a, b) {
    // Split each operand into 16-bit halves so that every partial
    // product fits exactly in a double.
    var ah = (a >>> 16) & 0xffff, al = a & 0xffff;
    var bh = (b >>> 16) & 0xffff, bl = b & 0xffff;
    // Only the low 16 bits of the cross terms survive modulo 2^32.
    var high = ((ah * bl) + (al * bh)) & 0xffff;
    // The final zero-shift reduces the whole sum modulo 2^32 and keeps
    // it unsigned; al * bl alone can push the total past 2^32.
    return ((high << 16) + (al * bl)) >>> 0;
}
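
For example, with the values from the question:

var lo = multiply_uint32(0x4d98ee97, 1812433253);
print(lo.toString(16));    // af25f093, the bottom half of 0x20becd7baf25f093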

This performs a 32-bit multiply modulo 2^32, which is the correct bottom half of the full 64-bit product. A similar function could calculate the correct top half and store it in a separate integer, though ah * bh alone is not enough: the carries out of the cross terms and the low product also land in the top 32 bits. I don't happen to need that here.
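
Here is a sketch of such a top-half function using the same 16-bit split; the name multiply_uint32_high is just illustrative:

function multiply_uint32_high(a, b) {
    var ah = (a >>> 16) & 0xffff, al = a & 0xffff;
    var bh = (b >>> 16) & 0xffff, bl = b & 0xffff;
    // Middle column at the 2^16 place: both cross terms plus the carry
    // out of the low product. The sum stays far below 2^53, so the
    // double arithmetic is exact.
    var mid = (ah * bl) + (al * bh) + ((al * bl) >>> 16);
    // Top half: ah * bh plus whatever the middle column carries past 2^16.
    // Math.floor is needed because mid itself can exceed 32 bits.
    return (ah * bh) + Math.floor(mid / 0x10000);
}

With the inputs above it returns 0x20becd7b, matching the top half of the exact product.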

Note the zero-shift at the end. Without it, high << 16 is treated as a signed 32-bit value, so the function would return negative results whenever the high bit is set.

answered by antimeme