
JavaScript: Is This Truly Signed Integer Division?

Given the following code, where both a and b are Numbers whose values are within the range of 32-bit signed integers:

var quotient = ((a|0) / (b|0))|0;

and assuming that the runtime is in full compliance with the ECMAScript 6 specification, will the value of quotient always be the correct signed integer division of a and b? In other words, is this a proper way to achieve true signed integer division in JavaScript, equivalent to the machine instruction?
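For concreteness, here is the idiom in question wrapped in a helper on a few sample inputs (the name `idiv` is mine, not part of the question):

```javascript
// The idiom under discussion, wrapped in a helper for illustration.
function idiv(a, b) {
  return ((a | 0) / (b | 0)) | 0;
}

console.log(idiv(7, 2));   // 3
console.log(idiv(-7, 2));  // -3 (truncates toward zero)
console.log(idiv(7, -2));  // -3
```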

asked Jul 11 '15 by lcmylin


1 Answer

I'm no expert on floating-point numbers, but Wikipedia says that doubles have a 52-bit mantissa, giving 53 bits of precision. Logically, that seems like enough to represent every 32-bit integer exactly, and therefore enough to reliably approximate integer division of 32-bit integers.

Dividing the minimum and maximum 32-bit signed ints, -2147483648 / 2147483647, produces -1.0000000004656613, which is still a reasonable amount of significant digits. The same goes for its inverse, 2147483647 / -2147483648, which produces -0.9999999995343387.
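A quick sketch of those extreme quotients and what `|0` then makes of them (the constant names are mine):

```javascript
// Extreme 32-bit signed values and their floating-point quotients.
var INT_MIN = -2147483648;
var INT_MAX = 2147483647;

console.log(INT_MIN / INT_MAX);        // -1.0000000004656613
console.log((INT_MIN / INT_MAX) | 0);  // -1 (truncated toward zero)
console.log(INT_MAX / INT_MIN);        // -0.9999999995343387
console.log((INT_MAX / INT_MIN) | 0);  // 0 (truncated toward zero)
```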

An exception is division by zero, which I mentioned in a comment. As the linked SO question states, integer division by zero normally throws some sort of error, whereas floating-point coercion results in (1 / 0) | 0 == 0.
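A sketch of that behavior: floating-point division by zero produces Infinity (or NaN for 0 / 0), and the subsequent `|0` conversion maps all of those to 0 rather than throwing.

```javascript
// Division by zero under float-then-|0: no exception, just 0.
console.log(1 / 0);          // Infinity
console.log((1 / 0) | 0);    // 0
console.log((-1 / 0) | 0);   // 0 (from -Infinity)
console.log((0 / 0) | 0);    // 0 (from NaN)
```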

Update: According to another SO answer, integer division in C truncates towards zero, which is what |0 does in JavaScript. In addition, division by 0 is undefined, so JavaScript is technically not incorrect in returning zero. Unless I've missed anything else, the answer to the original question should be yes.
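A sketch of the truncation point, contrasting `|0` with `Math.floor` (which rounds toward negative infinity and so does not match C):

```javascript
// |0 truncates toward zero, matching C's integer division.
console.log((-7 / 2) | 0);        // -3 (truncation, as in C)
console.log(Math.floor(-7 / 2));  // -4 (flooring, not what C does)
console.log(Math.trunc(-7 / 2));  // -3 (ES6 truncation, without the 32-bit wrap of |0)
```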

Update 2: Relevant sections of the ECMAScript 6 spec: the algorithm for applying the division operator, and the ToInt32 abstract operation, which is what |0 performs.
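A sketch of what ToInt32 does at the 32-bit boundary (the sample values are my own picks): the result is reduced modulo 2^32 into the range [-2^31, 2^31).

```javascript
// ToInt32 (what |0 performs): wrap modulo 2^32 into signed 32-bit range.
console.log(2147483648 | 0);  // -2147483648 (2^31 wraps to INT_MIN)
console.log(4294967296 | 0);  // 0 (2^32 wraps to 0)
```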

answered Oct 06 '22 by gengkev