 

The probability that a randomly generated floating point number is in a certain range

Let's say I randomly generate a number and check whether it is in a certain range. For integers, it's simple: for example, with an unsigned 8-bit integer, the randomly generated number falls in the range 0 to 5 (inclusive) with probability 6/2^8.

My question is: how can I calculate the same thing for a floating point number? For example, if I just randomly generate 32 bits and interpret them as a float, what is the probability that the number falls between -10.0 and 10.0?

asked Oct 21 '22 by prosseek

1 Answer

Assuming a binary representation, the probability can be computed directly for ranges of the form [2^n, 2^(n+1)): each such binade corresponds to one value of the sign bit and one value of the exponent field.
For example, if the exponent is encoded on 11 bits, the probability of landing in one such binade is 1/2^12 (taking the sign bit into account).

Inside such an interval, the representable floating point values can be considered uniformly distributed (they are evenly spaced).

Then I guess you could try to decompose your interval along such power-of-2 boundaries,
compute the probability of each sub-interval, and sum them all...
Assuming an IEEE-754-like denormal representation, the interval for the smallest possible exponent e is [0, 2^e).
So this should give you a rather simple procedure, but I see no simple formula.
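A minimal sketch of that procedure, assuming the question's 32-bit case (IEEE-754 binary32, 8-bit exponent, so each sign/exponent combination covers 2^23/2^32 = 1/2^9 of the bit patterns; the answer's 1/2^12 example was for an 11-bit exponent). The function name prob_zero_to and the loop bounds are illustrative assumptions, not part of the original answer:

```c
#include <math.h>
#include <stdio.h>

/* Probability that a uniformly random 32-bit pattern, read as an IEEE-754
 * binary32 float, falls in [0, x] for a finite x >= 0 (assumed sketch).
 * Each (sign, exponent) combination covers 2^23 of the 2^32 bit patterns,
 * i.e. each binade [2^n, 2^(n+1)) has probability 1/2^9, and representable
 * values are evenly spaced inside a binade, so a partial binade contributes
 * proportionally. */
static double prob_zero_to(double x)
{
    const double binade_prob = 1.0 / 512.0;   /* 1 / 2^9 */
    double p = 0.0;

    /* Denormals and zero: [0, 2^-126) shares one exponent field (all zeros). */
    double denorm_top = ldexp(1.0, -126);
    if (x < denorm_top)
        return binade_prob * (x / denorm_top);
    p += binade_prob;

    /* Walk the normal binades [2^n, 2^(n+1)) for n = -126 .. 127. */
    for (int n = -126; n <= 127; ++n) {
        double lo = ldexp(1.0, n), hi = ldexp(1.0, n + 1);
        if (x >= hi) {
            p += binade_prob;                           /* whole binade inside [0, x] */
        } else {
            p += binade_prob * ((x - lo) / (hi - lo));  /* partial binade */
            break;
        }
    }
    return p;
}

int main(void)
{
    /* [-10, 10] is symmetric, so double the one-sided probability. */
    printf("P(float in [-10, 10]) ~= %g\n", 2.0 * prob_zero_to(10.0));
    return 0;
}
```

For x = 10.0 this sums the denormal interval plus 129 full normal binades plus a quarter of the [8, 16) binade, about 130.25/512 ≈ 0.254 per sign, so roughly 0.509 for [-10, 10].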

For very accurate probabilities, you'll have to deal with the significand bit patterns of the nearest representable floats to the lower and upper bounds of the interval.
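One way to get an exact count along those lines (a sketch, again assuming IEEE-754 binary32; float_bits is a hypothetical helper): the bit patterns of non-negative floats are ordered the same way as their values, so the number of patterns in [0.0, 10.0] is float_bits(10.0f) - float_bits(0.0f) + 1, and by symmetry the count for [-10, 10] is twice that (counting +0.0 and -0.0 as separate patterns).

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Reinterpret a float's bits as an unsigned 32-bit integer. */
static uint32_t float_bits(float f)
{
    uint32_t u;
    memcpy(&u, &f, sizeof u);
    return u;
}

int main(void)
{
    /* For non-negative IEEE-754 floats, bit-pattern order matches value
     * order, so the number of patterns in [0.0f, 10.0f] is
     * float_bits(10.0f) - float_bits(0.0f) + 1, and float_bits(0.0f) == 0.
     * [-10, 10] contains twice as many patterns by symmetry. */
    uint64_t per_side = (uint64_t)float_bits(10.0f) + 1;
    uint64_t total    = 2 * per_side;

    printf("patterns in [-10, 10]: %llu\n", (unsigned long long)total);
    printf("exact probability    : %.9f\n", (double)total / 4294967296.0);
    return 0;
}
```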

answered Oct 24 '22 by aka.nice