The .NET reference source shows the implementation of `NextBytes()` as:

```csharp
for (int i=0; i<buffer.Length; i++)
{
    buffer[i]=(byte)(InternalSample()%(Byte.MaxValue+1));
}
```
`InternalSample()` provides a value in [0, int.MaxValue), as evidenced by its doc comment and by the fact that `Next()`, which is documented to return that range, simply calls `InternalSample()`.
My concern is that, since `InternalSample()` can produce `int.MaxValue` different values, and that number is not evenly divisible by 256, we should see some slight bias in the resulting bytes, with some values (in this case just 255) occurring less frequently than others.
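A quick way to make this concrete is to count how the 2^31 - 1 equally likely `InternalSample()` outputs distribute over the 256 byte values. This is my own sketch, not code from the reference source:

```csharp
// Count how many of the 2^31 - 1 equally likely InternalSample() outputs
// (0 .. 2^31 - 2) map to each byte value under "% 256". Illustrative only.
long total = int.MaxValue;      // 2,147,483,647 possible outputs
long perByte = total / 256;     // 8,388,607 complete cycles of residues 0..255
long leftover = total % 256;    // 255 leftover values, covering residues 0..254

// Residues 0..254 each get one extra hit from the final partial cycle;
// residue 255 does not, so byte 255 comes up slightly less often.
System.Console.WriteLine($"bytes 0..254: {perByte + 1} each"); // 8,388,608
System.Console.WriteLine($"byte 255:     {perByte}");          // 8,388,607
```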
My question is: is this analysis correct, and does `NextBytes()` therefore produce (slightly) biased output?
FYI, I know `Random` should not be used for cryptographic purposes; I'm asking about its valid use cases (e.g. simulations).
Your analysis is indeed correct, but the defect is one part in about two billion, i.e. 1/2^31, so it is fairly negligible: byte 255 can be produced by 8,388,607 of the 2^31 - 1 possible samples, versus 8,388,608 for each of the other byte values.
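To put a number on that, here is a back-of-the-envelope check (my own sketch, not part of the original answer):

```csharp
// Probability of byte 255 versus the ideal uniform 1/256,
// using the exact occurrence counts over the 2^31 - 1 inputs.
double total = int.MaxValue;        // 2,147,483,647
double pIdeal = 1.0 / 256.0;        // ~3.90625E-3
double pActual = 8388607.0 / total; // byte 255's actual share

// Deficit is ~4.6E-10, on the order of 1/2^31: one part in two billion.
System.Console.WriteLine($"deficit: {pIdeal - pActual:E2}");
```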
The question one should really ask is: is the bias even detectable? For example, how many samples N would one need to establish it with, say, 99% certainty? A standard estimate is N > s^2 z^2 / epsilon^2, where s^2 is the variance of the quantity being measured (here p(1 - p) with p = 1/256), z is the z-score for the desired confidence level (about 2.58 for 99%), and epsilon is the acceptable margin of error (here about 2^-32, half the size of the bias). Plugging in, this would require roughly 4.77 x 10^17 samples, a number so large it will hardly be the most obvious defect.
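For what it's worth, here is that arithmetic spelled out; the parameter choices are my reading of the formula, not spelled out in the original:

```csharp
// Sample size N > s^2 z^2 / eps^2 to detect the bias with 99% confidence.
double p = 1.0 / 256.0;
double s2 = p * (1.0 - p);             // variance of the "byte == 255" indicator
double z = 2.58;                       // z-score for 99% (two-sided) confidence
double eps = System.Math.Pow(2, -32);  // margin of error: half the bias

double n = s2 * z * z / (eps * eps);
System.Console.WriteLine($"N > {n:E2}"); // ~4.8E+17 samples
```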