Java provides a cryptographically secure random number generator, the SecureRandom class in the java.security package.
Is it possible to use this number generator if I consider things like seeding and cyclic re-instantiation of the RNG? Or can I use the number generator 'as it is'?
Has anyone experience with this generator?
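For clarity, by 'as it is' I mean something like the following minimal sketch (no explicit seeding, just the default constructor):

import java.security.SecureRandom;

public class AsIsUsage {
    public static void main(String[] args) {
        // No explicit seeding: the provider self-seeds from the OS entropy source.
        SecureRandom rng = new SecureRandom();
        int roll = rng.nextInt(6) + 1;   // uniform in 1..6
        byte[] token = new byte[16];
        rng.nextBytes(token);            // 128 random bits, e.g. for a session token
        System.out.println(roll);
    }
}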
EDIT: the requirements are:
a) Be statistically independent
b) Be fairly distributed (within statistically expected bounds) over their range
c) Pass various recognized statistical tests
d) Be cryptographically strong.
Because SecureRandom uses a more complex algorithm to make its output unpredictable, generating secure random numbers costs more time and memory than generating ordinary random numbers. The Random class keeps only 48 bits of state, whereas SecureRandom can be seeded with up to 128 bits, so the probability of a repeating sequence is far smaller with SecureRandom.
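As a rough illustration of the seeding difference (my own sketch, not part of the comparison above): Random is seeded from a single long, whereas SecureRandom accepts arbitrarily long seed material, for example 16 bytes (128 bits):

import java.security.SecureRandom;
import java.util.Random;

public class SeedSizes {
    public static void main(String[] args) {
        // java.util.Random: internal state is 48 bits, derived from one long seed.
        Random weak = new Random(123456789L);

        // SecureRandom: seed material can be much longer; 16 bytes = 128 bits here.
        SecureRandom strong = new SecureRandom();
        byte[] seed = strong.generateSeed(16);  // may block while gathering entropy
        strong.setSeed(seed);                   // supplements, rather than replaces, the current seed
        System.out.println(weak.nextInt() + " " + strong.nextInt());
    }
}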
Yes, it is secure, as long as nextInt() is secure for the number of integers retrieved from the stream. According to the documentation of the Random#ints() method: "A pseudorandom int value is generated as if it's the result of calling the method nextInt()."
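For example (a minimal sketch of that point), drawing an int stream directly from a SecureRandom instance:

import java.security.SecureRandom;

public class SecureIntStream {
    public static void main(String[] args) {
        SecureRandom rng = new SecureRandom();
        // ints(streamSize, origin, bound): each value is produced "as if" by nextInt(),
        // so the stream inherits the strength of the underlying SecureRandom.
        int[] draws = rng.ints(10, 0, 52).toArray();
        for (int d : draws) {
            System.out.print(d + " ");
        }
    }
}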
Random Number Generation: these values are not truly "random", because a mathematical formula is used to generate them. The "random" numbers produced by such an algorithm start from a given number (called the "seed"), and the same seed always generates the same sequence of numbers.
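A quick illustration (my own example) with java.util.Random: two instances built from the same seed replay exactly the same sequence:

import java.util.Random;

public class SameSeedSameSequence {
    public static void main(String[] args) {
        Random a = new Random(42L);
        Random b = new Random(42L);
        // Same seed, same formula, therefore the same "random" sequence on every run.
        for (int i = 0; i < 5; i++) {
            System.out.println(a.nextInt(100) + " == " + b.nextInt(100));
        }
    }
}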
RANDOM.ORG offers true random numbers to anyone on the Internet. The randomness comes from atmospheric noise, which for many purposes is better than the pseudo-random number algorithms typically used in computer programs.
As others say, the secure RNG can have limited throughput. To mitigate this you can either stretch that secure randomness by seeding a CPRNG, or you can try to optimise your usage of the bitstream.
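One way to do the stretching in Java, sketched under the assumption that a deterministic SHA1PRNG instance seeded once from a strong source is acceptable for your threat model:

import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public class StretchedRandom {
    public static SecureRandom newStretchedRng() throws NoSuchAlgorithmException {
        // Take a small amount of strong (possibly blocking) entropy once...
        byte[] seed = SecureRandom.getInstanceStrong().generateSeed(32);
        // ...and use it to seed a deterministic generator that won't block afterwards.
        SecureRandom stretched = SecureRandom.getInstance("SHA1PRNG");
        stretched.setSeed(seed);
        return stretched;
    }
}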
To shuffle a pack of cards, for example, you need only 226 bits, but a naive algorithm (calling nextInt(n) for each card) will likely use 1600 or 3200 bits, wasting 85% of your entropy and making you seven times more susceptible to delays.
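For what it's worth, here is the arithmetic behind those figures as I read them: ceil(log2(52!)) is about 226 bits per shuffle, while a naive Fisher-Yates shuffle spends a full 32- or 64-bit word on each of its 51 nextInt(n) calls:

public class ShuffleEntropy {
    public static void main(String[] args) {
        // Information content of one permutation of 52 cards: log2(52!) ~ 225.6 bits.
        double bitsNeeded = 0;
        for (int k = 2; k <= 52; k++) {
            bitsNeeded += Math.log(k) / Math.log(2);
        }
        // A Fisher-Yates shuffle makes 51 nextInt(n) calls; each call burns at least
        // one 32-bit word (roughly 1600 bits total) or a 64-bit word (roughly 3200).
        System.out.printf("needed: %.1f bits, naive: %d to %d bits%n",
                bitsNeeded, 51 * 32, 51 * 64);
    }
}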
For this situation I think the Doctor Jacques method would be appropriate.
To go with that, here's some performance analysis against progressively more costly entropy sources (also contains code):
Bit recycling for scaling random number generators
I would lean towards efficient usage rather than stretching, because I think it would be a lot easier to prove the fairness of an efficient consumer of a trustworthy entropy stream, than to prove the fairness of any drawing method with a well-seeded PRNG.
EDIT2: I don't really know Java, but I put this together:
public class MySecureRandom extends java.security.SecureRandom {

    // Residual entropy carried over between calls:
    // r is uniformly distributed in [0, m), so leftover bits are never thrown away.
    private long m = 1;
    private long r = 0;

    @Override
    public final int nextInt(int n) {
        while (true) {
            if (m < 0x80000000L) {
                // Not enough residual range left: pull 32 fresh bits from SecureRandom
                // and append them to the residual (shifted into the unsigned range).
                m <<= 32;
                r <<= 32;
                r += (long) next(32) - Integer.MIN_VALUE;
            }
            long q = m / n;
            if (r < n * q) {
                // Accept: split r into the result (r % n) and a smaller residual (r / n, range q).
                int x = (int) (r % n);
                m = q;
                r /= n;
                return x;
            }
            // Reject: keep only the tail of the range and try again.
            m -= n * q;
            r -= n * q;
        }
    }
}
This does away with the greedy default uniform [0,n-1] generator and replaces it with a modified Doctor Jacques version. Timing a card-shuffle range of values shows almost a 6x speed-up over SecureRandom.nextInt(n).
My previous version of this code (only a 2x speed-up) assumed that SecureRandom.next(b) was efficient, but it turns out that call was discarding entropy and dragging the whole loop down. This version manages its own chunking.
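Here is a usage sketch (my own) pairing the class above with a standard Fisher-Yates shuffle, which is the card-shuffle timing scenario mentioned:

import java.util.Arrays;

public class ShuffleDemo {
    public static void main(String[] args) {
        MySecureRandom rng = new MySecureRandom();
        int[] deck = new int[52];
        for (int i = 0; i < deck.length; i++) {
            deck[i] = i;
        }
        // Fisher-Yates: nextInt(i + 1) now recycles leftover entropy instead of discarding it.
        for (int i = deck.length - 1; i > 0; i--) {
            int j = rng.nextInt(i + 1);
            int tmp = deck[i];
            deck[i] = deck[j];
            deck[j] = tmp;
        }
        System.out.println(Arrays.toString(deck));
    }
}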
The documentation for SecureRandom says it may block while waiting for the system to gather more entropy (for instance, on Linux it draws from /dev/random). So if you are going to use it heavily, you may need some help from hardware: installing a random number generator card (a hardware device that generates true random numbers, not pseudo-random ones) lets your system produce random numbers fast enough that your program doesn't block.
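If blocking is the concern and extra hardware isn't an option, one software-only mitigation (platform- and JDK-dependent, so treat the algorithm name as an assumption) is to request a non-blocking provider explicitly:

import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public class NonBlockingRng {
    public static SecureRandom create() {
        try {
            // Backed by /dev/urandom on Linux, so it does not stall waiting for entropy.
            return SecureRandom.getInstance("NativePRNGNonBlocking");
        } catch (NoSuchAlgorithmException e) {
            // The algorithm name is platform-dependent; fall back to the default.
            return new SecureRandom();
        }
    }
}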