
how to find the "true" entropy of std::random_device?

Tags: c++, random, c++11

I want to check whether my implementation of std::random_device has non-zero entropy (i.e. is non-deterministic), using the std::random_device::entropy() member function. However, according to cppreference.com:

"This function is not fully implemented in some standard libraries. For example, gcc and clang always return zero even though the device is non-deterministic. In comparison, Visual C++ always returns 32, and boost.random returns 10."

Is there any way of finding the real entropy? In particular, do modern computers (MacBook Pro/iMac etc.) have a non-deterministic source of randomness, e.g. one based on heat dissipation monitors?
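
For reference, a minimal sketch of the check I am trying to make (the printed entropy value is implementation-defined, as the quote above notes):

    #include <iostream>
    #include <random>

    int main() {
        std::random_device rd;
        // Implementation-defined: gcc/clang typically report 0 here even when
        // the device is backed by a non-deterministic source.
        std::cout << "reported entropy: " << rd.entropy() << '\n';
        std::cout << "sample value:     " << rd() << '\n';
    }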

vsoftco asked Feb 08 '15




4 Answers

I recommend reading this article:

Myths about /dev/urandom

§ 26.5.6

A random_device uniform random number generator produces non-deterministic random numbers.

If implementation limitations prevent generating non-deterministic random numbers, the implementation may employ a random number engine.

So basically it will try to use the system's "true" random number generator: on Linux /dev/{u}random, on Windows RtlGenRandom.

A different point is whether you trust those sources of randomness, since they depend on internal noise or are closed implementations.

Another issue is how you measure the quality of the entropy; as you know, that is one of the biggest problems when trying to find good RNGs.

One estimator could report extremely good entropy while another reports rather poor entropy for the same source.

Entropy Estimation

In various science/engineering applications, such as independent component analysis, image analysis, genetic analysis, speech recognition, manifold learning, and time delay estimation it is useful to estimate the differential entropy of a system or process, given some observations.

As it says, you ultimately must rely on observations, and those can be wrong.

If you think the internal RNG is not good enough, you can always buy a hardware device for that purpose. Wikipedia has a list of vendors; you can check reviews about them on the internet.

Performance

One point you must consider is performance within your application when using real random number generators. One common technique is to seed a Mersenne Twister with a number obtained from /dev/random.
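
A minimal sketch of that technique, assuming std::mt19937 is acceptable as the fast engine (a single 32-bit seed is shown for brevity; a std::seed_seq could seed the generator's full state):

    #include <random>

    // Pay the cost of the "true" source once for the seed, then use a fast
    // PRNG (Mersenne Twister) for the bulk of the numbers.
    int main() {
        std::random_device rd;                      // on Linux typically /dev/urandom
        std::mt19937 gen(rd());                     // seed the Mersenne Twister
        std::uniform_int_distribution<int> dist(1, 6);
        int roll = dist(gen);                       // fast pseudo-random draws
        (void)roll;
    }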

If the user can't access your system physically, you will need to balance reliability with availability; a system with security holes is as bad as one that doesn't work. At the end of the day, you must keep your important data encrypted.

Edit 1: As suggested, I have moved the article to the top of my answer; it is a good read. Thanks for the hint :-).

Jose Palma answered Oct 04 '22


All the standard gives you is what you've already seen. You would need to know something about how a given standard library implements random_device in order to answer this question. For example, in Visual Studio 2013 Update 4, random_device forwards to rand_s which forwards to RtlGenRandom, which may actually be (always?) a cryptographically secure pseudorandom number generator depending on your Windows version and the hardware available.
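
For illustration, a minimal MSVC-specific sketch of calling rand_s directly, the function Visual Studio's random_device forwards to (it requires defining _CRT_RAND_S before including <stdlib.h>):

    // MSVC-specific sketch.
    #define _CRT_RAND_S
    #include <stdlib.h>
    #include <stdio.h>

    int main() {
        unsigned int value = 0;
        if (rand_s(&value) == 0) {      // rand_s returns 0 on success
            printf("%u\n", value);
        }
    }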

If you don't trust the platform to provide a good source of entropy, then you should use your own cryptographically secure PRNG, such as one based on AES. That said, platform vendors have strong incentives for their random numbers to actually be random, and embedding the PRNG into your app means that the PRNG can't be updated as easily in the event it is found to be insecure. Only you can decide on that tradeoff for yourself :)

Billy ONeal answered Oct 04 '22


Entropy is just one measure of RNG quality (and true, exact entropy is impossible to measure). For a practical and reasonably-accurate measurement of your std::random_device's random number quality, consider using a standard randomness test suite such as TestU01, diehard, or its successor dieharder. These run a battery of statistical tests designed to stress your RNG, ensuring it produces statistically random data.
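
For example, a minimal sketch that streams raw words from std::random_device to stdout so such a suite can consume them (the dieharder invocation in the comment assumes its raw-stdin generator is number 200 on your install; check your version):

    // Stream raw 32-bit words to stdout, e.g. (hypothetical binary name):
    //   ./dump | dieharder -a -g 200
    #include <cstdio>
    #include <random>

    int main() {
        std::random_device rd;
        for (;;) {
            unsigned int x = rd();                  // result_type is unsigned int
            std::fwrite(&x, sizeof x, 1, stdout);
        }
    }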

Note that statistical randomness by itself does not certify that the RNG is suitable for cryptographic applications.


Many modern computers have easily-accessible sources of hardware randomness, namely the analog-to-digital converters found in the audio input, camera, and various sensors. These exhibit low-level thermal or electrical noise which can be exploited to produce high-quality random data. However, no OS that I know of actually uses these sensors to supply their system random-number sources (such as /dev/[u]random), since the bitrate of such physical random number sources tends to be very low.

Instead, OS-provided random number sources tend to be seeded by hardware counters and events, such as page faults, device driver events, and other sources of unpredictability. In theory, these events might be fully predictable given the precise hardware state (since they aren't based on e.g. quantum or thermal noise), but in practice they are sufficiently unpredictable that they produce good random data.
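
For illustration, a minimal sketch (POSIX assumption) of reading that OS-provided source directly; on Linux, libstdc++'s std::random_device typically reads the same device:

    #include <cstdint>
    #include <fstream>
    #include <iostream>

    int main() {
        // Pull 8 bytes straight from the kernel's random source.
        std::ifstream dev("/dev/urandom", std::ios::binary);
        std::uint64_t value = 0;
        dev.read(reinterpret_cast<char*>(&value), sizeof value);
        std::cout << value << '\n';
    }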

nneonneo answered Oct 04 '22


Entropy as a scientific term is misused when describing random numbers. Complexity might be a better term. Entropy in physics is defined as the logarithm of the number of available quantum states (not useful in RNG), and entropy in information theory is defined by the Shannon entropy, but that is geared towards the other extreme - how to put as much information into a noisy bit stream, not how to minimize the information.

For example, the digits of Pi look random, but the actual entropy of the digits is zero once you know that they derive from Pi. Increasing "Entropy" in RNG is basically a question of making the source of the numbers as obscure as possible.
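
To make the information-theoretic notion concrete, a minimal sketch that estimates the Shannon entropy (bits per byte) of a sample drawn from std::random_device; note the caveat above: a deterministic source such as the digits of Pi can score just as high, so this measures statistical complexity, not true unpredictability:

    #include <cmath>
    #include <cstddef>
    #include <iostream>
    #include <random>
    #include <vector>

    int main() {
        std::random_device rd;
        constexpr std::size_t n = 1 << 20;          // 1 MiB sample
        std::vector<std::size_t> counts(256, 0);
        for (std::size_t i = 0; i < n; ++i)
            ++counts[rd() & 0xFF];                  // keep the low byte

        // H = -sum(p * log2(p)) over the observed byte frequencies.
        double h = 0.0;
        for (std::size_t c : counts) {
            if (c == 0) continue;
            double p = static_cast<double>(c) / n;
            h -= p * std::log2(p);
        }
        std::cout << "Shannon entropy: " << h << " bits per byte\n";
    }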

Mark Lakata answered Oct 04 '22