I was confused when I read
"Knowing how Linux behaves during entropy starvation (and being able to find the cause) allows us to efficiently use our server hardware."
in a blog post. I looked up the meaning of 'entropy' in the context of Linux, but it is still not clear to me what "entropy starvation" is or what the quoted sentence means.
Some applications, notably cryptographic ones, need random data. In cryptography, it is very important that the data be truly random, or at least unpredictable (even in part) to any attacker.
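For instance, here is how an application might ask the operating system for cryptographic-quality randomness. This is a minimal Python sketch; on Linux, the `secrets` module ultimately draws on the kernel facility described below (via the getrandom() system call or /dev/urandom):

```python
# Obtain cryptographic-quality random bytes from the OS.
# On Linux, this draws on the kernel's randomness pool.
import secrets

key = secrets.token_bytes(32)  # 32 unpredictable bytes, e.g. for a key
print(key.hex())
```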
To supply this data, the system keeps a pool of random data, its entropy pool, which it fills from various sources of randomness on the system: precise timing of events that are somewhat unpredictable (keys pressed by users, interrupts from external devices), noise from a microphone, or, on some processors, dedicated hardware for generating random values. The incoming somewhat-random data is mixed together to produce better-quality entropy.
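On Linux you can inspect how much entropy the kernel currently estimates it has. A minimal sketch, assuming a Linux system where the standard procfs file is present (the value is only the kernel's own estimate, in bits):

```python
# Read the kernel's current entropy estimate (in bits) from procfs.
def entropy_available() -> int:
    with open("/proc/sys/kernel/random/entropy_avail") as f:
        return int(f.read().strip())

print(f"Kernel entropy estimate: {entropy_available()} bits")
```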
These sources of randomness can only supply data at limited rates. If a system does a lot of work that needs random data, it can consume random data faster than it is gathered. Then software that wants random data has to wait for more to be generated, or it has to accept lower-quality data. This is called entropy starvation or entropy depletion.
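You can see both behaviors in the kernel's two interfaces to the pool. Historically on Linux, reads from /dev/random would block when the entropy estimate ran low, while /dev/urandom always returned immediately with pseudorandom data stretched from the pool; since kernel 5.6 both return immediately once the pool has been seeded at boot. A small sketch that times a read from each (assumes a Linux system):

```python
import time

def timed_read(path: str, nbytes: int = 32) -> float:
    """Read nbytes from a device and return elapsed seconds."""
    start = time.monotonic()
    with open(path, "rb") as f:
        f.read(nbytes)
    return time.monotonic() - start

# On an old, entropy-starved kernel, the /dev/random read may stall
# for seconds or longer; /dev/urandom never blocks.
for dev in ("/dev/urandom", "/dev/random"):
    print(f"{dev}: read took {timed_read(dev):.6f}s")
```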