
What is the computer science definition of entropy?

I've recently started a course on data compression at my university. However, I find the use of the term "entropy" as it applies to computer science rather ambiguous. As far as I can tell, it roughly translates to the "randomness" of a system or structure.

What is the proper definition of computer science "entropy"?

asked Feb 04 '09 by fluffels

People also ask

What is entropy with example?

In thermodynamics, entropy measures the thermal energy of a system per unit temperature that is unavailable for doing useful work. Everyday examples include a campfire, ice melting, salt or sugar dissolving, popcorn popping, and water boiling.

What does entropy mean in cyber security?

A measure of the amount of uncertainty an attacker faces to determine the value of a secret.
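Concretely, if a secret is drawn uniformly at random from N possible values, the attacker faces log2(N) bits of entropy. A minimal sketch of that arithmetic (the key size and password parameters below are just illustrative assumptions):

    import math

    # A 128-bit key chosen uniformly at random: 2**128 equally likely values.
    key_entropy_bits = math.log2(2 ** 128)       # 128.0 bits

    # A hypothetical 8-character password over the 26 lowercase letters.
    password_entropy_bits = 8 * math.log2(26)    # ~37.6 bits

    print(key_entropy_bits, password_entropy_bits)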

What is entropy and its unit?

Entropy is a measure of the randomness or disorder of a system: the greater the randomness, the higher the entropy. It is a state function and an extensive property. Its unit is J K⁻¹ mol⁻¹.
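A rough worked example (standard textbook figures, quoted from memory): melting one mole of ice at 0 °C absorbs about 6.01 kJ of heat reversibly, so the entropy change is

    ΔS = q_rev / T ≈ 6010 J mol⁻¹ / 273 K ≈ 22 J K⁻¹ mol⁻¹

which is why the unit comes out as joules per kelvin per mole.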

What does data entropy mean?

In information theory, the entropy of a random variable is the average level of “information“, “surprise”, or “uncertainty” inherent in the variable's possible outcomes. That is, the more certain or the more deterministic an event is, the less information it will contain.
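A coin flip makes this concrete: a fair coin carries 1 bit of entropy, a heavily biased coin much less, and a two-headed coin (a certain outcome) none at all. A minimal sketch in Python:

    import math

    def bernoulli_entropy(p):
        """Entropy, in bits, of a coin that lands heads with probability p."""
        if p in (0.0, 1.0):
            return 0.0          # a certain outcome carries no information
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    print(bernoulli_entropy(0.5))   # 1.0 bit   (fair coin)
    print(bernoulli_entropy(0.9))   # ~0.47 bits (biased coin)
    print(bernoulli_entropy(1.0))   # 0.0 bits  (certain outcome)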


1 Answer

Entropy can mean different things:

Computing

In computing, entropy is the randomness collected by an operating system or application for use in cryptography or other uses that require random data. This randomness is often collected from hardware sources, either pre-existing ones such as mouse movements or specially provided randomness generators.
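In practice you rarely touch the entropy pool directly; you ask the operating system for random bytes seeded from it. A small sketch using Python's standard library (os.urandom and secrets, both of which draw on the OS CSPRNG):

    import os
    import secrets

    # 16 random bytes from the OS entropy-seeded CSPRNG (e.g. /dev/urandom on Linux).
    raw = os.urandom(16)

    # The secrets module wraps the same source with a cryptography-oriented API.
    token = secrets.token_hex(16)    # 32 hex characters, 128 bits of randomness

    print(raw.hex(), token)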

Information theory

In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information contained in a message, usually in units such as bits. Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable.
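For a discrete random variable X taking values with probabilities p_i, the Shannon entropy is H(X) = −Σ p_i log2 p_i bits. A short illustrative function that estimates it from a message's symbol frequencies (an empirical estimate of the per-symbol entropy, not the true source entropy):

    import math
    from collections import Counter

    def shannon_entropy(message):
        """Estimate entropy (bits per symbol) from the symbol frequencies of a message."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    print(shannon_entropy("aaaaaaaa"))   # 0.0 -- a single repeated symbol, no uncertainty
    print(shannon_entropy("abababab"))   # 1.0 -- two equally likely symbols
    print(shannon_entropy("abcdefgh"))   # 3.0 -- eight equally likely symbols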

Entropy in data compression

Entropy in data compression may denote the randomness of the data you are feeding into the compression algorithm. The higher the entropy, the less the data can be compressed: the more random the text is, the less you can compress it.
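You can see this directly by compressing low-entropy versus high-entropy data with any general-purpose compressor. A quick sketch using Python's zlib (exact output sizes vary with the zlib version and compression level):

    import os
    import zlib

    low_entropy = b"ab" * 50_000           # highly repetitive, low entropy
    high_entropy = os.urandom(100_000)     # random bytes, close to 8 bits per byte

    print(len(zlib.compress(low_entropy)))    # a few hundred bytes at most
    print(len(zlib.compress(high_entropy)))   # ~100,000 bytes -- barely shrinks, often grows slightly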

Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication: treating messages to be encoded as a sequence of independent and identically-distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.
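As a worked example of that bound: a source emitting one of 8 equally likely symbols has entropy 3 bits per symbol, so encoding it into a binary alphabet needs at least 3 / log2(2) = 3 symbols per source symbol on average, while encoding it into hexadecimal digits (16 symbols) needs at least 3 / log2(16) = 0.75. A tiny sketch of the same arithmetic:

    import math

    source_entropy_bits = 3.0            # e.g. 8 equally likely source symbols
    for alphabet_size in (2, 16, 256):
        min_avg_length = source_entropy_bits / math.log2(alphabet_size)
        print(alphabet_size, min_avg_length)   # 2 -> 3.0, 16 -> 0.75, 256 -> 0.375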

answered by Niyaz