I know about Wikipedia and MacKay's Information Theory, Inference, and Learning Algorithms (is it appropriate as a textbook?). I'm looking for a textbook that starts with Shannon's entropy and works through conditional entropy and mutual information... Any ideas? If you are following such a course at your university, which textbook does it use?
Thanks.
Information theory is a branch of mathematics that overlaps with communications engineering, biology, medical science, sociology, and psychology. The theory is devoted to discovering and exploring the mathematical laws that govern the behavior of data as it is transferred, stored, or retrieved.
The goal of information theory is to quantify the amount of information contained in a signal, as well as the capacity of a channel or communication medium for sending information.
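To make the quantities in the original question concrete (entropy, conditional entropy, mutual information), here is a minimal Python sketch that computes them for a made-up joint distribution of two binary variables; the numbers are arbitrary and chosen only for illustration.

```python
import numpy as np

# Toy joint distribution p(x, y) over two binary variables
# (illustrative numbers only; they just need to sum to 1).
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def entropy(p):
    """Shannon entropy H = -sum p * log2(p), in bits (0 log 0 treated as 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X = entropy(p_x)                # H(X)
H_Y = entropy(p_y)                # H(Y)
H_XY = entropy(p_xy.ravel())      # joint entropy H(X, Y)
H_Y_given_X = H_XY - H_X          # conditional entropy H(Y|X) = H(X,Y) - H(X)
I_XY = H_X + H_Y - H_XY           # mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X)   = {H_X:.4f} bits")
print(f"H(Y|X) = {H_Y_given_X:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")
```

Any introductory textbook in this area (MacKay, Cover & Thomas) derives exactly these identities in its opening chapters, so this is a good litmus test for whether a book starts where you want it to.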
Information theory was created to find practical ways to design better, more efficient codes and to establish the fundamental limits on how far data can be compressed and how reliably it can be transmitted over a noisy channel. Virtually every piece of digital information today passes through codes that were analyzed and improved using Shannon's results.
At its core, information theory asks what information is, mathematically. The term "information" may conjure up images of computers and networks, but the human genome is also information, and logical thinking and judgment can likewise be regarded as forms of information processing.
I used the following textbook during my studies in CS at EPFL. IMO, it's well written, with good explanations, and covers more than enough for an introduction to the domain.
Elements of Information Theory by Thomas M. Cover and Joy A. Thomas
EDIT: For further reading, here are some other readings that my professor recommended. I haven't read them (shame on me), so I can't say whether they're good or not.