Information theory comes into play wherever encoding and decoding are present, for example in compression (multimedia) and cryptography.
In information theory we encounter terms like "entropy", "self-information", and "mutual information", and the entire subject is built on these terms. They sound like nothing more than abstractions; frankly, they don't really make sense at first.
Is there any book/material/explanation (your own, if you can give one) that explains these things in a practical way?
EDIT:
An Introduction to Information Theory: Symbols, Signals & Noise by John Robinson Pierce is the book that explains it the way I want (practically). It's very good, and I have started reading it.
Shannon's original paper, "A Mathematical Theory of Communication", is one very important resource for studying this theory. Nobody, and I mean nobody, should miss it.
By reading it you will understand how Shannon arrived at the theory, which should clear up most of the doubts.
Also, studying the workings of the Huffman compression algorithm will be very helpful; a sketch follows below.
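As a concrete illustration, here is a minimal Python sketch of Huffman coding (the code is my own, not taken from any of the resources above): it repeatedly merges the two least frequent subtrees, so frequent symbols end up with short codewords and the average code length approaches the entropy of the symbol distribution.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table: frequent symbols get short codewords."""
    freq = Counter(text)
    # Each heap entry: (frequency, unique tiebreaker, {symbol: codeword-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        # Merging prepends a 0 bit to one side's codes and a 1 to the other's.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

message = "this is an example of huffman coding"
codes = huffman_codes(message)
bits = "".join(codes[ch] for ch in message)
print(f"{len(message) * 8} bits as ASCII -> {len(bits)} bits Huffman-coded")
```

The interesting part is comparing the output length with the fixed 8 bits per character of plain ASCII: that gap is exactly what entropy quantifies.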
EDIT:
An Introduction to Information Theory
John R. Pierce
seems good according to the Amazon reviews (I haven't tried it).
[found by Googling "information theory layman"]
My own view of "Information Theory" is that it's essentially just applied maths/statistics, but because it's applied to communications/signals it has come to be called "Information Theory".
The best way to start understanding the concepts is to set yourself a real task. For example, take a few pages of your favourite blog, save them as a text file, and then attempt to reduce the size of the file while ensuring you can still reconstruct the file completely (i.e. lossless compression). You'll start, for example, by replacing all the instances of "and" with a shorter token, as in the sketch below...
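A rough Python sketch of that exercise (the file name blog.txt is a placeholder for your saved text, and the token byte and the word being replaced are my own arbitrary assumptions):

```python
import math
from collections import Counter

# Placeholder file name; substitute the text file you saved.
with open("blog.txt", encoding="utf-8") as f:
    text = f.read()

# Empirical per-character entropy: a lower bound (in bits per character)
# on any lossless code that treats characters independently.
counts = Counter(text)
total = len(text)
entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
print(f"{entropy:.2f} bits/char vs. 8 bits/char for plain ASCII")

# A crude first compression step in the spirit of the exercise:
# replace the common word " and " with a one-byte token, provided the
# token byte never occurs in the text (so decoding stays unambiguous).
TOKEN = "\x01"
assert TOKEN not in text
compressed = text.replace(" and ", TOKEN)
restored = compressed.replace(TOKEN, " and ")
assert restored == text  # lossless round trip
print(f"{len(text)} chars -> {len(compressed)} chars")
```

The entropy figure tells you roughly how far a character-by-character code can shrink the file; the substitution step shows the flavour of exploiting redundancy across whole words.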
I'm always of the opinion that learning by doing is the best approach.