Calculation of mutual information in R

I am having trouble interpreting the results of the mi.plugin() (or mi.empirical()) function from the entropy package. As far as I understand, an MI of 0 means that the two variables you are comparing are completely independent, and as MI increases, the association between the two variables is increasingly non-random.

Why, then, do I get a value of 0 when running the following in R (using the {entropy} package):

mi.plugin( rbind( c(1, 2, 3), c(1, 2, 3) ) )

when I'm comparing two vectors that are exactly the same?

I assume my confusion is based on a theoretical misunderstanding on my part; can someone tell me where I've gone wrong?

Thanks in advance.

asked Sep 11 '14 by lemhop

People also ask

How is mutual information calculated?

The mutual information can also be calculated as the KL divergence between the joint probability distribution and the product of the marginal probabilities for each variable. — Page 57, Pattern Recognition and Machine Learning, 2006. This can be stated formally as follows: I(X ; Y) = KL(p(X, Y) || p(X) * p(Y))
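
As a quick check of that identity, here is a small base-R sketch that computes MI as the KL divergence between a made-up 2x2 joint distribution and the product of its marginals:

joint <- matrix(c(0.3, 0.2,
                  0.1, 0.4), nrow = 2, byrow = TRUE)  # hypothetical p(X, Y)

px <- rowSums(joint)      # marginal p(X)
py <- colSums(joint)      # marginal p(Y)
indep <- outer(px, py)    # product of marginals p(X) * p(Y)

sum(joint * log(joint / indep))  # I(X;Y) = KL(p(X,Y) || p(X) p(Y)), in nats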

How do you calculate entropy in R?

It is given by the formula H = −∑ p_i log(p_i), where p_i is the probability of character number i showing up in a stream of characters of the given "script". The entropy ranges from 0 to Inf.
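
For example, the empirical entropy of a short, made-up character stream can be computed by hand and checked against entropy.empirical() from the entropy package:

library(entropy)

chars  <- c("a", "a", "b", "c", "c", "c")  # made-up character stream
counts <- table(chars)                     # frequency of each character
p      <- counts / sum(counts)             # empirical probabilities p_i

-sum(p * log(p))           # H = -sum(p_i * log(p_i)), in nats
entropy.empirical(counts)  # same value via the entropy package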

Is mutual information a correlation?

Mutual information (MI) is often used as a generalized correlation measure. It is not clear how much MI adds beyond standard (robust) correlation measures or regression model based association measures.
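
One way to see what MI adds: for a nonlinear but deterministic relationship, Pearson correlation can be near zero while MI is clearly positive. A sketch using the infotheo package, with arbitrary data:

library(infotheo)

x <- seq(-3, 3, length.out = 200)
y <- x^2   # deterministic but non-monotonic relation

cor(x, y)                                     # ~0: correlation misses the dependence
mutinformation(discretize(x), discretize(y))  # clearly positive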

What is mutual information in ITC?

Mutual information is one of many quantities that measure how much one random variable tells us about another. It is a dimensionless quantity, generally expressed in units of bits, and can be thought of as the reduction in uncertainty about one random variable given knowledge of another.
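
That "reduction in uncertainty" reading, I(X;Y) = H(X) − H(X|Y), can be checked numerically with the infotheo package (the toy vectors below are arbitrary):

library(infotheo)

x <- c(1, 1, 2, 2, 3, 3)
y <- c(1, 2, 1, 2, 1, 2)  # constructed to be independent of x

entropy(x) - condentropy(x, y)  # reduction in uncertainty about x given y
mutinformation(x, y)            # same quantity computed directly (0 here)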


1 Answer

Use mutinformation(x, y) from the infotheo package, which takes the two vectors directly. mi.plugin() from the entropy package instead expects a joint count table; the rows of rbind(c(1, 2, 3), c(1, 2, 3)) are proportional, so that table describes two independent variables, and an MI of 0 is the correct answer for it.

> mutinformation(c(1, 2, 3), c(1, 2, 3) ) 
[1] 1.098612

> mutinformation(seq(1:5),seq(1:5))
[1] 1.609438

and the normalized mutual information (MI scaled by the entropy of the variables) will be 1, as expected for identical vectors.
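
For completeness, a sketch of how the original mi.plugin() call can be fixed, assuming the entropy package: build a joint count table with table() and pass that, rather than the two raw vectors:

library(entropy)

x <- c(1, 2, 3)
y <- c(1, 2, 3)

joint <- table(x, y)  # proper 3x3 joint count table (diagonal, since x == y)
mi.plugin(joint)      # ~1.0986 = log(3) nats, matching mutinformation(x, y)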

answered Oct 14 '22 by Monicam