
Mutual Information


The mutual information between two discrete random variables X and Y is defined to be

 I(X;Y) = \sum_{x \in X} \sum_{y \in Y} P(x,y) \log_2 \frac{P(x,y)}{P(x)\,P(y)}
(1)

bits. Additional properties are

I(X;Y) = I(Y;X)
(2)
I(X;Y) \geq 0,
(3)

and

 I(X;Y) = H(X) + H(Y) - H(X,Y),
(4)

where H(X) is the entropy of the random variable X and H(X,Y) is the joint entropy of these variables.
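
As a sketch illustrating equations (1) and (4), the following Python snippet (not part of the original entry) computes the mutual information of a small, hypothetical joint distribution directly from the definition and checks the result against H(X)+H(Y)-H(X,Y). The function names and the example distribution are assumptions chosen purely for illustration.

    import numpy as np

    def entropy(p):
        """Shannon entropy in bits of a probability vector (zero terms contribute 0)."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def mutual_information(joint):
        """Mutual information I(X;Y) in bits from a joint probability matrix P(x, y),
        summing P(x,y) log2[P(x,y) / (P(x) P(y))] over all cells with nonzero mass."""
        px = joint.sum(axis=1)   # marginal P(x)
        py = joint.sum(axis=0)   # marginal P(y)
        mi = 0.0
        for i in range(joint.shape[0]):
            for j in range(joint.shape[1]):
                if joint[i, j] > 0:
                    mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
        return mi

    # Hypothetical joint distribution P(x, y) for two binary variables
    P = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

    I_xy = mutual_information(P)
    # Check against equation (4): I(X;Y) = H(X) + H(Y) - H(X,Y)
    check = entropy(P.sum(axis=1)) + entropy(P.sum(axis=0)) - entropy(P.flatten())
    print(I_xy, check)   # both approximately 0.278 bits

For this example both computations give about 0.278 bits, and the symmetry and nonnegativity of equations (2) and (3) hold as expected.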


See also

Entropy

This entry contributed by Erik G. Miller




Cite this as:

Miller, Erik G. "Mutual Information." From MathWorld--A Wolfram Web Resource, created by Eric W. Weisstein. https://mathworld.wolfram.com/MutualInformation.html
