The mutual information between two discrete random variables X and Y is defined to be

I(X;Y) = \sum_{x \in X} \sum_{y \in Y} P(x,y) \log_2 \frac{P(x,y)}{P(x)\,P(y)}   (1)

bits.
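As a concrete illustration of equation (1), here is a minimal Python sketch; the joint probabilities below are invented for the example and are not taken from the entry itself.

    import math

    # Hypothetical joint distribution P(x, y) over binary X and Y
    # (illustrative values only, not from the source text).
    joint = {
        (0, 0): 0.4,
        (0, 1): 0.1,
        (1, 0): 0.1,
        (1, 1): 0.4,
    }

    # Marginals P(x) and P(y), obtained by summing out the other variable.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p

    # Equation (1): I(X;Y) = sum_{x,y} P(x,y) log2[P(x,y) / (P(x) P(y))].
    # Terms with P(x,y) = 0 are skipped; they contribute nothing in the limit.
    mi = sum(p * math.log2(p / (px[x] * py[y]))
             for (x, y), p in joint.items() if p > 0)

    print(f"I(X;Y) = {mi:.4f} bits")  # about 0.2781 bits for these values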
Additional properties are

I(X;Y) = I(Y;X)   (2)

I(X;Y) \geq 0   (3)

and

I(X;Y) = H(X) + H(Y) - H(X,Y)   (4)
where H(X) is the entropy of the random variable X and H(X,Y) is the joint entropy of these variables.
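Identity (4) is not derived in the entry, but it follows in one step from definition (1) by expanding the logarithm of the quotient:

I(X;Y) = \sum_{x \in X} \sum_{y \in Y} P(x,y) \bigl[\log_2 P(x,y) - \log_2 P(x) - \log_2 P(y)\bigr] = -H(X,Y) + H(X) + H(Y),

using \sum_{y} P(x,y) = P(x), \sum_{x} P(x,y) = P(y), H(X) = -\sum_{x} P(x) \log_2 P(x), and H(X,Y) = -\sum_{x,y} P(x,y) \log_2 P(x,y).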
See also
Entropy
This entry contributed by Erik G. Miller
Cite this as:
Miller, Erik G. "Mutual Information." From MathWorld--A Wolfram Web Resource, created by Eric W. Weisstein. https://mathworld.wolfram.com/MutualInformation.html