Relative Entropy


Let a discrete distribution have probability function p_k, and let a second discrete distribution have probability function q_k. Then the relative entropy of p with respect to q, also called the Kullback-Leibler distance, is defined by

 d(p,q) = \sum_k p_k \log_2\!\left(\frac{p_k}{q_k}\right).

Although d(p,q) ≠ d(q,p), so relative entropy is not a true metric, it satisfies many important mathematical properties. For example, it is a convex function of p_k, is always nonnegative, and equals zero if and only if p_k = q_k for all k.
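As a concrete illustration, here is a minimal Python sketch of the definition above (the function name relative_entropy and the example distributions are illustrative choices, not part of the original article); it assumes p and q are finite probability vectors with q_k > 0 wherever p_k > 0, since otherwise the sum diverges.

    import numpy as np

    def relative_entropy(p, q):
        # d(p, q) = sum_k p_k * log2(p_k / q_k), measured in bits.
        # Convention: terms with p_k = 0 contribute 0 (0 log 0 = 0).
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

    p = [0.5, 0.5]
    q = [0.9, 0.1]
    print(relative_entropy(p, q))  # ~0.737 bits
    print(relative_entropy(q, p))  # ~0.531 bits: d(p,q) != d(q,p), so not a metric
    print(relative_entropy(p, p))  # 0.0: vanishes exactly when the distributions agree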

Relative entropy is an important concept in quantum information theory as well as in statistical mechanics (Qian 2000).


See also

Entropy

References

Cover, T. M. and Thomas, J. A. Elements of Information Theory. New York: Wiley, 1991.

Qian, H. "Relative Entropy: Free Energy Associated with Equilibrium Fluctuations and Nonequilibrium Deviations." 8 Jul 2000. http://arxiv.org/abs/math-ph/0007010.

Cite this as:

Weisstein, Eric W. "Relative Entropy." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/RelativeEntropy.html
