
Differential Entropy


Differential entropy differs from normal or absolute entropy in that the random variable need not be discrete. Given a continuous random variable X with a probability density function f_X(x), the differential entropy h(X) is defined as

 h(X) = -\int_{-\infty}^{\infty} f_X(x)\,\ln f_X(x)\,dx = -\langle\ln f_X(x)\rangle.
(1)
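As a quick numerical illustration of definition (1), a Riemann-sum approximation of h(X) for a standard normal density can be compared against the closed form (1/2)ln(2πe) ≈ 1.4189 nats. This is a sketch using NumPy; the grid bounds and spacing are arbitrary choices.

```python
import math
import numpy as np

# Approximate h(X) = -∫ f(x) ln f(x) dx on a uniform grid for the
# standard normal density (grid bounds/spacing are arbitrary choices).
x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # N(0,1) density
h_numeric = -np.sum(f * np.log(f)) * dx

h_exact = 0.5 * math.log(2 * math.pi * math.e)  # closed form for N(0,1)
print(h_numeric, h_exact)
```

The two values agree to many decimal places, since the tails beyond |x| = 10 contribute negligibly to the integral.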

When we have a continuous random vector X that consists of n random variables X_1, X_2, ..., X_n, the differential entropy of X is defined as the n-fold integral

h(\mathbf{X}) = -\int_{-\infty}^{\infty} f_{\mathbf{X}}(\mathbf{x})\,\ln f_{\mathbf{X}}(\mathbf{x})\,d\mathbf{x}
(2)
 = -\langle\ln f_{\mathbf{X}}(\mathbf{x})\rangle,
(3)

where f_(X)(x) is the joint probability density function of X.

Thus, for example, the differential entropy of a multivariate Gaussian random vector X with covariance matrix P is

h(\mathbf{X}) = \frac{1}{2}\ln\left[(2\pi e)^n\,|\det(\mathbf{P})|\right]
(4)
 = \frac{1}{2}n\left[1 + \ln(2\pi)\right] + \frac{1}{2}\ln|\det(\mathbf{P})|.
(5)
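The equality of forms (4) and (5) can be checked numerically for a concrete covariance matrix; the matrix P below is an arbitrary positive-definite example.

```python
import numpy as np

# Check that forms (4) and (5) give the same differential entropy
# for an arbitrary 2x2 positive-definite covariance matrix P.
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])
n = P.shape[0]

h4 = 0.5 * np.log((2 * np.pi * np.e) ** n * abs(np.linalg.det(P)))
h5 = 0.5 * n * (1 + np.log(2 * np.pi)) + 0.5 * np.log(abs(np.linalg.det(P)))
print(h4, h5)
```

The second form simply expands ln[(2πe)^n] = n[1 + ln(2π)], so the two agree to machine precision.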

Additional properties of differential entropy include

 h(X + c) = h(X),
(6)

where c is a constant and

 h(aX) = h(X) + \ln|a|,
(7)

where a is a scaling factor and X is a scalar random variable. The above property can be generalized to the case of a random vector X premultiplied by a matrix A,

 h(\mathbf{A}\mathbf{X}) = h(\mathbf{X}) + \ln|\det(\mathbf{A})|,
(8)

where det(A) is the determinant of matrix A.
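For Gaussian variables, properties (7) and (8) can be verified in closed form: if X ~ N(0, 1) then aX ~ N(0, a²) with h = (1/2)ln(2πe·a²), and if X is Gaussian with covariance P then AX is Gaussian with covariance A P Aᵀ, where det(A P Aᵀ) = det(A)² det(P). The sketch below checks both; the value a = 3 and the matrices A and P are arbitrary choices.

```python
import math
import numpy as np

# Scalar case (property 7): X ~ N(0,1), so aX ~ N(0, a^2) and
# h(N(0, s^2)) = 0.5*ln(2*pi*e*s^2).
a = 3.0
h_X  = 0.5 * math.log(2 * math.pi * math.e)
h_aX = 0.5 * math.log(2 * math.pi * math.e * a**2)
print(h_aX - h_X, math.log(abs(a)))       # both equal ln 3

# Vector case (property 8): X Gaussian with covariance P, so AX is
# Gaussian with covariance A P A^T and det(A P A^T) = det(A)^2 det(P).
P = np.array([[2.0, 0.5], [0.5, 1.0]])
A = np.array([[1.0, 2.0], [0.0, 3.0]])
n = 2
h_vec_X  = 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(P))
h_vec_AX = 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(A @ P @ A.T))
print(h_vec_AX - h_vec_X, np.log(abs(np.linalg.det(A))))  # both equal ln|det(A)|
```

In both cases the entropy shift depends only on the (log of the) volume-scaling factor, not on the original distribution's entropy, which is exactly what (7) and (8) assert.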


See also

Entropy

Portions of this entry contributed by Marwan A. Mattar

Portions of this entry contributed by Matthew R. Rudary




Cite this as:

Mattar, Marwan A.; Rudary, Matthew R.; and Weisstein, Eric W. "Differential Entropy." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/DifferentialEntropy.html
