
Kolmogorov Entropy


Kolmogorov entropy, also known as metric entropy, Kolmogorov-Sinai entropy, or KS entropy, is defined as follows. Divide phase space into $D$-dimensional hypercubes of content $\epsilon^D$. Let $P_{i_0,\ldots,i_n}$ be the probability that a trajectory is in hypercube $i_0$ at $t=0$, $i_1$ at $t=T$, $i_2$ at $t=2T$, and so on. Then define

$$K_n = -\sum_{i_0,\ldots,i_n} P_{i_0,\ldots,i_n} \ln P_{i_0,\ldots,i_n}, \tag{1}$$
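As a finite-resolution illustration of (1), the sketch below estimates $K_n$ for the logistic map $x \mapsto 4x(1-x)$, coarse-grained into just two cells split at $x = 0.5$ (a stand-in for the $\epsilon$-hypercubes; the map, partition, and sample size are illustrative choices, not part of the definition):

```python
import math
from collections import Counter

def coarse_orbit(n_steps, x0=0.1234, r=4.0):
    """Symbolic trajectory of the logistic map: cell 0 if x < 0.5, else cell 1."""
    x, cells = x0, []
    for _ in range(n_steps):
        cells.append(0 if x < 0.5 else 1)
        x = r * x * (1.0 - x)
    return cells

def K(cells, n):
    """K_n = -sum P(i_0,...,i_n) ln P(i_0,...,i_n), estimated from the
    observed frequencies of length-(n+1) blocks of cell labels."""
    length = n + 1
    counts = Counter(tuple(cells[i:i + length])
                     for i in range(len(cells) - length + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log(c / total) for c in counts.values())

cells = coarse_orbit(200_000)
for n in range(4):
    print(f"K_{n} = {K(cells, n):.4f}")
```

For this map and partition the cell labels behave statistically like fair coin flips, so $K_n \approx (n+1)\ln 2$: each additional time step costs about $\ln 2$ nats of information.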

where $K_{n+1}-K_n$ is the additional information needed to predict which hypercube the trajectory will be in at time $(n+1)T$, given the trajectory up to time $nT$. The Kolmogorov entropy is then defined by

$$K = \lim_{T\to 0}\,\lim_{\epsilon\to 0^+}\,\lim_{N\to\infty} \frac{1}{NT} \sum_{n=0}^{N-1} \left(K_{n+1}-K_n\right). \tag{2}$$
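None of the three limits in (2) can be taken literally on a computer, but for a fixed partition and time step the successive differences $K_{n+1}-K_n$ already plateau at the entropy rate for moderate $n$. A minimal sketch, again using a two-cell coarse-graining of the logistic map (an illustrative choice):

```python
import math
from collections import Counter

def block_entropy(cells, length):
    """-sum p ln p over observed blocks of `length` consecutive cell labels."""
    counts = Counter(tuple(cells[i:i + length])
                     for i in range(len(cells) - length + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log(c / total) for c in counts.values())

# Two-cell coarse-graining of x -> 4x(1-x); T is one map step.
x, cells = 0.1234, []
for _ in range(200_000):
    cells.append(0 if x < 0.5 else 1)
    x = 4.0 * x * (1.0 - x)

# Successive differences K_{n+1} - K_n approach the entropy rate.
diffs = [block_entropy(cells, n + 2) - block_entropy(cells, n + 1)
         for n in range(6)]
print([f"{d:.4f}" for d in diffs])
```

Here the plateau sits at $\ln 2 \approx 0.6931$ nats per step. In general such estimates degrade once the number of distinct blocks becomes comparable to the sample size, which is why the $N\to\infty$ limit matters.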

The Kolmogorov entropy is related to Lyapunov characteristic exponents by

$$h_K = \int_P \sum_{\sigma_i > 0} \sigma_i \, d\mu. \tag{3}$$
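For an ergodic invariant measure, the integral in (3) reduces to the sum of the positive Lyapunov exponents. The sketch below checks this for the logistic map at $r=4$, whose single Lyapunov exponent and metric entropy both equal $\ln 2$ (the starting point and iteration counts are illustrative choices):

```python
import math

def lyapunov_logistic(r=4.0, x0=0.1234, n_steps=200_000, transient=1_000):
    """Time average of ln|f'(x)| along an orbit of f(x) = r x (1 - x);
    for an ergodic measure this converges to the Lyapunov exponent sigma."""
    x = x0
    for _ in range(transient):          # discard the initial transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_steps):
        total += math.log(abs(r - 2.0 * r * x))   # f'(x) = r - 2 r x
        x = r * x * (1.0 - x)
    return total / n_steps

sigma = lyapunov_logistic()
print(sigma)   # for r = 4 the exact value is ln 2
```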

The Kolmogorov entropy is 0 for nonchaotic motion and positive for chaotic motion.
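This dichotomy can be observed numerically through (3): the sign of the largest Lyapunov exponent decides whether $h_K$ vanishes. A quick check with the logistic map, where the parameter values 3.2 (stable period-2 orbit) and 4 (chaos) are illustrative choices:

```python
import math

def lyapunov(r, x0=0.1234, n_steps=100_000, transient=1_000):
    """Lyapunov exponent of x -> r x (1 - x), as the time average of ln|f'(x)|."""
    x = x0
    for _ in range(transient):          # settle onto the attractor first
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_steps):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n_steps

print(lyapunov(3.2))  # negative: nonchaotic motion, h_K = 0
print(lyapunov(4.0))  # positive: chaotic motion, h_K > 0
```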


See also

Hypercube, Lyapunov Characteristic Exponent




Cite this as:

Weisstein, Eric W. "Kolmogorov Entropy." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/KolmogorovEntropy.html
