Markov Process


A random process whose future probabilities are determined by its most recent value. A stochastic process x(t) is called Markov if, for every n and every t_1<t_2<...<t_n, we have

 P(x(t_n)<=x_n | x(t_(n-1)), ..., x(t_1))
   = P(x(t_n)<=x_n | x(t_(n-1))).

This is equivalent to

 P(x(t_n)<=x_n | x(t) for all t<=t_(n-1))
   = P(x(t_n)<=x_n | x(t_(n-1)))

(Papoulis 1984, p. 535).
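As a concrete illustration not drawn from the references above, the following sketch simulates a discrete-time, two-state special case (a Markov chain) with hypothetical transition probabilities, and checks empirically that conditioning on the two most recent states gives approximately the same next-step distribution as conditioning on the most recent state alone, as the defining property requires.

```python
import random

# Hypothetical transition probabilities for a two-state chain:
# from state 0: P(next=0)=0.9, P(next=1)=0.1
# from state 1: P(next=0)=0.4, P(next=1)=0.6
P = {0: [0.9, 0.1],
     1: [0.4, 0.6]}

def step(state, rng):
    """Draw the next state given the current one."""
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(0)
chain = [0]
for _ in range(200_000):
    chain.append(step(chain[-1], rng))

def cond_prob_next_is_1(history):
    """Empirical P(next state = 1 | the last len(history) states were `history`)."""
    h = len(history)
    hits = [chain[i + h] for i in range(len(chain) - h)
            if tuple(chain[i:i + h]) == history]
    return sum(hits) / len(hits)

p_given_1  = cond_prob_next_is_1((1,))     # condition on most recent state only
p_given_01 = cond_prob_next_is_1((0, 1))   # condition on two most recent states
p_given_11 = cond_prob_next_is_1((1, 1))

# All three estimates agree up to sampling noise (the true value is 0.6 by
# construction): the extra past gives no additional predictive information.
print(p_given_1, p_given_01, p_given_11)
```

The same check would fail for a non-Markov process, e.g. one whose next step depends on the two previous states.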


See also

Doob's Theorem

References

Bharucha-Reid, A. T. Elements of the Theory of Markov Processes and Their Applications. New York: McGraw-Hill, 1960.

Papoulis, A. "Brownian Movement and Markoff Processes." Ch. 15 in Probability, Random Variables, and Stochastic Processes, 2nd ed. New York: McGraw-Hill, pp. 515-553, 1984.

Cite this as:

Weisstein, Eric W. "Markov Process." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/MarkovProcess.html