Markov property

From Wikipedia, the free encyclopedia

Informally, a stochastic process has the Markov property if the conditional probability distribution of future states of the process, given the present state and the past states, depends only upon the present state; that is, the future is conditionally independent of the past states (the path of the process) given the present state. A process with the Markov property is usually called a Markov process, and may be described as Markovian.

Mathematically, if X(t), t > 0, is a stochastic process, the Markov property states that

    $$ \Pr[X(t+h) = y \mid X(s) = x(s),\ \forall s \le t] = \Pr[X(t+h) = y \mid X(t) = x(t)], \quad \forall h > 0. $$
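
For illustration, a minimal simulation sketch in Python (the two weather states and the transition probabilities are arbitrary choices for the example): at each step the next state is drawn from a distribution that depends only on the current state, never on the earlier path.

    import random

    # Illustrative transition probabilities P[next state | current state].
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(current_state):
        """Draw the next state using only the current state."""
        states = list(TRANSITIONS[current_state])
        weights = [TRANSITIONS[current_state][s] for s in states]
        return random.choices(states, weights=weights)[0]

    def simulate(initial_state, n_steps):
        path = [initial_state]
        for _ in range(n_steps):
            path.append(step(path[-1]))  # only the current state is consulted
        return path

    print(simulate("sunny", 10))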

Markov processes are typically termed (time-)homogeneous if

    $$ \Pr[X(t+h) = y \mid X(t) = x(t)] = \Pr[X(h) = y \mid X(0) = x(0)], \quad \forall t, h > 0, $$

and otherwise are termed (time-)inhomogeneous (or (time-)nonhomogeneous). Homogeneous Markov processes, which are usually simpler than inhomogeneous ones, form the most important class of Markov processes.
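
A short sketch of the distinction, again with arbitrary states and probabilities: a homogeneous chain applies the same transition kernel at every time step, while an inhomogeneous chain uses a kernel that depends on the time t.

    import random

    def homogeneous_kernel(t, state):
        # The same transition probabilities at every time t.
        if state == "sunny":
            return {"sunny": 0.8, "rainy": 0.2}
        return {"sunny": 0.4, "rainy": 0.6}

    def inhomogeneous_kernel(t, state):
        # Transition probabilities that change with t, so the one-step
        # distribution out of a given state depends on when the step is taken.
        p_stay = 0.5 + 0.4 / (1 + t)
        other = "rainy" if state == "sunny" else "sunny"
        return {state: p_stay, other: 1.0 - p_stay}

    def simulate(kernel, initial_state, n_steps):
        path = [initial_state]
        for t in range(n_steps):
            probs = kernel(t, path[-1])
            states = list(probs)
            path.append(random.choices(states, weights=[probs[s] for s in states])[0])
        return path

    print(simulate(homogeneous_kernel, "sunny", 10))
    print(simulate(inhomogeneous_kernel, "sunny", 10))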

In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the concept of the 'current' and 'future' states. Let X be a non-Markovian process. Then we define a process Y, such that each state of Y represents a time-interval of states of X, i.e. mathematically

    $$ Y(t) = \{ X(s) : s \in [a(t), b(t)] \} $$

for suitable interval endpoints a(t) ≤ b(t).

If Y has the Markov property, then it is a Markovian representation of X. In this case, X is also called a second-order Markov process. Higher-order Markov processes are defined analogously.

An example of a non-Markovian process with a Markovian representation is a moving average time series.
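
A sketch of such a representation, with an arbitrary moving-average coefficient: the series X(t) = e(t) + 0.5 e(t-1), built from independent noise terms e, is not Markovian on its own, since predicting X(t+1) from X(t) alone would also require the unobserved e(t); the expanded state Y(t) = (X(t), e(t)) is Markovian.

    import random

    THETA = 0.5  # illustrative moving-average coefficient

    def next_y(y_current):
        """Advance the expanded process Y(t) = (X(t), e(t)) by one step,
        using only the current Y-state and fresh independent noise."""
        _x_curr, e_curr = y_current
        e_next = random.gauss(0.0, 1.0)
        x_next = e_next + THETA * e_curr  # X(t+1) = e(t+1) + THETA * e(t)
        return (x_next, e_next)

    def simulate(n_steps):
        e0 = random.gauss(0.0, 1.0)
        y = (e0, e0)  # boundary convention: X(0) = e(0)
        path = [y]
        for _ in range(n_steps):
            y = next_y(y)
            path.append(y)
        return path

    for x, e in simulate(5):
        print(round(x, 3), round(e, 3))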

The most famous Markov processes are Markov chains, but many other processes, including Brownian motion, are Markovian.
