Markov chain

A Markov chain is a stochastic process that evolves over time through a sequence of random states, where the probability of moving to the next state depends only on the current state, not on the past history. This property is called the Markov property; in particular, every Markov chain is a Markov process.

A Markov chain is given by its transition probabilities:

$$p_{ij} = \Pr(X_{n+1} = s_j \mid X_n = s_i), \qquad \sum_j p_{ij} = 1.$$
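As a minimal sketch of these definitions, the following uses a hypothetical two-state weather chain (the states and probabilities are illustrative, not from the source). Each row of the transition table gives the probabilities of the next state given the current one, and must sum to 1:

```python
import random

# Hypothetical two-state weather chain: the numbers are illustrative.
# P[current][next] = Pr(X_{n+1} = next | X_n = current); each row sums to 1.
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(current, rng=random.random):
    """Sample the next state from the row of P for the current state."""
    r = rng()
    cumulative = 0.0
    for nxt, p in P[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

# Rows sum to 1, as the definition requires.
assert all(abs(sum(row.values()) - 1.0) < 1e-12 for row in P.values())
```

Calling `step` repeatedly produces a random trajectory of states; because `step` looks only at `current`, the simulation has the Markov property by construction.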

Key Ideas

The Markov property states that

$$\Pr(X_{n+1} = s_j \mid X_n, X_{n-1}, \dots, X_0) = \Pr(X_{n+1} = s_j \mid X_n).$$

Collecting the transition probabilities $p_{ij}$ into a matrix $P$, the distribution over states evolves as

$$\pi^{(n)} = \pi^{(0)} P^n,$$

where $\pi^{(0)}$ is the initial distribution.
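The evolution $\pi^{(n)} = \pi^{(0)} P^n$ can be computed by repeatedly multiplying the row vector $\pi$ by $P$. A sketch in plain Python, using the same illustrative two-state matrix as above (the matrix is an assumption, not from the source):

```python
def mat_vec(pi, P):
    """Row vector times matrix: (pi P)_j = sum_i pi_i * P[i][j]."""
    return [sum(pi[i] * P[i][j] for i in range(len(pi)))
            for j in range(len(P[0]))]

# Illustrative two-state transition matrix (rows sum to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi0 = [1.0, 0.0]  # start in state 0 with certainty

pi = pi0
for _ in range(50):  # pi^(n) = pi^(0) P^n, computed one step at a time
    pi = mat_vec(pi, P)

# For this particular matrix the distribution approaches the
# stationary vector (5/6, 1/6) as n grows.
print(pi)
```

Iterating step by step instead of forming $P^n$ explicitly keeps the sketch short; for large $n$ or large state spaces, a matrix-power routine would be the usual choice.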

Examples