Markov chain
A Markov chain is a stochastic process that evolves over time through a sequence of random states, where the probability of moving to the next state depends only on the current state, not on the past history. This property is called the Markov property; in particular, every Markov chain is a Markov process.
A Markov chain is given by:
- A state space $S$ (finite or countable).
- A transition probability matrix $P = (p_{ij})$, where $p_{ij} = \Pr(X_{n+1} = j \mid X_n = i)$, with $p_{ij} \ge 0$ and $\sum_{j \in S} p_{ij} = 1$ for every $i \in S$ (see the sketch after this list).
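As a minimal sketch of this definition, a two-state chain can be encoded as a matrix whose rows sum to 1. The state labels and probabilities below are made up for illustration and are not taken from the text.

```python
import numpy as np

# States of a hypothetical two-state chain (labels chosen for illustration).
states = ["A", "B"]

# Transition matrix P: entry P[i, j] = Pr(next state = j | current state = i).
# Each row sums to 1.
P = np.array([
    [0.7, 0.3],   # from state A
    [0.4, 0.6],   # from state B
])

assert np.allclose(P.sum(axis=1), 1.0), "each row of P must sum to 1"
```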
Key Ideas
- Markov property: $\Pr(X_{n+1} = j \mid X_n = i_n, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = \Pr(X_{n+1} = j \mid X_n = i_n)$.
- The distribution of states after $n$ steps is given by matrix powers: $\pi_n = \pi_0 P^n$, where $\pi_0$ is the initial distribution (written as a row vector) and $P^n$ is the $n$-th power of the transition matrix (see the sketch after this list).
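A short sketch of the $n$-step formula $\pi_n = \pi_0 P^n$, reusing the hypothetical matrix from the previous sketch; the initial distribution and number of steps are arbitrary choices for illustration.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

pi_0 = np.array([1.0, 0.0])   # start in state 0 with probability 1
n = 10

# pi_n = pi_0 P^n : distribution over states after n steps.
pi_n = pi_0 @ np.linalg.matrix_power(P, n)
print(pi_n)
```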
Examples
- Random walk on integers: move left or right with given probabilities.
- Weather model: states = {sunny, rainy}; transitions encode probabilities of changing weather.
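A brief simulation of the weather example above, with made-up transition probabilities (not from the text); each step samples the next state from the row of the current state.

```python
import random

states = ["sunny", "rainy"]

# Hypothetical transition probabilities: P[today][tomorrow].
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def simulate(start, n_steps, seed=0):
    """Sample a trajectory of the weather chain for n_steps transitions."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        current = path[-1]
        weights = [P[current][s] for s in states]
        path.append(rng.choices(states, weights=weights)[0])
    return path

print(simulate("sunny", 7))
```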