Simulation and Modelling - Old Questions

13.  Define and describe Markov Chain with example.

5 marks | Asked in 2076

If the future state of a process depends only on the present state and is independent of the past, the process is called a Markov process. A Markov process with a discrete state space is called a Markov chain. In other words, a Markov chain is a random process with the property that the next state depends only on the current state.

Markov chains are used to analyze trends and predict future behaviour (weather, stock markets, genetics, product success, etc.).

Formally,

A Markov chain is a sequence of random variables X1, X2, X3, ... with the Markov property, namely that, given the present state, the future and past states are independent:

P(X_{n+1} = i_{n+1} | X_n = i_n, X_{n-1} = i_{n-1}, ..., X_1 = i_1) = P(X_{n+1} = i_{n+1} | X_n = i_n)

This conditional probability gives us the probability that a process in state i_n at time n moves to state i_{n+1} at time n + 1. We call this the transition probability for the Markov chain. If the transition probability does not depend on the time n, we have a stationary Markov chain, with transition probabilities

p_{ij} = P(X_{n+1} = j | X_n = i)

Now we can write down the whole Markov chain as a matrix P whose (i, j) entry is p_{ij}; each row of P sums to 1.
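As an illustration, here is a minimal Python sketch of a stationary Markov chain represented by its transition matrix. The names `next_state` and `simulate` are our own, and the numbers anticipate the weather example below; this is a sketch, not part of the original answer.

```python
import random

# Transition matrix P: P[i][j] = probability of moving from state i
# to state j. Rows must sum to 1. (Values match the weather example:
# state 0 = Rain, state 1 = No rain.)
P = [
    [0.4, 0.6],  # from Rain
    [0.2, 0.8],  # from No rain
]

def next_state(current, P):
    """Sample the next state; by the Markov property only the
    current state matters."""
    r = random.random()
    cumulative = 0.0
    for j, p in enumerate(P[current]):
        cumulative += p
        if r < cumulative:
            return j
    return len(P) - 1  # guard against floating-point rounding

def simulate(start, steps, P):
    """Generate a trajectory X0, X1, ..., X_steps of the chain."""
    states = [start]
    for _ in range(steps):
        states.append(next_state(states[-1], P))
    return states

print(simulate(start=1, steps=10, P=P))
```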

Example:

Weather Problem:

  • raining today ⇒ 40% rain tomorrow, 60% no rain tomorrow

  • not raining today ⇒ 20% rain tomorrow, 80% no rain tomorrow

If it is not raining today, what is the probability that it will not rain the day after tomorrow?

Markov chain diagram: two states, Rain and No rain, with self-loops of probability 0.4 (Rain) and 0.8 (No rain), and transitions Rain → No rain with probability 0.6 and No rain → Rain with probability 0.2.

Transition matrix (rows and columns ordered Rain, No rain):

P = | 0.4  0.6 |
    | 0.2  0.8 |

The two-step transition probabilities are given by P². Starting from "No rain", the probability of no rain two days later is

P²(No rain, No rain) = 0.2 × 0.6 + 0.8 × 0.8 = 0.12 + 0.64 = 0.76

Thus, the probability that it will not rain the day after tomorrow is 0.76.
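As a quick check, a minimal NumPy sketch (variable names are our own) that squares the transition matrix and reads off the required entry:

```python
import numpy as np

# Transition matrix, states ordered [Rain, No rain]
P = np.array([
    [0.4, 0.6],  # raining today
    [0.2, 0.8],  # not raining today
])

# Two-step transition probabilities
P2 = P @ P

# Probability of (No rain -> No rain) in two steps
print(P2[1, 1])  # 0.76
```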