Simulation and modeling - Old Questions

1.  Define and describe Markov chain in detail with the help of suitable examples. Also describe at least three areas of application of Markov chain.

10 marks | Asked in 2073

If the future states of a process are independent of the past and depend only on the present, the process is called a Markov process. A discrete-state Markov process is called a Markov chain. In other words, a Markov chain is a random process in which the next state depends only on the current state, not on the sequence of states that came before it.

Markov chains are used to analyze trends and predict the future (weather, stock market, genetics, product success, etc.).

Formally,

A Markov chain is a sequence of random variables X1, X2, X3, ... with the Markov property, namely that, given the present state, the future and past states are independent:

P(X_{n+1} = i_{n+1} | X_n = i_n, X_{n-1} = i_{n-1}, ..., X_1 = i_1) = P(X_{n+1} = i_{n+1} | X_n = i_n)

This conditional probability gives us the probability that a process in state i_n at time n moves to state i_{n+1} at time n + 1. We call this the transition probability for the Markov chain. If the transition probability does not depend on the time n, we have a stationary Markov chain, with transition probabilities

p_{ij} = P(X_{n+1} = j | X_n = i)

Now we can write down the whole Markov chain as a matrix P whose (i, j) entry is the transition probability p_{ij}:

P = [p_{ij}]
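To make the definition concrete, here is a minimal Python sketch (using NumPy; the function name simulate_chain and the two-state example matrix are illustrative, not part of the original answer) showing how a transition matrix drives a simulation, where each step depends only on the current state:

```python
import numpy as np

def simulate_chain(P, start, steps, rng=None):
    """Simulate a discrete-time Markov chain.

    P     : square transition matrix; P[i][j] = probability of moving
            from state i to state j (each row must sum to 1)
    start : index of the initial state
    steps : number of transitions to simulate
    """
    rng = rng or np.random.default_rng()
    P = np.asarray(P)
    states = [start]
    for _ in range(steps):
        current = states[-1]
        # the next state is drawn using only the current state's row of P
        nxt = rng.choice(len(P), p=P[current])
        states.append(nxt)
    return states

# Example: state 0 = rain, state 1 = no rain (the weather chain used below)
P = [[0.4, 0.6],
     [0.2, 0.8]]
print(simulate_chain(P, start=1, steps=10))
```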

Example:

Weather Problem:

  • raining today ⇒ 40% rain tomorrow, 60% no rain tomorrow

  • not raining today ⇒ 20% rain tomorrow, 80% no rain tomorrow

If it is not raining today, what is the probability that it will not rain the day after tomorrow?

Markov chain diagram: two states, Rain and No Rain, with arrows Rain → Rain (0.4), Rain → No Rain (0.6), No Rain → Rain (0.2), No Rain → No Rain (0.8).

Transition matrix (rows: today's state, columns: tomorrow's state, in the order rain, no rain):

P = | 0.4  0.6 |
    | 0.2  0.8 |

Starting from "no rain" today and summing over the two possible states tomorrow:

P(no rain the day after tomorrow | no rain today) = 0.2 × 0.6 + 0.8 × 0.8 = 0.12 + 0.64 = 0.76

Thus, the probability that it will not rain the day after tomorrow is 0.76.
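The same arithmetic can be checked by squaring the transition matrix, since the two-step transition probabilities are the entries of P². A short sketch using NumPy (the rain / no-rain state ordering is carried over from the matrix above):

```python
import numpy as np

# Transition matrix from the weather problem; state 0 = rain, 1 = no rain
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Two-step transition probabilities are the entries of P squared
P2 = P @ P

# Probability of "no rain" two days from now, given "no rain" today
print(P2[1, 1])   # 0.76
```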

Applications of Markov Chains

1. Physics: Markovian systems appear extensively in thermodynamics and statistical mechanics, whenever probabilities are used to represent unknown or unmodelled details of the system, provided the dynamics can be assumed to be time-invariant and no relevant history needs to be considered that is not already captured in the state description.

2. Internet applications: The PageRank of a webpage, as used by Google, is defined by a Markov chain. It is the probability of being at page i in the stationary distribution of a Markov chain over all (known) web pages.
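PageRank itself adds a damping factor and runs on the full web graph; as a simpler illustration of what "stationary distribution" means, here is a power-iteration sketch in Python (the function name, iteration count, and the reuse of the small weather chain are illustrative assumptions, not Google's method):

```python
import numpy as np

def stationary_distribution(P, iters=1000):
    """Approximate the stationary distribution of a Markov chain by
    power iteration: repeatedly apply the transition matrix to an
    initial distribution until it stops changing."""
    P = np.asarray(P)
    pi = np.full(len(P), 1.0 / len(P))   # start from the uniform distribution
    for _ in range(iters):
        pi = pi @ P
    return pi

# Example: the weather chain from above (not a real web graph)
print(stationary_distribution([[0.4, 0.6],
                               [0.2, 0.8]]))   # ~[0.25, 0.75]
```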

3. Statistics: Markov chain methods have also become very important for generating sequences of random numbers that accurately reflect very complicated desired probability distributions, via a process called Markov chain Monte Carlo (MCMC), among many other application areas.
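As a concrete illustration of the MCMC idea, below is a minimal random-walk Metropolis sketch in Python: a Markov chain whose stationary distribution is the desired target density. The target (a standard normal), step size, and function names are illustrative choices, not part of the original answer.

```python
import numpy as np

def metropolis(log_target, start, steps, step_size=1.0, rng=None):
    """Tiny random-walk Metropolis sampler: the chain's stationary
    distribution is the target density, supplied via its log."""
    rng = rng or np.random.default_rng()
    x = start
    samples = []
    for _ in range(steps):
        proposal = x + rng.normal(scale=step_size)
        # accept with probability min(1, target(proposal) / target(x))
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Example: sample from a standard normal distribution (log density up to a constant)
draws = metropolis(lambda x: -0.5 * x**2, start=0.0, steps=5000)
print(draws.mean(), draws.std())   # roughly 0 and 1
```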