Markov Chain

A Markov chain is a stochastic process that has the Markov property: the conditional probability of the system's next state depends only on its current state, not on any of the states that came before it.

 

A Markov chain is a sequence of random variables X1, X2, X3, ... with the Markov property, namely that, given the present state, the future and past states are independent. Formally,

\Pr(X_{n+1}=x \mid X_1=x_1, X_2=x_2, \ldots, X_n=x_n) = \Pr(X_{n+1}=x \mid X_n=x_n).

The possible values of X_i form a countable set S called the state space of the chain.
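
To make the definition concrete, here is a minimal Python sketch (the two-state chain and its transition probabilities are made up purely for illustration) in which the next state is sampled from a distribution that depends only on the current state:

```python
import random

# Hypothetical state space S = {"A", "B"} and row-stochastic transition
# probabilities: P[s][t] = Pr(X_{n+1} = t | X_n = s).
P = {
    "A": {"A": 0.5, "B": 0.5},
    "B": {"A": 0.2, "B": 0.8},
}

def step(state):
    """Sample X_{n+1} given only X_n = state (the Markov property)."""
    nxt = list(P[state].keys())
    weights = list(P[state].values())
    return random.choices(nxt, weights=weights)[0]

# Simulate X_1, ..., X_10 starting from X_1 = "A"; each step looks only
# at the current state, never at the earlier history.
x = "A"
chain = [x]
for _ in range(9):
    x = step(x)
    chain.append(x)
print(chain)
```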

A typical example of a Markov chain:

Consider the dietary habits of a creature that eats only grapes, cheese, or lettuce, and whose diet follows these rules (a small simulation sketch follows the list):

  • It eats exactly once a day.
  • If it ate cheese today, tomorrow it will eat lettuce or grapes with equal probability.
  • If it ate grapes today, tomorrow it will eat grapes with probability 1/10, cheese with probability 4/10 and lettuce with probability 5/10.
  • If it ate lettuce today, it will not eat lettuce again tomorrow but will eat grapes with probability 4/10 or cheese with probability 6/10.

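These rules define a 3x3 transition matrix over the state space {grapes, cheese, lettuce}. Below is a simulation sketch in Python (the state ordering and the number of simulated days are my own choices, not part of the original example) that encodes the matrix and estimates how often each food is eaten in the long run; since the chain is irreducible and the grapes state has a self-loop, the empirical frequencies should settle near the chain's stationary distribution.

```python
import random

# State order (my choice): grapes, cheese, lettuce.
foods = ["grapes", "cheese", "lettuce"]

# Transition matrix taken directly from the rules above:
# P[i][j] = probability of eating foods[j] tomorrow given foods[i] today.
P = [
    [0.1, 0.4, 0.5],  # grapes  -> grapes 1/10, cheese 4/10, lettuce 5/10
    [0.5, 0.0, 0.5],  # cheese  -> lettuce or grapes with equal probability
    [0.4, 0.6, 0.0],  # lettuce -> never lettuce again; grapes 4/10, cheese 6/10
]

def next_food(today):
    """Tomorrow's choice depends only on today's meal (the Markov property)."""
    return random.choices(range(3), weights=P[today])[0]

# Simulate many days and record the fraction spent on each food.
days = 100_000
counts = [0, 0, 0]
state = foods.index("cheese")  # arbitrary first meal
for _ in range(days):
    state = next_food(state)
    counts[state] += 1

for food, c in zip(foods, counts):
    print(f"{food}: {c / days:.3f}")
```
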
In fact, this is much like the game of Monopoly (大富翁) that we played as kids.

to be continued...


posted @ 2012-10-07 16:57  SouthIsland