Abstract:
A Markov chain is a stochastic process in which we transition from one state to another using a simple sequential procedure. We start a Markov chain...
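As a quick illustration of this sequential procedure, here is a minimal Python sketch (not part of the original post) that simulates a discrete-state Markov chain. The function name simulate_markov_chain, the transition matrix P, and the two-state weather example are all illustrative assumptions.

    import numpy as np

    def simulate_markov_chain(P, start_state, n_steps, rng=None):
        """Simulate a chain where P[i, j] is the probability of moving from state i to state j."""
        rng = np.random.default_rng() if rng is None else rng
        states = [start_state]
        for _ in range(n_steps):
            current = states[-1]
            # Draw the next state from the row of P for the current state.
            next_state = rng.choice(len(P), p=P[current])
            states.append(int(next_state))
        return states

    # Hypothetical two-state example (0 = "rainy", 1 = "sunny").
    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
    print(simulate_markov_chain(P, start_state=0, n_steps=10))

Each new state depends only on the current state, which is the defining (Markov) property of such a chain.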
posted @ 2016-02-26 21:14 chaseblack
