February 26, 2016

Abstract: A Markov chain is a stochastic process where we transition from one state to another state using a simple sequential procedure. We start a Markov chain… Read more
posted @ 2016-02-26 21:14 chaseblack
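
The summary above describes a Markov chain as a sequential procedure that moves from state to state. As a rough illustration only (not code from the post itself), the following minimal Python sketch simulates a hypothetical two-state chain driven by an assumed transition matrix `P`, where row i gives the probabilities of the next state given the current state i:

```python
# Minimal sketch (assumed example): simulating a two-state Markov chain
# by repeatedly sampling the next state from the current state's row of P.
import numpy as np

# Hypothetical transition matrix: P[i, j] = P(next state = j | current state = i).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

rng = np.random.default_rng(0)
state = 0                      # arbitrary starting state
chain = [state]
for _ in range(1000):
    # Draw the next state using the transition probabilities for the current state.
    state = rng.choice(2, p=P[state])
    chain.append(state)

# Empirical state frequencies; for a long run these approach the stationary distribution.
print(np.bincount(chain) / len(chain))
```

Running the chain longer makes the empirical frequencies settle toward the stationary distribution of `P`, which is the behaviour the sequential transition procedure is meant to exhibit.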
Abstract: The application of probabilistic models to data often leads to inference problems that require the integration of complex, high dimensional distributions… Read more
posted @ 2016-02-26 18:19 chaseblack
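
This second summary points at the problem MCMC methods address: integrals over complex, high-dimensional distributions that cannot be computed in closed form. As a hedged sketch (my own example, with an assumed unnormalized 1-D target density, not anything taken from the post), a random-walk Metropolis sampler can approximate such an expectation from samples alone:

```python
# Minimal sketch (assumed example): estimating E[x] under an unnormalized
# target density with a random-walk Metropolis sampler.
import numpy as np

def unnorm_target(x):
    # Hypothetical unnormalized density: a mixture of two Gaussian bumps.
    return np.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x + 2.0) ** 2)

rng = np.random.default_rng(1)
x = 0.0
samples = []
for _ in range(20000):
    proposal = x + rng.normal(scale=1.0)           # symmetric random-walk proposal
    accept_prob = min(1.0, unnorm_target(proposal) / unnorm_target(x))
    if rng.random() < accept_prob:                 # Metropolis accept/reject step
        x = proposal
    samples.append(x)

samples = np.array(samples[5000:])                 # discard burn-in samples
print("E[x] estimate:", samples.mean())            # Monte Carlo estimate of the integral
```

The point of the sketch is that the awkward normalizing integral never has to be evaluated: only ratios of the unnormalized density appear, and the sample average stands in for the high-dimensional integration the abstract mentions.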
