Summary:
Kullback–Leibler divergence (From Wikipedia, the free encyclopedia; redirected from Relative entropy). In probability theory and information theory, the Kullback–Leibler divergence [1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference betwe… Read more
posted @ 2011-12-08 14:48 COS · Views (1856) · Comments (0) · Recommends (0)
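The excerpt defines the KL divergence as a non-symmetric measure of the difference between two distributions. A minimal sketch of the discrete form, D(P ‖ Q) = Σᵢ pᵢ log(pᵢ/qᵢ), written for this summary rather than taken from the post, also shows the non-symmetry numerically:

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(P || Q) = sum_i p_i * log(p_i / q_i).
    Terms with p_i == 0 contribute 0; q_i must be > 0 wherever p_i > 0.
    (Illustrative helper, not from the original post.)"""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
d_pq = kl_divergence(p, q)  # ~0.5108 nats
d_qp = kl_divergence(q, p)  # ~0.3681 nats
print(d_pq, d_qp)  # the two directions differ: KL is not symmetric
```

Because D(P ‖ Q) ≠ D(Q ‖ P) in general, the KL divergence is not a metric, which is exactly the "non-symmetric" caveat in the excerpt.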
Summary:
Dirichlet distribution: a deeper understanding (From Wikipedia, the free encyclopedia). Several images of the probability density of the Dirichlet distribution when K = 3 for various parameter vectors α. Clockwise from top left: α = (6, 2, 2), (3, 7, 5), (6, 2, 6), (2, 3, 4). In probability and statistics, the Dirichlet distribu… Read more
posted @ 2011-12-08 14:13 COS · Views (892) · Comments (0) · Recommends (1)
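The excerpt describes Dirichlet densities on the K = 3 simplex for several parameter vectors α. A small sketch of the standard gamma construction for sampling (draw gᵢ ~ Gamma(αᵢ, 1) and normalize), written as an illustration rather than taken from the post, uses one of the α vectors from the caption:

```python
import random

def dirichlet_sample(alpha, rng=random):
    """Draw one sample from Dirichlet(alpha) via the gamma construction:
    g_i ~ Gamma(alpha_i, 1), then x_i = g_i / sum(g).
    (Illustrative helper, not from the original post.)"""
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [gi / total for gi in g]

random.seed(0)
x = dirichlet_sample((6, 2, 2))  # one of the parameter vectors from the caption
print(x, sum(x))  # components are positive and sum to 1
```

Each sample lies on the probability simplex, which is why the densities in the figure are drawn over a triangle for K = 3.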