Abstract: Entropy, relative entropy and mutual information. Entropy $$ H(X) = -\sum_{x} p(x) \log p(x). $$ Entropy is nonnegative, and it attains its minimum value 0 if and only if $X$ is deterministic, i.e. $P(X=x_0)=1$ for some $x_0$. Proof: since $0 \le p(x) \le 1$, we have $\log p(x) \le 0$, so each term $-p(x)\log p(x) \ge 0$ and hence $H(X) \ge 0$; the sum equals 0 exactly when every term vanishes, which forces each $p(x)$ to be 0 or 1.
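A minimal sketch of the entropy formula above in Python (not from the original post; the function name and the convention $0 \log 0 = 0$ are my own choices, with the natural logarithm as the base):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log p(x), with 0 * log 0 taken as 0."""
    return -sum(px * math.log(px) for px in p if px > 0)

# A deterministic distribution attains the minimum, H(X) = 0.
deterministic = [1.0, 0.0, 0.0]
print(entropy(deterministic))

# Any non-degenerate distribution gives H(X) > 0, e.g. a fair coin has H = log 2.
fair_coin = [0.5, 0.5]
print(entropy(fair_coin))
```

Skipping the terms with $p(x) = 0$ in the sum implements the usual continuity convention, since $t \log t \to 0$ as $t \to 0^+$.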
posted @ 2020-10-22 20:55 馒头and花卷