Bayes for Beginners: Probability and Likelihood (well worth a read; very useful)



Probability refers to the chance of an event when the parameters are held fixed. It must lie in [0, 1], and the probabilities of mutually exclusive events covering all outcomes sum to 1. The familiar probability mass/density plots of the Poisson, binomial, and normal distributions describe exactly this.
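As a quick check of the claim above, here is a minimal sketch (parameter values are my own choice for illustration) that builds the binomial probability mass function for fixed parameters and verifies that every value lies in [0, 1] and that the probabilities of the mutually exclusive outcomes sum to 1:

```python
from math import comb

# Binomial distribution with fixed parameters: n = 10 tosses, p = 0.3.
# P(X = k) = C(n, k) * p^k * (1 - p)^(n - k) for each outcome k.
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

# Each probability is in [0, 1] ...
assert all(0.0 <= v <= 1.0 for v in pmf)
# ... and the mutually exclusive outcomes sum to 1 (up to float rounding).
assert abs(sum(pmf) - 1.0) < 1e-12
```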





Likelihood function

Consider a simple statistical model of a coin flip, with a single parameter p_\text{H} that expresses the "fairness" of the coin. This parameter is the probability that a given coin lands heads up ("H") when tossed. p_\text{H} can take on any numeric value within the range 0.0 to 1.0. For a perfectly fair coin, p_\text{H} = 0.5.

Imagine flipping a coin twice and observing the following data: two heads in two tosses ("HH"). Assuming that the successive coin flips are independent and identically distributed (IID), the probability of observing HH is

P(\text{HH} \mid p_\text{H} = 0.5) = 0.5^2 = 0.25.

Hence, given the observed data HH, the likelihood that the model parameter p_\text{H} equals 0.5 is 0.25. Mathematically, this is written as

\mathcal{L}(p_\text{H} = 0.5 \mid \text{HH}) = 0.25.

This is not the same as saying that the probability that p_\text{H} = 0.5, given the observation HH, is 0.25. (For that, we could apply Bayes' theorem, which implies that the posterior probability is proportional to the likelihood times the prior probability.)
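The distinction above can be made concrete with a small sketch (the function name and data encoding are my own): a likelihood function takes the parameter as its argument and holds the observed data fixed, which is the reverse of how we usually read a probability.

```python
def likelihood(p_H, data):
    """Likelihood of heads-probability p_H given a sequence of coin flips,
    assuming IID tosses: the product of per-toss probabilities."""
    prob = 1.0
    for flip in data:
        prob *= p_H if flip == "H" else 1.0 - p_H
    return prob

# Same data HH, two candidate parameter values:
assert likelihood(0.5, "HH") == 0.25
assert abs(likelihood(0.3, "HH") - 0.09) < 1e-12
```

Note that `likelihood(0.5, "HH") == 0.25` says nothing about the probability that p_\text{H} = 0.5; turning likelihoods into a posterior over p_\text{H} additionally requires a prior, via Bayes' theorem.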

Suppose that the coin is not a fair coin, but instead has p_\text{H} = 0.3. Then the probability of getting two heads is

P(\text{HH} \mid p_\text{H} = 0.3) = 0.3^2 = 0.09,

and hence the likelihood that p_\text{H} equals 0.3, given the observation HH, is

\mathcal{L}(p_\text{H} = 0.3 \mid \text{HH}) = 0.09.

More generally, for each value of p_\text{H}, we can calculate the corresponding likelihood. The result of such calculations is displayed in Figure 1.

In Figure 1, the integral of the likelihood over the interval [0, 1] is 1/3. That illustrates an important aspect of likelihoods: likelihoods do not have to integrate (or sum) to 1, unlike probabilities.
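The 1/3 figure can be verified directly: for the data HH, the likelihood curve is \mathcal{L}(p \mid \text{HH}) = p^2, and its integral over [0, 1] is \int_0^1 p^2 \, dp = 1/3. A minimal numerical check (trapezoidal rule; grid size is my own choice):

```python
# Likelihood of p_H given HH, as a function of p on [0, 1]: L(p) = p**2.
# Integrating it numerically with the trapezoidal rule gives ~1/3, not 1,
# illustrating that a likelihood need not integrate to 1.
N = 10_000
h = 1.0 / N
area = sum(((i * h) ** 2 + ((i + 1) * h) ** 2) / 2 * h for i in range(N))
assert abs(area - 1 / 3) < 1e-7
```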


posted @ 2018-06-25 19:23 Bioinformatics