Notes on the Poisson Distribution

The probability mass function (PMF) of a Poisson distribution describes the probability of observing exactly k events in a fixed interval of time or space, given that these events occur with a known constant mean rate and independently of the time since the last event. The PMF of a Poisson distribution is given by:

\[ P(X=k) = \frac{e^{-\lambda} \lambda^k}{k!} \]

where:
- \( X \) is a Poisson-distributed random variable.
- \( P(X=k) \) is the probability that the random variable \( X \) takes on the value \( k \) (i.e., \( k \) events occur).
- \( \lambda \) is the average rate (mean number of events) in the interval.
- \( e \) is the base of the natural logarithm (approximately equal to 2.71828).
- \( k! \) is the factorial of \( k \).
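
As a quick numerical companion to the formula, here is a minimal Python sketch that evaluates the PMF directly from the definition (the function name `poisson_pmf` and the example values of \( \lambda \) and \( k \) are illustrative choices, not part of the original notes):

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for a Poisson random variable with mean rate lam, straight from the formula."""
    return exp(-lam) * lam**k / factorial(k)

# Example (illustrative): mean rate of 3 events per interval, probability of exactly 5 events.
print(poisson_pmf(5, 3.0))  # ~0.1008
```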

To prove that this function is a valid PMF, we need to show two things:
1. The probabilities are non-negative for all k.
2. The sum of the probabilities for all possible values of k is 1.

**Proof for Non-Negativity:**
For any given \( k \geq 0 \), \( e^{-\lambda} \) is always positive because \( e \) raised to any real power is positive. Since the rate \( \lambda \) is non-negative, \( \lambda^k \geq 0 \) for every non-negative integer \( k \) (and strictly positive whenever \( \lambda > 0 \)). The factorial \( k! \) is a positive integer for every non-negative integer \( k \). Therefore, for all non-negative integers \( k \), \( P(X=k) \geq 0 \).

**Proof that the Sum is 1:**
We need to show that the sum of probabilities over all possible values of \( k \) equals 1:

\[ \sum_{k=0}^{\infty} P(X=k) = \sum_{k=0}^{\infty} \frac{e^{-\lambda} \lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} \]

The series \( \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} \) is the Taylor series expansion of \( e^{\lambda} \), which converges absolutely for all real numbers \( \lambda \). Therefore:

\[ e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = e^{-\lambda} e^{\lambda} = 1 \]

This confirms that the total probability mass equals 1, satisfying the second condition for a valid PMF.

Thus, this function satisfies both conditions required for a probability distribution function, proving that it is indeed the PMF of the Poisson distribution.
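
Both conditions can also be sanity-checked numerically by truncating the infinite sum at a large cutoff; this is a rough sketch, and the cutoff of 100 terms is an arbitrary choice that is more than sufficient for moderate \( \lambda \):

```python
from math import exp, factorial

lam = 4.5  # illustrative rate
terms = [exp(-lam) * lam**k / factorial(k) for k in range(100)]

assert all(t >= 0 for t in terms)  # condition 1: every probability is non-negative
print(sum(terms))                  # condition 2: the total mass is ~1.0 (up to float rounding)
```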

 

How does the Poisson distribution come about?

The Poisson distribution arises in the context of events that occur independently with a known constant mean rate, \( \lambda \), in a fixed interval of time or space. It is a limiting case of the binomial distribution as the number of trials \( n \) goes to infinity while the expected number of successes \( np \) remains fixed. Here's an outline of how the Poisson distribution comes about from the binomial distribution:

1. **Binomial Distribution:**
The probability of observing exactly \( k \) successes in \( n \) independent Bernoulli trials with success probability \( p \) is given by the binomial distribution:
\[ P(X = k) = {n \choose k} p^k (1-p)^{n-k} \]
where \( {n \choose k} \) is the binomial coefficient.

2. **Poisson Approximation to Binomial:**
When \( n \) is large, and \( p \) is small such that \( np = \lambda \) is fixed, the binomial distribution can be approximated by the Poisson distribution. This is because as \( n \) becomes large, the binomial coefficient and the terms involving \( p \) and \( 1-p \) can be approximated using limits.

3. **Taking Limits:**
If we take the limit as \( n \rightarrow \infty \) and \( p \rightarrow 0 \) while keeping \( np = \lambda \) fixed, we can derive the Poisson probability mass function from the binomial probability mass function.

In this limit, the binomial coefficient can be approximated as:
\[ {n \choose k} = \frac{n!}{k!(n-k)!} = \frac{n(n-1)\cdots(n-k+1)}{k!} \approx \frac{n^k}{k!} \]
since \( n!/(n-k)! = n(n-1)\cdots(n-k+1) \) is a product of \( k \) factors, each approximately equal to \( n \) when \( k \) is fixed and \( n \) is large.

The term \( p^k \) becomes \( (\lambda/n)^k \), and the term \( (1-p)^{n-k} \) can be approximated using the limit definition of the exponential function:
\[ (1-p)^{n-k} = \left(1 - \frac{\lambda}{n}\right)^{n-k} = \left(1 - \frac{\lambda}{n}\right)^{n} \left(1 - \frac{\lambda}{n}\right)^{-k} \approx e^{-\lambda} \]
since \( \left(1 - \frac{\lambda}{n}\right)^{n} \rightarrow e^{-\lambda} \) and \( \left(1 - \frac{\lambda}{n}\right)^{-k} \rightarrow 1 \) as \( n \rightarrow \infty \) with \( k \) fixed.

4. **Combining Terms:**
Combining these approximations, we get:
\[ P(X = k) = {n \choose k} p^k (1-p)^{n-k} \approx \frac{n^k}{k!} (\frac{\lambda}{n})^k e^{-\lambda} = \frac{\lambda^k}{k!} e^{-\lambda} \]

5. **Poisson PMF:**
This is the Poisson probability mass function:
\[ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} \]
which gives the probability of observing exactly \( k \) events in a fixed interval, given a constant mean rate of occurrence, \( \lambda \).

The derivation is based on taking limits and making approximations that hold when \( n \) is large, and \( p \) is small. The Poisson distribution thus provides a good model for rare events over a fixed period or in a fixed space, such as the number of radioactive particles emitted from a source in a given time, or the number of phone calls received by a call center in an hour.
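
The limiting behaviour can be checked numerically. The sketch below (standard library only; the values of \( \lambda \), \( k \), and the sequence of \( n \) are arbitrary illustrative choices) compares the binomial PMF with \( p = \lambda/n \) against the Poisson PMF as \( n \) grows:

```python
from math import comb, exp, factorial

lam, k = 2.0, 3  # fixed mean rate and event count (illustrative)

def binom_pmf(k: int, n: int, p: float) -> float:
    """Binomial probability of exactly k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

poisson = exp(-lam) * lam**k / factorial(k)

for n in (10, 100, 1000, 10000):
    b = binom_pmf(k, n, lam / n)  # keep np = lambda fixed while n grows
    print(f"n={n:>6}  binomial={b:.6f}  poisson={poisson:.6f}")
# The binomial values approach the Poisson value as n increases.
```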

 

 

Why is n!/(k!(n-k)!) approximately equal to n^k/k!?

The approximation \( n! \approx n^k (n-k)! \) is not true in general. However, in the specific context of approximating the binomial coefficient for large \( n \) and fixed \( k \), we only need to consider the \( k \) largest factors of \( n! \).

Let's break down \( n! \) (the factorial of \( n \)):

\[ n! = n \times (n - 1) \times (n - 2) \times \ldots \times (n - k + 1) \times (n - k)! \]

If \( n \) is very large compared to \( k \), the terms \( (n - 1), (n - 2), \ldots, (n - k + 1) \) are approximately equal to \( n \). So for the first \( k \) factors of \( n! \), we can write:

\[ n \times (n - 1) \times (n - 2) \times \ldots \times (n - k + 1) \approx n^k \]

This gives us:

\[ n! \approx n^k \times (n - k)! \]

This approximation is used in deriving the Poisson distribution from the binomial distribution when \( n \) is large. It simplifies the binomial coefficient:

\[ {n \choose k} = \frac{n!}{k!(n-k)!} \approx \frac{n^k}{k!} \]

This approximation holds because when \( n \) is large, each of the \( k \) largest factors of \( n! \), namely \( n, n-1, \ldots, n-k+1 \), is close to \( n \), while the remaining factors form exactly \( (n-k)! \) and cancel against the denominator of the binomial coefficient. It's a useful approximation for the derivation, but it should be used with an understanding of its limitations and context.

 

Thus \( \frac{n!}{k!(n-k)!} \approx \frac{n^k}{k!} \). This approximation is often used in the derivation of the Poisson distribution when \( n \) is very large and \( k \) is relatively small. Here's why it is reasonable in that context:

The binomial coefficient \( {n \choose k} = \frac{n!}{k!(n-k)!} \) is the number of ways to choose \( k \) successes from \( n \) trials. When \( n \) is large and \( k \) is small compared to \( n \), the terms \( n, n-1, n-2, ..., n-k+1 \) in the numerator \( n! \) are approximately equal to \( n \) (since subtracting a small number from a very large number doesn't change it much).

So, for a small \( k \), we can approximate:

\[ n! \approx n^k (n-k)! \]

Now, dividing both sides by \( k!(n-k)! \), we get:

\[ {n \choose k} = \frac{n!}{k!(n-k)!} \approx \frac{n^k (n-k)!}{k!(n-k)!} = \frac{n^k}{k!} \]

This approximation becomes better as \( n \) grows large and \( k/n \) becomes small. It's important to note that this is a heuristic used for simplifying the derivation of the Poisson distribution from the binomial distribution under certain conditions, specifically when \( np = \lambda \) remains constant and \( p \rightarrow 0 \) as \( n \rightarrow \infty \).
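
A small numerical check of this approximation (fixing \( k \) and letting \( n \) grow; the choice \( k = 3 \) and the list of \( n \) values are arbitrary) shows the ratio tending to 1:

```python
from math import comb, factorial

k = 3  # fixed, small relative to n
for n in (10, 100, 1000, 10000):
    exact = comb(n, k)             # n! / (k! (n-k)!)
    approx = n**k / factorial(k)   # n^k / k!
    print(f"n={n:>6}  exact={exact}  approx={approx:.1f}  ratio={exact / approx:.4f}")
# The ratio approaches 1 as n grows with k fixed, confirming the approximation.
```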

 

 

Taylor series expansion of \( e^{\lambda} \):

The Taylor series expansion for the exponential function \( e^x \) at \( x=0 \) (also known as the Maclaurin series for \( e^x \)) is given by:

\[ e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} \]

This means that for any real number \( x \), you can approximate \( e^x \) by summing up terms of the form \( \frac{x^n}{n!} \), where \( n! \) denotes the factorial of \( n \).

To prove this series expansion, we use the definition of the Taylor series for a function \( f(x) \) about \( x=0 \):

\[ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n \]

where \( f^{(n)}(0) \) denotes the \( n \)-th derivative of \( f(x) \) evaluated at \( x=0 \).

For the exponential function \( e^x \), all derivatives are equal to \( e^x \). Specifically, at \( x=0 \), all derivatives of \( e^x \) are equal to 1. Therefore, the Taylor series for \( e^x \) is:

\[ e^x = \sum_{n=0}^{\infty} \frac{1}{n!} x^n \]

Now, if we replace \( x \) with \( \lambda \), we get the series expansion for \( e^\lambda \):

\[ e^\lambda = \sum_{n=0}^{\infty} \frac{\lambda^n}{n!} \]

This is an infinite series that converges to \( e^\lambda \) for all real numbers \( \lambda \). The series converges absolutely and uniformly on every bounded interval of the real line. This proof is based on the properties of the exponential function and the fact that its derivatives are equal to the function itself, which is a unique property of the exponential function.
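
A minimal sketch of this convergence (the value of \( \lambda \) and the cutoffs \( N \) are arbitrary) evaluates the partial sums \( \sum_{n=0}^{N} \lambda^n / n! \) and compares them with \( e^{\lambda} \):

```python
from math import exp, factorial

lam = 2.5  # illustrative value
for N in (2, 5, 10, 20):
    partial = sum(lam**n / factorial(n) for n in range(N + 1))
    print(f"N={N:>2}  partial sum={partial:.8f}  exp(lam)={exp(lam):.8f}")
# The partial sums converge to e^lambda; multiplying by e^{-lambda} gives 1,
# which is exactly the normalization step used in the PMF proof above.
```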

 
