Distributions: Chebyshev Inequality | Uniform | Cauchy | Normal/Gaussian

https://www.britannica.com/science/probability-theory
https://www.britannica.com/biography/David-Blackwell

Random Event:
Probability Function;
Distribution Function:
Expectation/Mean;
Deviation / Var: Variance / STD: Standard Deviation / CV: Coefficient of Variation / Degrees of Freedom

Chebyshev Inequality/Theorem:
https://www.britannica.com/science/Chebyshevs-inequality
Chebyshev’s inequality, also called Bienaymé-Chebyshev inequality, in probability theory, a theorem that characterizes the dispersion of data away from its mean (average). The general theorem is attributed to the 19th-century Russian mathematician Pafnuty Chebyshev, though credit for it should be shared with the French mathematician Irénée-Jules Bienaymé, whose (less general) 1853 proof predated Chebyshev’s by 14 years.
Chebyshev’s inequality puts an upper bound on the probability that an observation should be far from its mean. It requires only two minimal conditions:
(1) that the underlying distribution have a mean
(2) that the average size of the deviations away from this mean (as gauged by the standard deviation) not be infinite.
Chebyshev’s inequality then states that the probability that an observation will be more than k standard deviations from the mean is at most 1/k². Chebyshev used the inequality to prove his version of the law of large numbers.
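As a quick sketch (not part of the article), the bound P(|X − μ| ≥ kσ) ≤ 1/k² can be checked empirically in Python; an exponential distribution is used here purely as an arbitrary example of a skewed distribution with finite mean and variance:

```python
import random
import statistics

# Empirical check of Chebyshev's inequality: P(|X - mu| >= k*sigma) <= 1/k^2.
# The exponential distribution (rate 1) is an arbitrary choice for illustration.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.fmean(samples)
sigma = statistics.pstdev(samples)

for k in (2, 3, 4):
    frac = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
    bound = 1 / k**2
    print(f"k={k}: observed tail {frac:.4f} <= Chebyshev bound {bound:.4f}")
```

The observed tail fractions sit far below the bound, which is exactly the point made below: Chebyshev's inequality is valid for any such distribution, but rarely tight.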
But since it places virtually no restriction on the shape of the underlying distribution, the inequality is of little help to anyone looking for a precise statement on the probability of a large deviation. To achieve that goal, people usually try to justify a specific error distribution, such as the normal distribution proposed by the German mathematician Carl Friedrich Gauss. Gauss also developed a tighter bound, 4/(9k²) (for k > 2/√3), on the probability of a large deviation by imposing the natural restriction that the error distribution decline symmetrically from a maximum at 0.
The difference between these values is substantial. According to Chebyshev’s inequality, the probability that a value will be more than two standard deviations from the mean (k = 2) cannot exceed 25 percent. Gauss’s bound is 11 percent, and the value for the normal distribution is just under 5 percent. Thus, it is apparent that Chebyshev’s inequality is useful only as a theoretical tool for proving generally applicable theorems, not for generating tight probability bounds.
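The three figures quoted above for k = 2 can be reproduced with a few lines of Python; the exact normal tail uses the identity P(|Z| > k) = erfc(k/√2):

```python
import math

k = 2.0
chebyshev = 1 / k**2                  # 0.25   (any distribution with finite variance)
gauss = 4 / (9 * k**2)                # ~0.111 (Gauss's bound, valid for k > 2/sqrt(3))
# Exact two-sided normal tail: P(|Z| > k) = 2*(1 - Phi(k)) = erfc(k / sqrt(2))
normal = math.erfc(k / math.sqrt(2))  # ~0.0455

print(f"Chebyshev: {chebyshev:.3f}, Gauss: {gauss:.3f}, Normal: {normal:.4f}")
```

The gap from 25% down to under 5% is the "substantial difference" the article refers to.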

  • Richard Routledge

Uniform Distribution:
U(a, b):

  • F(x) = (x − a)/(b − a) for a ≤ x ≤ b; F(x) = 0 for x < a, F(x) = 1 for x > b
  • p(x) = 1/(b − a) if a < x < b; p(x) = 0 otherwise.
  • E(x) = (a+b)/2
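A minimal sketch of U(a, b) in Python, with the (hypothetical) example values a = 2, b = 5, checking the pdf, the cdf, and the mean against samples:

```python
import random
import statistics

# U(a, b) with example endpoints a = 2, b = 5 (arbitrary choice for illustration).
a, b = 2.0, 5.0

def pdf(x):
    # p(x) = 1/(b - a) inside (a, b), 0 outside.
    return 1 / (b - a) if a < x < b else 0.0

def cdf(x):
    # F(x) = (x - a)/(b - a), clamped to [0, 1].
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

random.seed(0)
sample_mean = statistics.fmean(random.uniform(a, b) for _ in range(100_000))
print(sample_mean)  # close to E(x) = (a + b)/2 = 3.5
```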

Cauchy Distribution:

  • F(x) = [arctan(x) + π/2] / π
  • p(x) = 1/[π(1 + x²)]
  • E(x): does not exist (the tails are so heavy that the defining integral diverges)
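The nonexistent mean has a visible consequence: sample means of Cauchy draws never settle down, so the law of large numbers does not apply. A sketch in Python, sampling via the inverse of the CDF above (if U ~ U(0, 1), then tan(π(U − 1/2)) is standard Cauchy):

```python
import math
import random

# Standard Cauchy sampling by inverting F(x) = [arctan(x) + pi/2] / pi:
# x = tan(pi * (U - 1/2)) for U ~ U(0, 1).
random.seed(1)
samples = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(100_000)]

# Running means keep jumping as n grows instead of converging,
# because E(x) does not exist.
for n in (100, 1_000, 10_000, 100_000):
    print(n, sum(samples[:n]) / n)
```

By contrast, the median of the standard Cauchy does exist (it is 0), and sample medians do converge.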

Gaussian/Normal Distribution:
https://www.britannica.com/topic/normal-distribution
normal distribution, also called Gaussian distribution, the most common distribution function for independent, randomly generated variables. Its familiar bell-shaped curve is ubiquitous in statistical reports, from survey analysis and quality control to resource allocation.

The graph of the normal distribution is characterized by two parameters: the mean, or average, which is the maximum of the graph and about which the graph is always symmetric; and the standard deviation, which determines the amount of dispersion away from the mean. A small standard deviation (compared with the mean) produces a steep graph, whereas a large standard deviation (again compared with the mean) produces a flat graph.
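The steep-versus-flat behavior follows from the density formula p(x) = exp(−(x − μ)²/(2σ²)) / (σ√(2π)): the peak height at the mean is 1/(σ√(2π)), so a smaller σ gives a taller, steeper curve. A small Python sketch (example values of σ chosen for illustration), which also checks the familiar "about 95% within 2σ" dispersion:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    # Gaussian density: exp(-(x - mu)^2 / (2 sigma^2)) / (sigma * sqrt(2 pi))
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Peak height at the mean is 1/(sigma * sqrt(2 pi)):
print(normal_pdf(0, 0, 0.5))  # small sigma -> tall, steep curve (~0.798)
print(normal_pdf(0, 0, 2.0))  # large sigma -> low, flat curve (~0.199)

# Empirical dispersion: roughly 95% of draws fall within 2 sigma of the mean.
random.seed(0)
draws = [random.gauss(0, 2.0) for _ in range(100_000)]
within = sum(abs(x) <= 2 * 2.0 for x in draws) / len(draws)
print(within)
```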

posted @ 2023-08-16 17:33  abaelhe