Law of large numbers



The law of large numbers (LLN) is a theorem in probability that describes the long-term stability of the average of a random variable. Given a sample of independent and identically distributed random variables with a finite population mean, the average of the observations will eventually approach and stay close to the population mean.

The LLN is easily illustrated with rolls of a fair die: each roll produces one of the numbers 1, 2, 3, 4, 5, and 6, all equally likely. The population mean (or "expected value") of a single roll is:

(1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5.

The following graph plots the results of an experiment consisting of repeated rolls of a die. The average of the rolls fluctuates wildly at first but, as predicted by the LLN, stabilizes around the expected value of 3.5 as the number of observations becomes large.

A demonstration of the Law of Large Numbers using die rolls.
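
The behaviour in the graph can be reproduced with a short simulation. The following is a minimal sketch in Python, assuming NumPy is available; the variable names are illustrative and the printed averages will differ from run to run, but they settle near 3.5 as the number of rolls grows.

<pre>
import numpy as np

rng = np.random.default_rng()                # random generator; pass a seed for reproducibility
n_rolls = 10_000
rolls = rng.integers(1, 7, size=n_rolls)     # fair die: integers 1..6, equally likely

# running average after 1, 2, ..., n_rolls observations
running_avg = np.cumsum(rolls) / np.arange(1, n_rolls + 1)

for n in (10, 100, 1_000, 10_000):
    print(f"average after {n:>6} rolls: {running_avg[n - 1]:.4f}")
</pre>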

The law of large numbers works equally well for proportions. Given repeated flips of a fair coin, the proportion of heads (or tails) will approach 50% over a large number of trials. However, the absolute difference between the numbers of heads and tails will not necessarily shrink. For example, we may see 520 heads after 1,000 flips and 5,096 heads after 10,000 flips. While the difference between the numbers of heads and tails has grown from 40 to 192, the proportion of heads has moved from .520 to .5096, closer to the true 50%.
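
The same point can be checked numerically; the counts 520 and 5,096 above are only examples. A minimal sketch, again assuming NumPy, with different numbers on each run:

<pre>
import numpy as np

rng = np.random.default_rng()
flips = rng.integers(0, 2, size=10_000)      # fair coin: 0 = tails, 1 = heads

for n in (1_000, 10_000):
    heads = int(flips[:n].sum())
    tails = n - heads
    print(f"n={n:>6}  heads={heads:>5}  |heads - tails|={abs(heads - tails):>4}  "
          f"proportion of heads={heads / n:.4f}")
</pre>

Typically the absolute gap between heads and tails grows roughly like the square root of the number of flips, while the proportion of heads still converges to 50%.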

The LLN is important because it "guarantees" stable long-term results for random events. For example, while a casino may lose money in a single spin of the American Roulette wheel, it will almost certainly gain very close to 5.3% of all gambled money over thousands of spins. Any winning streak by a player will eventually be overcome by the parameters of the game. It is important to remember that the LLN only applies (as the name indicates) when a large number of observations are considered. There is no principle that a small number of observations will converge to the expected value or that a streak of one value will immediately be "balanced" by the others. See the Gambler's fallacy.
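
The 5.3% figure can be derived directly. An American wheel has 38 pockets, and an even-money bet such as red covers 18 of them, so the expected return per unit wagered is

<math>\frac{18}{38}\cdot(+1) + \frac{20}{38}\cdot(-1) = -\frac{2}{38} \approx -0.0526,</math>

a house edge of about 5.3%. The LLN is what turns this per-spin expectation into a near-certain long-run percentage of all money wagered.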

History

Jacob Bernoulli first described the LLN as so simple that even the stupidest man instinctively knows it is true.[1] Despite this, it took him over 20 years to develop a sufficiently rigorous mathematical proof, which was published in Ars Conjectandi (The Art of Conjecturing) in 1713. He named this his "Golden Theorem", but it became generally known as "Bernoulli's Theorem" (not to be confused with the theorem of the same name in fluid dynamics). In 1837, S. D. Poisson further described it under the name "la loi des grands nombres" ("the law of large numbers").[2] Thereafter it was known under both names, but "law of large numbers" is the one most frequently used.

After Bernoulli and Poisson published their work, other mathematicians also contributed to the refinement of the law, including Chebyshev, Markov, Borel, Cantelli and Kolmogorov. These further studies gave rise to two prominent forms of the LLN, called the "weak" law and the "strong" law. These forms do not describe different laws but rather different modes of convergence of the sample average to the expected value, and the strong form implies the weak.

Forms

Both versions of the law state that the sample average

<math>\overline{X}_n=\frac1n(X_1+\cdots+X_n) </math>

converges towards the expected value

<math>\overline{X}_n \, \to \, \mu \qquad\textrm{for}\qquad n \to \infty</math>

where X1, X2, ... is an infinite sequence of independent and identically distributed (i.i.d.) random variables with finite expected value E(X1) = E(X2) = ... = µ < ∞.

An assumption of finite variance Var(X1) = Var(X2) = ... = σ² < ∞ is not necessary. Large or infinite variance makes the convergence slower, but the LLN holds anyway. The assumption is often made because it makes the proofs easier and shorter.

The difference between the strong and the weak version lies in the mode of convergence being asserted.

The weak law

The weak law of large numbers states that the sample average converges in probability towards the expected value

<math>\overline{X}_n \, \xrightarrow{P} \, \mu \qquad\textrm{for}\qquad n \to \infty.</math>

That is to say that for any positive number ε,

<math>\lim_{n\rightarrow\infty}\operatorname{P}\left(\left|\overline{X}_n-\mu\right|<\varepsilon\right)=1.</math>

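A standard proof under the additional assumption of finite variance σ² uses Chebyshev's inequality. Since the Xi are independent with common mean µ and variance σ², the sample average satisfies

<math>\operatorname{E}\left(\overline{X}_n\right)=\mu \qquad\textrm{and}\qquad \operatorname{Var}\left(\overline{X}_n\right)=\frac{\sigma^2}{n},</math>

so that, for any ε > 0, Chebyshev's inequality gives

<math>\operatorname{P}\left(\left|\overline{X}_n-\mu\right|\geq\varepsilon\right)\leq\frac{\sigma^2}{n\varepsilon^2}\;\to\;0 \qquad\textrm{for}\qquad n \to \infty.</math>

The general case, without the variance assumption, requires a different argument.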

Interpreting the convergence in probability, the weak law states that for any specified nonzero margin, no matter how small, the average of a sufficiently large number of observations will, with very high probability, lie within that margin of the expected value.

This version is called the weak law because convergence in probability is a relatively weak mode of convergence of random variables.

A consequence of the weak LLN is the asymptotic equipartition property.

The strong law

The strong law of large numbers states that the sample average converges almost surely to the expected value

<math>\overline{X}_n \, \xrightarrow{\mathrm{a.s.}} \, \mu \qquad\textrm{for}\qquad n \to \infty .</math>

That is,

<math>\operatorname{P}\left(\lim_{n\rightarrow\infty}\overline{X}_n=\mu\right)=1.</math>

The proof is more complex than that of the weak law. This law justifies the intuitive interpretation of the expected value of a random variable as the "long-term average when sampling repeatedly".

This version is called the strong law because almost sure convergence is a stronger mode of convergence than convergence in probability. The strong law implies the weak law.

References

  1. Jakob Bernoulli, Ars Conjectandi: Usum & Applicationem Praecedentis Doctrinae in Civilibus, Moralibus & Oeconomicis, 1713, Chapter 4 (translated into English by Oscar Sheynin).
  2. Hacking, Ian. (1983) "19th-century Cracks in the Concept of Determinism"
  • Grimmett, G. R. and Stirzaker, D. R. (1992). Probability and Random Processes, 2nd Edition. Clarendon Press, Oxford. ISBN 0-19-853665-8.
  • Richard Durrett (1995). Probability: Theory and Examples, 2nd Edition. Duxbury Press.
  • Martin Jacobsen (1992). Videregående Sandsynlighedsregning (Advanced Probability Theory) 3rd Edition. HCØ-tryk, Copenhagen. ISBN 87-91180-71-6.

