Jakob Bernoulli, head of a dynasty of brilliant scholars, was one of the world's leading mathematicians. His great work, Ars Conjectandi, published posthumously in 1713, included a profound result that he established “after having meditated on it for 20 years”.
He called it his “golden theorem”. It is known today as the law of large numbers. It was the first limit theorem in probability and the first attempt to apply probability outside the realm of games of chance.
Games of chance
If a fair coin is tossed 10 times, what are the chances of four heads and six tails? Bernoulli answered questions like this, formulating the binomial distribution, a workhorse of probability and statistics, in the process. He also showed that the proportion of heads converges to one-half as the number of tosses increases: by performing enough trials, we can bring the observed proportion as close to the true probability as we wish, with as great a degree of certainty as we wish.
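To make this concrete, here is a short Python sketch, an illustration rather than Bernoulli's own method: it computes the binomial probability of exactly four heads in ten tosses and simulates how the proportion of heads settles towards one-half. The random seed and the trial counts are arbitrary choices.

```python
import math
import random

# Probability of exactly 4 heads (and 6 tails) in 10 tosses of a fair coin:
# C(10, 4) * (1/2)^10, from the binomial distribution.
p_four_heads = math.comb(10, 4) * 0.5 ** 10
print(f"P(4 heads in 10 tosses) = {p_four_heads:.4f}")  # about 0.2051

# The observed proportion of heads approaches one-half as the number of tosses grows.
random.seed(1)  # arbitrary seed, for repeatability
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} tosses: proportion of heads = {heads / n:.4f}")
```

The exact answer to the opening question is 210/1024, a little over 20 per cent.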
The golden theorem provides a link between theoretical probability and observed frequency. It says nothing about the results of an individual experiment, but it enables us to predict average behaviour over the long term. The results are not confined to games of chance: the theorem has vital applications in many fields, such as insurance, economics and drug testing.
Bernoulli’s urn
A Bernoulli trial is an experiment with two possible outcomes, such as tossing a coin, drawing a card to see whether it is an ace, or casting a die to see whether it shows a six. Each repetition is independent of the others, and Bernoulli showed how many repetitions are required to ensure a given level of certainty. Using his theorem, we can estimate probabilities by experiment, with a level of confidence that can be stated quantitatively.
Bernoulli proposed an experiment using an urn filled with unknown numbers of black and white pebbles. If we repeatedly draw a pebble from the urn, note its colour and return it, the proportion of white pebbles drawn approximates the actual fraction in the urn. Bernoulli observed that “what we are not given a priori, we can at least obtain a posteriori”.
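A rough Python sketch of the urn experiment, again purely illustrative: the urn's contents (3,000 white and 2,000 black pebbles), the seed and the accuracy and confidence targets are all assumed for the example, and the Chebyshev bound at the end is a simple modern substitute for Bernoulli's own, much cruder estimate of the number of draws required.

```python
import random

random.seed(2)  # arbitrary seed, for repeatability

# Assumed (hidden) contents of the urn: 3000 white and 2000 black pebbles.
# The experimenter never looks inside; the true fraction is used only to
# simulate the draws.
TRUE_WHITE_FRACTION = 3000 / 5000

def estimate_white_fraction(draws: int) -> float:
    """Draw with replacement `draws` times and return the observed white fraction."""
    whites = sum(random.random() < TRUE_WHITE_FRACTION for _ in range(draws))
    return whites / draws

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} draws: estimated fraction of white = {estimate_white_fraction(n):.4f}")

# How many draws guarantee a given certainty? Chebyshev's inequality says that
# n >= p(1 - p) / (eps**2 * delta) draws suffice for the estimate to lie within
# eps of the truth with probability at least 1 - delta; since p(1 - p) <= 1/4,
# n = 0.25 / (eps**2 * delta) works whatever the urn contains.
eps, delta = 0.01, 0.05
n_sufficient = 0.25 / (eps ** 2 * delta)
print(f"Draws sufficient for accuracy {eps} with confidence {1 - delta:.0%}: {n_sufficient:,.0f}")
```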
The gambler’s fallacy
Gamblers have been known to misinterpret probabilities. In the summer of 1913, black came up 26 times in a row at the roulette table in Monte Carlo. As the streak continued, gamblers lost millions betting on red, believing that the long run of blacks increased the chances of red coming up.
The gambler’s fallacy is based on the belief that an event that has occurred less often than usual is more likely to happen soon. It rests on the notion that chance is self-correcting: an excess of one outcome will be counterbalanced by a deficit in the following trials. In reality, the law of large numbers ensures only that, over time, the imbalance is diluted: the early excess is not cancelled out, but it becomes an ever smaller fraction of the total, so the proportion gets closer to the expected value.
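A small simulation makes the word “diluted” precise. In this sketch, which is illustrative, with assumed numbers of spins and the zero pocket ignored so that black and red are equally likely, we start from the 26-black streak and keep spinning: the early surplus of black is never paid back, it simply becomes insignificant as a fraction of the growing total.

```python
import random

random.seed(3)  # arbitrary seed, for repeatability

# Start from an assumed streak of 26 blacks, then continue with fair spins.
blacks, spins = 26, 26

for extra in (100, 10_000, 1_000_000):
    for _ in range(extra):
        spins += 1
        if random.random() < 0.5:   # black with probability 1/2 (zero pocket ignored)
            blacks += 1
    surplus = blacks - spins / 2    # excess of black over the expected half
    print(f"{spins:>9} spins: proportion black = {blacks / spins:.4f}, "
          f"surplus of black = {surplus:+.0f}")
```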
Upon first encounter, Bernoulli’s theorem seems almost self-evident, but it is profoundly subtle. Bernoulli’s arguments have been greatly simplified, and his law of large numbers has been taken up by many later mathematicians: De Moivre, Laplace, Poisson, Chebyshev, Markov and Kolmogorov.
A modern proof of the theorem occupies about one paragraph. A far more powerful version, the strong law of large numbers, was proved by Émile Borel in 1909. But all modern extensions spring from the brilliant insights of Bernoulli.
Peter Lynch is emeritus professor at UCD School of Mathematics & Statistics, University College Dublin – he blogs at thatsmaths.com