From the September-October 1998 issue (Volume 86, Number 5)

Randomness. Deborah J. Bennett. 224 pp. Harvard University Press, 1998. $22.95.


Chances are high that reading this book will clear up your misconceptions about randomness and probability. In this very entertaining little book, simply written but intended for careful readers, some of the most common mistakes people make about chance are carefully analyzed. While describing interesting aspects of the mathematics of probability, the author takes frequent detours into the history of humanity's understanding (and misunderstanding) of the laws of chance, touching on subjects as diverse as chance in decision-making and the fairness of those decisions, gambling and our intuitive understanding of chance, the likelihood of extremely rare events, the existence of true randomness, and how computers have helped shape modern thinking about probability.


Imagine you are in a dark room and need to pull a pair of matching socks out of a drawer that holds two blue socks and one red sock. If you take two socks, one after the other, what are the odds of getting the matching (blue) pair compared with getting a mismatch? The answer is that a mismatch is twice as likely as a match: call the socks blue-1, blue-2 and red, and the three equally likely draws are {blue-1, blue-2}, {blue-1, red} and {blue-2, red}, only the first of which matches. Isn't this obvious? Perhaps not, but reading the book will make it crystal clear to anyone.
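For the skeptical reader, a few lines of Python (a quick brute-force sketch, not an example from the book; the function name and trial count are arbitrary) confirm the two-to-one ratio:

```python
import random

def draw_two_socks(trials=100_000):
    """Estimate the chance of a matching pair when two socks are taken,
    one after the other, from a drawer with two blue socks and one red."""
    drawer = ["blue", "blue", "red"]
    matches = 0
    for _ in range(trials):
        first, second = random.sample(drawer, 2)  # two draws without replacement
        if first == second:
            matches += 1
    return matches / trials

if __name__ == "__main__":
    p_match = draw_two_socks()
    print(f"P(matching pair) ~ {p_match:.3f}")      # about 1/3
    print(f"P(mismatch)      ~ {1 - p_match:.3f}")  # about 2/3
```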

Author Deborah Bennett offers a number of similar examples of how our intuition about the odds of things happening is almost always wrong. For instance, what are the odds of a meteorite strike being the cause of the crash of TWA Flight 800 in July 1996? Does 1 in 10 sound right? Or is it more like 1 in 1 million? Is it really a 1 in 17 trillion coincidence that the same person wins the New Jersey lottery twice within four months?

The coin toss and the roulette red-black dilemma are mathematicians' favorite examples of how deeply ingrained in our psyche is the idea that previous outcomes somehow influence future ones, in a game of chance or in life. Everyone knows that heads and tails are equally likely in a coin flip. However, not everyone takes this idea seriously enough. Say, for instance, that in flipping a coin many times you have overcome great odds and produced 100 consecutive heads. What are the chances that the next flip will be tails? More than 50-50, or just 50-50? Most people would expect tails to be more likely than heads on the next flip, and at the roulette table would even bet the farm that black will follow 100 consecutive reds. Yet they are wrong; the odds are exactly what they always are in these yes-or-no situations: 50-50, provided, of course, that the coin and the wheel are fair.
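Again, a brute-force check is easy. The sketch below (illustrative code, not the book's) conditions on a streak of 5 heads rather than 100, since 100 consecutive heads will essentially never appear in a simulation of practical length; the frequency of tails immediately after such a streak still comes out at about one-half:

```python
import random

def tails_after_head_streak(streak_len=5, flips=1_000_000):
    """Among flips that immediately follow `streak_len` consecutive heads,
    measure how often the next flip comes up tails."""
    heads_run = 0       # current run of consecutive heads
    opportunities = 0   # flips that follow a qualifying streak
    tails_count = 0
    for _ in range(flips):
        flip = random.choice("HT")
        if heads_run >= streak_len:   # this flip follows a long head streak
            opportunities += 1
            if flip == "T":
                tails_count += 1
        heads_run = heads_run + 1 if flip == "H" else 0
    return tails_count / opportunities

if __name__ == "__main__":
    print(f"P(tails | 5 heads in a row) ~ {tails_after_head_streak():.3f}")  # about 0.5
```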

An insightful chapter is "Chance or Necessity?" The question is very, very old (determinism versus chance), and the answer is not clear even today. The author describes the problem beautifully: "Is a random outcome completely determined, and random only by virtue of our ignorance of the most minute contributing factors?" Einstein grappled with this conundrum until his death and never ceased to combat the idea that God could conceivably throw dice.

How do you generate randomness in a computer? What does it mean to have a program that generates random numbers? Aren't computer programs deterministic things, created by people following rules and therefore bound to follow patterns? And isn't randomness the negation of pattern? Bennett begins this fascinating topic with a tongue-in-cheek quotation from R. R. Coveyou: "The generation of random numbers is too important to be left to chance." In the two chapters before the end, the author reminds us again and again that no gambling system can change a gambler's long-run frequency of success. The house always wins, or casinos would cease to exist. Yet, although we can understand these things in principle, we keep going to the gambling house in the hope that somehow the rules do not apply to us!
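One of the oldest answers to the puzzle of deterministic "randomness" is the linear congruential generator. The sketch below is purely illustrative rather than an example drawn from Bennett's book; the multiplier and increment are the widely published Numerical Recipes constants:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """A linear congruential generator: each value is completely determined
    by the previous one via x -> (a*x + c) mod m, yet the scaled output
    looks statistically random."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m   # a float in [0, 1)

if __name__ == "__main__":
    gen = lcg(seed=12345)
    print([round(next(gen), 4) for _ in range(5)])
```

Run twice with the same seed, it produces the identical sequence, which is exactly the sense in which such numbers are pseudorandom rather than random.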

Whether well educated in mathematics or not, people have always been fascinated by randomness and intrigued by the fundamental question of its real nature: How can you tell randomness from something that only looks random? The theory of deterministic chaos tells us that a simple deterministic rule can produce behavior that is, for all practical purposes, indistinguishable from randomness. For instance, the logistic map X_(n+1) = 4X_n(1 − X_n) produces a seemingly random sequence as the equation is iterated and n increases (start, for instance, with X_0 = 0.3, then calculate X_1, X_2, X_3 and so on). If such a simple rule generates a list of numbers that apparently follow no rule, could it be that what we call random is in fact produced by hidden deterministic rules that happen to exhibit stochastic (chaotic) behavior? If so, does this mean that we will eventually find the pattern behind all randomness, as Einstein wanted to believe we would?—J. A. Rial, Geology, University of North Carolina at Chapel Hill
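Readers who want to watch the logistic map scramble its input can iterate it in a few lines of Python (an illustrative sketch, starting from X_0 = 0.3 as described above):

```python
def logistic_map(x0=0.3, steps=10):
    """Iterate X_(n+1) = 4 * X_n * (1 - X_n) and return X_0 .. X_steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))
    return xs

if __name__ == "__main__":
    for n, x in enumerate(logistic_map()):
        print(f"X_{n} = {x:.6f}")
```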
