The Law of Averages, sometimes referred to as the Maturity of Chances, is the idea that the more often something happens, the less likely it is to happen again, and the longer something doesn't happen, the more likely it is to happen soon. This is pure hogwash. Despite its name, it is not a mathematical law at all, and it stands in direct opposition to probability theory, which is sound mathematics.

Under the Law of Averages, if I were to flip a quarter 75 times, and it came up Heads the first 74 times, the coin should come up Tails the next time. To hear someone who believes in the Law of Averages talk, it would be inconceivable for it to come up Heads again! However, the probability remains the same as it was for the first 74 times: 50%.
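Don't take my word for it; this is easy to check with a quick simulation. Below is a minimal Python sketch (the streak length, flip count, and variable names are my own choices; I use a 10-Heads streak rather than 74, since 74 in a row is far too rare to ever observe). It watches for runs of Heads and records what the very next flip does:

```python
import random

STREAK_LENGTH = 10     # stand-in for the 74-Heads run
FLIPS = 5_000_000      # total flips to simulate

heads_after_streak = 0  # times the flip right after a full streak was Heads
streaks_seen = 0        # how many full streaks we observed
run = 0                 # current count of consecutive Heads

for _ in range(FLIPS):
    flip_is_heads = random.random() < 0.5
    if run >= STREAK_LENGTH:
        # The previous STREAK_LENGTH flips were all Heads; this is the flip
        # the Law of Averages insists should be Tails.
        streaks_seen += 1
        if flip_is_heads:
            heads_after_streak += 1
    run = run + 1 if flip_is_heads else 0

print(f"Flips immediately following {STREAK_LENGTH} Heads in a row: {streaks_seen}")
print(f"Fraction of those that were Heads anyway: {heads_after_streak / streaks_seen:.4f}")
```

Run it and the fraction lands within a whisker of 0.5, no matter how long a streak you demand (provided you simulate enough flips to see some streaks at all).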

You see, a coin doesn't have any memory. Neither do dice, or a roulette wheel, or any other gambling implement or situation. Each flip of the coin, throw of the dice, or spin of the wheel is independent of every previous flip, throw, or spin. But people keep getting locked into the confused thinking that since it came up Heads so many times, then, dammit, it's due for a Tails! As an experiment, flip a coin 1000 times and keep track of the number of Heads. The total will probably land somewhere near 500, but don't expect it to hit 500 exactly; I'd be greatly surprised if it did, and it could be heavily skewed in either direction. Here's the subtle part: the law of large numbers guarantees that the proportion of Heads approaches 50% as the flips pile up, but it says nothing about the raw counts of Heads and Tails evening out. It may take ten thousand flips to get to "even." Or a million. Or it may well never achieve an exact balance.
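If you'd rather not flip a real coin a thousand times, here's a sketch of the same experiment carried out to a million flips (the checkpoint counts are arbitrary choices of mine), printing both the Heads proportion and the raw Heads-minus-Tails gap along the way:

```python
import random

heads = 0
checkpoints = {1_000, 10_000, 100_000, 1_000_000}

for flips in range(1, 1_000_001):
    heads += random.random() < 0.5           # True counts as 1
    if flips in checkpoints:
        gap = heads - (flips - heads)        # Heads minus Tails
        print(f"{flips:>9,} flips: proportion of Heads {heads / flips:.4f}, "
              f"Heads - Tails = {gap:+d}")
```

A typical run shows the proportion creeping toward 0.5000 while the gap wanders around in the dozens or hundreds, with no particular interest in returning to zero.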

Part of the problem that leads to this train of thought is a confusion between future odds and odds viewed from the middle of a sequence. The odds of rolling a 6 three times in a row on a standard 6-sided die are one in 216 (6 x 6 x 6), but that figure only applies before any dice have been thrown. If a six comes up twice, the odds of it coming up a third time are still one in six... the first two rolls are already history, and they're irrelevant to determining the third.
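Both numbers fall straight out of a simulation. This sketch (the trial count is my own choice) estimates the up-front chance of three 6s in a row alongside the chance of a third 6 once two are already showing:

```python
import random

TRIALS = 1_000_000
triples = 0          # trials where all three rolls came up 6
third_six = 0        # third roll was 6, among trials where the first two were
two_sixes_seen = 0   # trials where the first two rolls were both 6

for _ in range(TRIALS):
    rolls = [random.randint(1, 6) for _ in range(3)]
    if rolls[0] == 6 and rolls[1] == 6:
        two_sixes_seen += 1
        if rolls[2] == 6:
            third_six += 1
            triples += 1

print(f"P(three 6s, up front)      ~ {triples / TRIALS:.5f}   (1/216 = {1/216:.5f})")
print(f"P(third 6 | two 6s rolled) ~ {third_six / two_sixes_seen:.4f}   (1/6 = {1/6:.4f})")
```

The first estimate hovers near 1/216 and the second near 1/6, which is exactly the distinction between odds computed before the sequence starts and odds computed partway through it.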