The difference between the "Law" of Averages and the Law of Large Numbers

These are two very different concepts. At first glance they appear to say the same thing, but the difference, while subtle, is of enormous significance.

Suppose you flip a coin 10 times and it lands on heads every time.

The Law of Averages states that you are suddenly more likely to start flipping tails. The universe will magically detect this earlier "anomaly", this glut of extra heads, and go, "Hey! We have too many heads here, and they must be balanced out!" Then, the universe will alter probability itself in order to make your coin more likely to land tails, so as to correct the imbalance.

According to this "law", after you toss the coin another 9990 times you will end up with roughly 5000 heads and 5000 tails; the extra 10 heads at the start leave no trace.

Clearly, this is impossible. Each flip is an independent trial, with a 50% chance of heads and a 50% chance of tails. No previous result can change those odds.
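You can check that independence claim directly. Below is a minimal simulation sketch (plain Python; the seed, flip count, and variable names are arbitrary choices for the demo): it flips a fair coin millions of times and, every time the previous 10 flips were all heads, records what the very next flip does. Those recorded flips come out heads about 50% of the time, streak or no streak.

```python
import random

# Does a run of 10 heads make the next flip more likely to be tails?
# Flip a fair coin many times; whenever the last 10 flips were all heads,
# record what the NEXT flip turns out to be.
random.seed(0)          # arbitrary seed, just for reproducibility

flips_after_streak = []
streak = 0
for _ in range(5_000_000):
    heads = random.random() < 0.5      # True = heads, fair coin
    if streak >= 10:                   # the previous 10 flips were all heads
        flips_after_streak.append(heads)
    streak = streak + 1 if heads else 0

print(f"Flips observed right after a 10-heads streak: {len(flips_after_streak)}")
print(f"Fraction of those that were heads: "
      f"{sum(flips_after_streak) / len(flips_after_streak):.3f}")   # ~0.5
```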

The Law of Large Numbers states, in unambiguous mathematical terms, that if you continue to toss your coin, the universe will NOT try to balance out the 10-heads "anomaly". Rather, the new results will make that anomaly insignificant.
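For reference, those "unambiguous mathematical terms" are the standard weak form of the theorem (textbook material, not anything specific to this post). Writing each fair toss as a random variable $X_i$ that is 1 for heads and 0 for tails, it says the sample average converges in probability to 1/2:

$$
\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i \;\xrightarrow{P}\; \tfrac{1}{2}
\quad\text{as } n \to \infty,
\qquad\text{i.e.}\qquad
\Pr\!\left(\bigl|\bar{X}_n - \tfrac{1}{2}\bigr| > \varepsilon\right) \to 0
\;\text{ for every } \varepsilon > 0.
$$

Notice this is a statement about the proportion of heads, not the raw count; nothing in it requires an early surplus to be cancelled, and a fixed prefix of 10 forced heads cannot change the limit.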

On average, those 9990 additional tosses will give you about 4995 heads and 4995 tails. Add the 10 heads you already have, and the expected final tally is 5005 heads and 4995 tails. Not 5000 of each.
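Here is a rough simulation of that arithmetic (a sketch in plain Python; the seed and run count are arbitrary): each run starts with 10 guaranteed heads, adds 9990 fair flips, and the tallies are averaged over many runs. The average lands near 5005 heads and 4995 tails, not 5000 of each.

```python
import random

# Each run: 10 guaranteed heads, then 9990 fair flips (10,000 flips total).
# Averaged over many runs, the tally centres on 5005 heads / 4995 tails.
random.seed(1)          # arbitrary seed, just for reproducibility

runs = 1_000            # arbitrary number of repetitions for the demo
total_heads = 0
for _ in range(runs):
    heads_this_run = 10 + sum(random.random() < 0.5 for _ in range(9_990))
    total_heads += heads_this_run

avg_heads = total_heads / runs
print(f"Average heads over {runs} runs: {avg_heads:.1f}")           # ~5005
print(f"Average tails over {runs} runs: {10_000 - avg_heads:.1f}")  # ~4995
```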

The extra 10 heads at the start do still have an effect. The Law of Large Numbers simply says that as the number of tosses grows, this effect becomes less and less significant as a fraction of the whole. A 5005:4995 split is, proportionally, very close to 5000:5000, and the more you flip, the closer to 50% the proportion of heads gets. But the surplus of 10 heads is never "paid back". It is simply drowned out.
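To see the "diluted, not cancelled" point concretely, here is a tiny back-of-the-envelope calculation (plain Python; the sample sizes are arbitrary): the expected surplus of heads stays fixed at 10 forever, yet as a fraction of all flips it shrinks toward zero, so the proportion of heads still converges to 50%.

```python
# Expected tallies when the first 10 flips are heads and the rest are fair.
# The surplus of heads stays at 10, but the heads *fraction* approaches 0.5.
for n in (100, 10_000, 1_000_000, 100_000_000):   # arbitrary sample sizes
    expected_heads = 10 + (n - 10) * 0.5
    expected_tails = n - expected_heads
    surplus = expected_heads - expected_tails      # always 10
    print(f"{n:>11,} flips: heads fraction = {expected_heads / n:.8f}, "
          f"expected surplus = {surplus:.0f}")
```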