Every Goodstein sequence (i.e. one started from any positive integer n and any base b ≥ 2) reaches zero after a finite number of steps.
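
For concreteness, here is a minimal Python sketch of how such a sequence can be computed, assuming the standard definition (write the current value in hereditary base-b notation, replace every occurrence of b by b+1, subtract 1, and repeat with the next base); the helper names are just for illustration:

```python
def hereditary(n, b):
    """Write n in hereditary base-b notation.

    Returns a list of (exponent, coefficient) pairs, where each
    exponent is itself recursively written in the same notation."""
    terms = []
    exp = 0
    while n > 0:
        n, digit = divmod(n, b)
        if digit:
            terms.append((hereditary(exp, b), digit))
        exp += 1
    return terms


def evaluate(terms, b):
    """Evaluate a hereditary representation using base b."""
    return sum(c * b ** evaluate(e, b) for e, c in terms)


def goodstein(m, base=2):
    """Yield the Goodstein sequence starting from m in the given base.

    Each step: write the current value in hereditary base-b notation,
    bump every b to b+1, then subtract 1.  Goodstein's theorem says
    the sequence always hits 0, so this generator always terminates."""
    b = base
    while m > 0:
        yield m
        rep = hereditary(m, b)          # hereditary base-b form of m
        m = evaluate(rep, b + 1) - 1    # bump the base, subtract 1
        b += 1
    yield 0
```

For example, list(goodstein(3)) gives [3, 3, 3, 2, 1, 0]; but already the sequence starting from 4 runs for an astronomical number of steps before reaching 0, so in practice the generator is only usable for the very smallest starting values.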

Kirby and Paris proved that this theorem cannot be proved within Peano arithmetic; any proof must appeal to something stronger, such as the infinite (transfinite) ordinals, and so the theorem demonstrates the necessity of the concept of infinity in mathematics.

It's an example of Gödel's incompleteness theorem in action. Granting the correctness of arithmetic with mathematical induction, we must believe Goodstein's theorem, even though it cannot be proved by mathematical induction alone.

A fairly easy proof of Goodstein's theorem (every Goodstein sequence reaches 0) is possible using (countable) ordinal arithmetic. However, Goodstein's theorem cannot be proved in (Peano) arithmetic itself.
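
Here, under the same hereditary-notation assumption as in the sketch above, is the skeleton of that ordinal argument (only a sketch, not the full proof):

```latex
% Map the k-th term m_k, written in hereditary base b_k, to an
% ordinal by replacing every occurrence of the base with \omega:
\[
    m_k \;=\; \sum_i c_i \, b_k^{\,e_i}
    \qquad\longmapsto\qquad
    o_k \;=\; \sum_i c_i \, \omega^{\,\hat{e}_i} \;<\; \varepsilon_0 ,
\]
% where each exponent e_i is itself rewritten recursively (\hat{e}_i).
% Bumping the base b_k \to b_k + 1 leaves the ordinal unchanged, while
% the subtraction of 1 strictly decreases it, so
\[
    o_1 \;>\; o_2 \;>\; o_3 \;>\; \cdots
\]
% Because the ordinals are well-ordered, a strictly decreasing
% sequence of ordinals must be finite, and so the Goodstein sequence
% reaches 0.
```

Roughly speaking, the well-ordering of the ordinals below ε0 (equivalently, transfinite induction up to ε0) is exactly the ingredient Peano arithmetic cannot supply, and that is where Kirby and Paris's unprovability result enters.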

Gödel's Incompleteness Theorem tells us that such propositions exist. But the example constructed by Gödel essentially says "this statement cannot be proven (in arithmetic)". This use of self-reference is somewhat suspect; indeed, some authors (notably Douglas Hofstadter in Gödel, Escher, Bach: an Eternal Golden Braid) consider self-reference a vital component of any such "Gödel sentence".

And let's face it: self-referential statements are more a curiosity than serious mathematics. If Gödel's theorem is to be taken seriously, it should apply to statements of independent interest.

Hence the interest in "natural incompleteness results" such as Goodstein's theorem: it is a genuinely interesting statement (we can frame it as claiming that a specified program, such as the Python sketch above, halts on every input) that cannot be proved using arithmetic alone. And there is no self-referential trickery involved in the formulation of the theorem, nor in the proof (which lives in set theory, not arithmetic, of course!). Gödel's theorem really does apply to "interesting" problems, not just to some forgotten dark alleys of mathematics!

See a sketch of a Set Theory proof that every Goodstein sequence reaches 0.
