The set of all integers greater than or equal to zero: {0, 1, 2, 3, ...}. Usually written 'N' in a blackboard bold font.

Editor's note: ℕ

Natural numbers: {1, 2, 3, 4, ...} The set of natural numbers, also known as the counting numbers, is equal to the set of all positive integers.

Contrary to popular belief, zero is not a natural number. A good trick for remembering this is to recall the phrase, "It is not natural to start counting with zero."

The above inductive definition is not correct because step one states that zero is a natural number. Since zero is neither positive nor negative, it cannot be a natural number.

I am not familiar with Peano's postulates, but I am guessing that this is an over-simplified version of them, and I am also willing to bet that it goes something more like this (this is also over-simplified):

  1. 1 is a natural number because it is a positive integer (it has a value greater than zero)
  2. assume that for some n = k, k is a natural number (where k ≥ 1)
  3. prove that k + 1 is a natural number
hence all integers greater than or equal to 1 are natural numbers.

After further research I have discovered that there are indeed two definitions of the natural numbers, and it is suggested that when referring to number sets one should use the terms integer, positive integer, and nonnegative integer instead of natural number and counting number.

The cardinality (the number of elements in it) of the set of natural numbers is aleph-null (ℵ₀). 2^ℵ₀ is c, which is the cardinality of the set of real numbers. Unlike the reals, the set of natural numbers is denumerable, which is the same as saying its cardinality is the same as that of the set of integers. This is the sort of thing covered in a set theory class.
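As a concrete illustration of that last claim (a sketch of my own, not part of the writeup above; the function names are invented for the example), here is one standard zig-zag pairing between the natural numbers and the integers. The existence of such a bijection is exactly what it means for the two sets to have the same cardinality.

  # One explicit bijection between N = {0, 1, 2, ...} and Z,
  # pairing 0, 1, 2, 3, 4, ... with 0, -1, 1, -2, 2, ...

  def nat_to_int(n):
      """Map 0, 1, 2, 3, 4, ... to 0, -1, 1, -2, 2, ..."""
      return n // 2 if n % 2 == 0 else -(n + 1) // 2

  def int_to_nat(z):
      """The inverse map from Z back to N."""
      return 2 * z if z >= 0 else -2 * z - 1

  # Every integer is hit exactly once, and the two maps undo each other.
  assert sorted(nat_to_int(n) for n in range(7)) == [-3, -2, -1, 0, 1, 2, 3]
  assert all(int_to_nat(nat_to_int(n)) == n for n in range(1000))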

The natural numbers are the basic starting kit of mathematics, the numbers 0, 1, 2, 3, 4, ...

The set {0, 1, 2, 3, ...} of all the natural numbers is given the symbol N, in a hollow typeface. The same typeface is used for the progressively larger sets Z (or J) of integers, Q of rational numbers, R of real numbers, and C of complex numbers.

Objection! Overruled!

It is often objected that zero isn't "natural" and so shouldn't count as a natural number. The answer to that objection is that mathematicians define things as they find convenient, and it's convenient to include 0. The set {0, 1, 2, 3, ...} is on the whole a more useful set to work with than {1, 2, 3, ...}. We call things natural number, real number, imaginary number, and these are just names, with some historical reason for the choice of name; but it doesn't mean that they're especially natural, real, or imaginary in any intuitive or metaphysical way. So, to recap, N starts with 0, as it is usually defined by most mathematicians.

The natural numbers are also sometimes called the counting numbers, which is a real pity, as it would be natural to use "counting" for the set starting with 1.

When the natural numbers are regarded as a subset of the integers, we can call {1, 2, 3, ...} the positive integers, symbol Z+, and {0, 1, 2, ...} the nonnegative integers, symbol Z+ ∪ {0}. But we're getting ahead of ourselves.

The natural numbers provoke a fair bit of heat among (some, anyway) mathematicians of philosophical bent and/or philosophers of mathematical bent: especially the ones from 1 onward. Most of us feel that either we do know what the numbers 1, 2, 3... are, or that we should know, and if we don't we need to find out or decide the answer.

A brief history...

Leopold Kronecker, back in the nineteenth century, made an observation to the effect that God made the natural numbers and Man made the rest. Kronecker was a finitist, a forerunner of the intuitionists. He might have said integers, I forget, but I'm guessing he meant counting numbers. The general idea, anyway, is that there really are ones, twos, and threes of things out there in the world, and all the rest is ideas that we invent.

Gottlob Frege, end of nineteenth century, thought 3 was the... okay, I'm always a bit shaky on what Frege actually thought, but it's something like... 3 is the property that all 3-element sets share. From this comes the abstract notion of the cardinality of a set.

Giuseppe Peano, turn of the century, came up with his five axioms (or postulates, I don't think the distinction is so important once we get into the age of formalism, as we are about to), by which we can define the natural numbers. Actually, since this is now a formal system, what the Peano axioms define is not the set of natural numbers, but a set of things which behave like natural numbers in abstract ways. These Peano objects (if I may so call them) are sufficient for the task: all mathematical truths about N can be translated into Peano language and corresponding truths hold about Peano objects.

  • There is an element 0 ∈ N
  • There is a successor operation S. If a ∈ N then there exists a unique Sa ∈ N.
  • No two distinct elements have the same successor. If c = Sa and c = Sb then a = b.
  • 0 is not the successor of any element.
  • Mathematical induction. If some property holds for 0, and if that property holds for Sa whenever it holds for a, then it holds for all elements of N.
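
These axioms translate almost directly into code. Here is a minimal sketch in Python (my own illustration; ZERO, succ, add and to_int are names invented for the example): a natural number is either ZERO or the successor of another natural number, and addition is defined by recursion in just the way the induction axiom suggests.

  ZERO = ()                                  # plays the role of 0

  def succ(a):
      """The successor operation S: wrap the number in one more layer."""
      return (a,)

  def add(a, b):
      """Addition by recursion on b: a + 0 = a, and a + Sb = S(a + b)."""
      return a if b == ZERO else succ(add(a, b[0]))

  def to_int(a):
      """Translate a Peano object back to ordinary notation by counting layers."""
      return 0 if a == ZERO else 1 + to_int(a[0])

  two, three = succ(succ(ZERO)), succ(succ(succ(ZERO)))   # SS0 and SSS0
  assert to_int(add(two, three)) == 5                      # 2 + 3 = 5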

From these definitions we can construct arithmetic and number theory, after defining a bit more notation: for S0 we write 1, for Sa we write a + 1, and so on. All we need to do is add the negative numbers to complete the integers, because natural numbers are integers, aren't they?

Aren't they?

Cough. It's quite nicely painted, this corner. The part about 0 not being the successor of anything is a problem: as soon as the negative numbers are thrown in, 0 is the successor of -1. The Peano numbers formalization was how natural numbers were absorbed into the grand project of the Zermelo-Fraenkel axioms, in which all of mathematics was derived from set theory, starting from 0 defined to be the empty set, and proceeding through 1 upward in a technical operation described fully under set theory, which has the interesting but not exactly intuitive property that 3 = {0, 1, 2} and so on.
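For the curious, here is a minimal sketch in Python (mine, not part of the writeup) of that technical operation: 0 is defined to be the empty set, and each successor n + 1 is n ∪ {n}, which is exactly why 3 comes out as {0, 1, 2}.

  def von_neumann(n):
      """Build the set-theoretic natural number n as nested frozensets."""
      s = frozenset()            # 0 is defined to be the empty set
      for _ in range(n):
          s = s | {s}            # the successor of n is n ∪ {n}
      return s

  three = von_neumann(3)
  # 3 = {0, 1, 2}: its elements are precisely 0, 1 and 2, so it has 3 elements.
  assert three == frozenset({von_neumann(0), von_neumann(1), von_neumann(2)})
  assert len(three) == 3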

What we have to do is embed these in a larger set. It doesn't matter precisely what; it's only a formalism. Find a couple of things you haven't used yet, define minus_sign = {Cookie Monster} and plus_sign = {Kermit the Frog}, then define -2 = {minus_sign, 2} and +2 = {plus_sign, 2}. The point I'm making, with a lot of hand-waving, is that it's not absolutely automatic and god-given that the natural number 2 is the same thing as the integer +2.
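To make the hand-waving slightly more concrete, here is a toy sketch in Python (mine; the names are invented, with strings standing in for the Muppets), mirroring the construction above: an "integer" is just a natural number bundled with one of two otherwise unused tags, and the resulting object +2 is not literally the natural number 2.

  minus_sign = "Cookie Monster"              # any two unused objects will do
  plus_sign = "Kermit the Frog"

  def make_int(sign, magnitude):
      """An 'integer' built out of a tag and a natural number."""
      return frozenset({sign, magnitude})

  plus_two = make_int(plus_sign, 2)
  minus_two = make_int(minus_sign, 2)

  # The integer +2 and the natural number 2 are distinct objects here;
  # identifying them is a convention we choose, not something automatic.
  assert plus_two != 2 and plus_two != minus_two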

Similarly, we think of the integer +2 as being exactly the same as the rational number called 2, and the real number 2, and the complex number 2 + 0i, but we don't have to. Each of these sets is constructed differently. We might insist they're the same, and say that the formal definition that makes them separate is therefore at fault, but we don't have to. Perhaps we make the integers.

Probably full of little typos. /msg to usual address. Oh lord, my finger pauses over the buttons... (thing) or (idea)?
