1 is not prime because a prime number has exactly two distinct factors (no more, no less), namely 1 and itself. For the number 1, "itself" and "1" are the same thing, so it has only one factor and therefore is not prime. People tend to have trouble grasping this, much as they do with the fact that 0.9999... = 1.

Here's a better definition, the one mathematicians actually use:

Let R be an integral domain (a commutative ring with unity in which zero is the only zero divisor). We say an element x is a unit iff x has a multiplicative inverse in R. We say that a nonzero non-unit (this is the important part) element p is prime iff, whenever p divides ab, p divides either a or b.

In the case of Z (the set of integers), for example, 2, 3, -5, 17, and -65537 are primes. Yes, the negatives of primes are also primes. In general, all the associates of a prime are prime (a is an associate of b if a = ub for some unit u).
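If you want to poke at that definition concretely, here is a small brute-force check in Python (my own sketch, not part of the writeup; the function names and the search bound are arbitrary choices) that tests the "p divides ab implies p divides a or p divides b" property for a few integers:

# Brute-force spot-check of the prime property in Z over a finite range.
# This only gathers evidence for small cases; it is not a proof.

def is_unit(x):
    # The units of Z are exactly 1 and -1.
    return x in (1, -1)

def looks_prime(p, bound=50):
    # A nonzero non-unit p "looks prime" if p | a*b forces p | a or p | b
    # for every a, b with |a|, |b| <= bound.
    if p == 0 or is_unit(p):
        return False
    for a in range(-bound, bound + 1):
        for b in range(-bound, bound + 1):
            if (a * b) % p == 0 and a % p != 0 and b % p != 0:
                return False
    return True

# 2, 3, -5, and 17 survive; 4 and 6 do not (6 divides 2*3 but divides
# neither 2 nor 3); 1 and -1 are excluded up front because they are units.
print([p for p in (2, 3, -5, 17, 4, 6, 1, -1) if looks_prime(p)])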

Anyway, my point... whether or not you call 1 a prime has no implications in mathematics whatsoever. However, mathematics is about definitions, and mathematicians must agree on those definitions if they are to communicate at all. If one author calls 1 a prime and another does not, only confusion results. Not that this doesn't happen (I have seen a large number of mathematics texts say zero is not a natural number, and an even larger number say it is), but it still behooves us to eliminate those incongruities and imprecisions whenever possible.

It happens that the above definition, which explicitly excludes units, is usually the most useful one. Why? Because there are a large number of theorems which hold for primes but not for units. `Prime' was chosen to not include units because saying `prime' in most cases and `prime or unit' in the few others is easier than saying `non-unit prime' in most cases and `prime' in a few.

It's easy to argue about definitions here. It's also pointless. 1 is NOT prime, but not just because a gang of mathematics teachers decided to define a prime number as a number with exactly 2 factors. After all, the same gang could just as well have decided to define a prime number as a number divisible only by 1 and by itself.

So why didn't they? Turns out they had a reason...

The reason mathematicians make definitions is to capture important concepts. A good definition is one that lets you state interesting theorems simply. And ensuring that 1 is not prime makes the theorems about primes easier to state.

For instance, the Fundamental Theorem of Arithmetic states that every natural number greater than 1 has a unique factorization as a product of prime numbers, unique up to the order of the factors. If we look at 12, say, then 12=2*2*3, and there's no other way to write it as a product of prime numbers.
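Here's what that looks like computationally. This is a standard trial-division sketch of my own (not code from the writeup); because it always strips off the smallest remaining factor first, the factorization comes out in one canonical order:

def prime_factors(n):
    # Return the prime factorization of an integer n > 1, smallest factors first.
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever is left over is itself prime
    return factors

print(prime_factors(12))    # [2, 2, 3], the only way to write 12 as a product of primes
print(prime_factors(60))    # [2, 2, 3, 5]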

But if we accepted 1 as prime, we'd have infinitely many different factorizations!

12 = 2*2*3
12 = 1*2*2*3
12 = 1*1*2*2*3
12 = 1*1*1*2*2*3
...
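(A quick sanity check of the list above, in Python; math.prod is in the standard library, and the snippet itself is just my illustration.)

from math import prod

# Padding the factorization of 12 with extra 1s never changes the product,
# which is exactly why letting 1 be prime destroys uniqueness.
for k in range(4):
    factorization = [1] * k + [2, 2, 3]
    print(factorization, "->", prod(factorization))   # always 12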
Sure, we could exclude this case by stating the theorem as "every natural number greater than 1 has a unique factorization as a product of prime numbers that are not 1".

But why bother? There's never a situation where calling 1 a prime makes some definition or theorem shorter. And the Fundamental Theorem is important enough that we want to define our concepts to match the objects that appear in it. After all, it is the fundamental theorem: it contains the concepts we want to talk about. And the relevant concept is "primes that aren't 1"...
