Counting in unary offers great simplicity and ease of use at the expense of efficiency. Whereas other counting systems try to put more information in each digit, and hence keep overall number length down, unary has the *absolute minimum* amount of information in each digit, meaning numbers get quite long when expressed using the system.

An example of this is an exercise we all did as children: learning to count to ten on our fingers. This is unary counting, as each number takes up one more digit (pun intended) than its predecessor. However, anyone can see this system is horribly inefficient (well, maybe not the five-year-old doing it). While counting up through the low numbers, most of our fingers sit totally unused, wasting many of those god-given appendages:

one: 1
two: 11
three: 111
four: 1111
...
ten: 11111 11111
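A unary encoder is about as simple as encoders get. Here's a minimal Python sketch (the function name `to_unary` is my own, not from any standard library):

```python
def to_unary(n):
    """Represent a non-negative integer as n repeated '1' digits."""
    return "1" * n

print(to_unary(4))        # 1111
print(len(to_unary(10)))  # 10
```

Note the defining property on display: the length of the representation equals the number itself.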

Contrast this with binary counting, where each finger encodes **a whole bit** of information:

one: 00000 00001
two: 00000 00010
three: 00000 00011
four: 00000 00100
...
ten: 00000 01010

As you can see, ten can be expressed with room to spare. In fact, you can count up to 31 *on only one hand* and 1023 when using both!
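The finger-counting claims above are easy to sanity-check in code. A quick sketch using Python's built-in binary formatting (the helper name `fingers_binary` is mine):

```python
def fingers_binary(n, fingers=10):
    """Format n as a fixed-width binary string, one bit per finger."""
    return format(n, "b").zfill(fingers)

print(fingers_binary(10))  # 0000001010

# Maximum count on one hand (5 fingers) and on both (10 fingers):
print(2**5 - 1)    # 31
print(2**10 - 1)   # 1023
```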

To make all this a bit more formal: every positional counting system except unary can express a number, n, in roughly log(n) digits. The base of this logarithm depends on the specific system; it's base-2 for binary, base-8 for octal, base-10 for decimal, and so on. The important point is that as a number gets larger, its unary representation grows linearly, exactly as fast as the number itself, whereas in any other system the representation still grows, but only logarithmically.
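You can watch the growth rates diverge directly by comparing representation lengths as n increases:

```python
# Compare how representation length grows with the number itself.
for n in [10, 100, 1000, 10**6]:
    unary_len = n                     # one digit per unit counted
    binary_len = len(format(n, "b"))  # about log2(n) digits
    decimal_len = len(str(n))         # about log10(n) digits
    print(n, unary_len, binary_len, decimal_len)
```

By n = 1,000,000 the unary string is a million characters long, while binary needs 20 digits and decimal only 7.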

So, is unary totally useless and due to be phased out? I don't think so. It does have some useful aspects. For example, if we are counting something manually, it is convenient to simply append to our running total, rather than having to write out the entire new count each time we increment. Unary is the only counting system that offers this, and it is exactly what traditional tally counting ("four strokes, then a strikethrough") does. The simplicity of unary also makes it very easy to reason about in mathematical settings: in both computability theory and complexity theory, inputs are often presented in unary for convenience.
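The append-only property is worth seeing concretely. In unary, incrementing a written count is a single stroke; in decimal, an increment can force you to rewrite every digit:

```python
# Tallying in unary: each increment is a single appended stroke;
# the digits already written never change.
tally = ""
for _ in range(7):
    tally += "1"

print(tally)       # 1111111
print(len(tally))  # 7

# Contrast with decimal: incrementing 999 to 1000 changes
# every digit of the written representation.
print(str(999 + 1))  # 1000
```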

However, these are fairly restricted uses of unary; decimal counting is the clear victor, and rightfully so. Expressing the age of the universe in years takes only eleven digits in decimal: 13,000,000,000. If we assume about three digits per centimetre of handwriting, that's a little under four centimetres. In unary, however, writing that same number would produce a string of ones long enough to stretch around the equator.
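The equator claim is a back-of-envelope calculation, and it checks out under the stated assumptions (three digits per centimetre; the equator is about 40,075 km):

```python
# How long is 13 billion, written out in unary?
AGE_YEARS = 13_000_000_000  # number of '1' digits needed
DIGITS_PER_CM = 3           # assumed handwriting density
EQUATOR_KM = 40_075         # Earth's equatorial circumference

length_cm = AGE_YEARS / DIGITS_PER_CM
length_km = length_cm / 100_000  # cm -> km

print(round(length_km))          # about 43,000 km
print(length_km > EQUATOR_KM)    # True: a little more than one lap
```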