Not every sequence has a limit

One frustrating aspect of learning calculus or analysis is the realization that not every sequence has a limit. Sometimes it's for a prosaic reason: Any sequence with a limit is bounded -- so any unbounded sequence has no limit. Thus, lim_{n→∞} (-1)^n⋅n does not exist. By monkeying about with redefinitions we can say of some (or all) such sequences that their limit is ∞ (sometimes adding a sign of our choice...), but that clearly takes us out of bounds: We end up officially outside of R or C, and cannot do arithmetic or many other things.

What about bounded sequences?

Many bounded sequences have no limit either. The simplest example is the sequence 0,1,0,1,... In fact, any nonconstant periodic sequence has no limit. Using notions of density or the Cesaro limit (the limit of the sequence of running averages), we can say that the sequence 0,1,0,1,... has approximately half "1"s, or even that it has the Cesaro limit 1/2. This works well -- in fact, any periodic sequence has a Cesaro limit.
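
As a quick numerical illustration (a sketch of my own; the helper cesaro_means is not part of the original argument), the running averages of 0,1,0,1,... settle down to 1/2 even though the sequence itself never converges:

```python
# Sketch: Cesaro (running-average) means of the sequence 0,1,0,1,...
# The sequence has no ordinary limit, but its running averages tend to 1/2.

def cesaro_means(seq):
    """Yield the running averages (x_0 + ... + x_n) / (n + 1)."""
    total = 0.0
    for n, x in enumerate(seq):
        total += x
        yield total / (n + 1)

alternating = [n % 2 for n in range(10_000)]   # 0, 1, 0, 1, ...
means = list(cesaro_means(alternating))
for n in (8, 98, 998, 9998):
    print(f"mean of first {n + 1:>5} terms: {means[n]:.5f}")
# Prints averages creeping up towards 0.5, the Cesaro limit of the sequence.
```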

We even have Cesaro limits for more exotic (but bounded) sequences, like the Morse sequence. In other cases, like the Kolakoski sequence, we (still) cannot prove any density results -- so while we suspect its Cesaro limit is 3/2, we do not know it. But in fact, it is very easy to generate a bounded sequence with no Cesaro limit.
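
To see just how easy, here is one standard construction (a sketch, not taken from the original text): a 0/1 sequence built out of blocks whose lengths double. Its running averages keep swinging between roughly 1/3 and 2/3, so it has no Cesaro limit:

```python
# Sketch: a bounded sequence with no Cesaro limit.
# Concatenate blocks of 0s and 1s of doubling length: 0, 11, 0000, 11111111, ...

def block_sequence(num_blocks):
    """Block k consists of 2**k copies of (k mod 2)."""
    seq = []
    for k in range(num_blocks):
        seq.extend([k % 2] * (2 ** k))
    return seq

seq = block_sequence(16)          # 2**16 - 1 = 65535 terms, each 0 or 1 (bounded)
boundary = 0
for k in range(16):
    boundary += 2 ** k            # position just after block k
    avg = sum(seq[:boundary]) / boundary
    print(f"after block {k:2d} ({boundary:5d} terms): running average = {avg:.3f}")
# The averages oscillate (towards 2/3 after each block of 1s, towards 1/3 after
# each block of 0s) instead of converging: no Cesaro limit.
```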

Other -- even weaker -- concepts of "limit" have been defined, for which more sequences have a "limit". But every maths undergrad has always dreamt of defining a limit for all bounded sequences.

Every bounded sequence has "a" Banach "LIMit"

A Banach limit (ab-?)uses some analysis and does away with most of the restrictions. Unfortunately, the price is pretty high.

Just 2 desirable properties

Any limit is a (continuous -- but this being linear analysis, that just means bounded) linear functional on some subspace of ℓ∞, the space of all bounded sequences with the sup norm. To an analyst (or rather, to analysts after Stefan Banach showed them how it's done) this suggests the use of the Hahn-Banach theorem.

Banach noticed the following 2 properties we'd want of any "limit" L:

  1. L(a,a,a,...)=a (the limit of a constant sequence is that constant);
  2. If we define the shift operator σ(x_0, x_1, x_2, ...) = (x_1, x_2, ...), then for any x for which Lx is defined, L(σx) = Lx (the limit of any sequence does not depend on any prefix of the sequence).

Why would we be justified in calling such an L a "limit"? For one thing, suppose the sequence x converges to some x*. Then (by definition) for any ε>0, there exists some N_ε such that if n ≥ N_ε then |x_n - x*| < ε. But this just means that ||σ^{N_ε}x - (x*,x*,x*,...)|| ≤ ε. And since L is bounded, we have that for any ε>0

|Lx - x*| = |L(σ^{N_ε}x) - x*| ≤ c⋅ε
for some constant c=||L||. This means simply that Lx=x* -- Lx is the limit of any convergent sequence x.
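
Spelling the inequality out (a restatement of the step above, making the use of linearity and property (1) explicit):

```latex
% By property (2), applied N_epsilon times, Lx = L(\sigma^{N_\varepsilon} x).
% Replace x^* by L of the constant sequence (property (1)), pull the difference
% inside L by linearity, and then use the boundedness of L:
\[
  |Lx - x^*|
    = \bigl| L(\sigma^{N_\varepsilon} x) - L(x^*, x^*, x^*, \ldots) \bigr|
    = \bigl| L\bigl( \sigma^{N_\varepsilon} x - (x^*, x^*, x^*, \ldots) \bigr) \bigr|
    \le \|L\| \cdot \bigl\| \sigma^{N_\varepsilon} x - (x^*, x^*, x^*, \ldots) \bigr\|
    \le c \, \varepsilon .
\]
```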

It also turns out that the above 2 properties are enough to prove many desirable properties of L. For instance, if

x = (x_1, x_2, ..., x_k, x_1, x_2, ..., x_k, ...)
is some periodic sequence for which Lx is defined, then
Lx = L(σx) = L(σ^2 x) = ... = L(σ^{k-1}x);
Adding all k of these together and writing s = x_1+...+x_k, we have that k⋅Lx = L(s,s,s,...) = s, so Lx = s/k is precisely the average value of the period of x. In particular, L(1,0,1,0,...) = 1/2 if it is defined at all.
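
Written out, the "adding" step is the observation that the k shifted copies of x sum, entry by entry, to a constant sequence (a restatement of the argument above, not an addition to it):

```latex
% Each entry of the sum runs over one full period of x:
\[
  x + \sigma x + \sigma^{2} x + \cdots + \sigma^{k-1} x = (s, s, s, \ldots),
  \qquad s = x_1 + x_2 + \cdots + x_k .
\]
% Apply L, using linearity, property (2) for each L(\sigma^{j} x) = Lx,
% and property (1) for the constant sequence:
\[
  k \cdot Lx = L\bigl( x + \sigma x + \cdots + \sigma^{k-1} x \bigr) = L(s, s, s, \ldots) = s .
\]
```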

Definition / Existence

Our aim is now to get a linear functional LIM on ℓ∞ that satisfies (1) and (2) above. We'll be justified in calling such a functional LIM -- it generalises the concept of limit to all bounded sequences.

We start by defining a linear functional L on a subspace. Let c be the (1-dimensional) subspace of ℓ∞ consisting of all constant sequences; we shall identify an element c∈c with its constant value, and write c=(c,c,...). And let

S = {x - σx : x ∈ ℓ∞} ⊂ ℓ∞
be another (infinite dimensional) subspace. It is easy to see that S∩c={0}: if x - σx were a nonzero constant sequence (c,c,...), then x_{n+1} = x_n - c for every n, making x unbounded. (In particular, S ≠ ℓ∞.)

Define a functional L on S⊕c by L(s+c)=c. L is well-defined since this is a direct sum. And clearly L satisfies conditions (1) and (2). We wish to use the Hahn-Banach theorem in order to extend L to a functional LIM on all of ℓ∞. To do this, we just need to show that L is bounded.
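
Before turning to boundedness, a concrete illustration of this definition (a worked example of my own choosing): the alternating sequence already lies in S⊕c, so its L-value is pinned down before any extension takes place.

```latex
% Take y = (1/4, -1/4, 1/4, -1/4, ...).  Then
\[
  y - \sigma y = (1/2,\, -1/2,\, 1/2,\, -1/2,\, \ldots) \in S ,
\]
% and adding the constant sequence 1/2 recovers the alternating sequence:
\[
  (1, 0, 1, 0, \ldots) = (y - \sigma y) + (1/2,\, 1/2,\, 1/2,\, \ldots) ,
\]
% so L(1,0,1,0,...) = 1/2 directly from the definition L(s+c) = c,
% matching the value found for periodic sequences above.
```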

So suppose that ||s+c|| ≤ 1 for some s = x - σx (x ∈ ℓ∞) and constant c. We shall show that |c|≤1; this will prove that ||L||≤1. Suppose, to the contrary, that |c|≥1+d for some d>0. Since L is linear, WLOG we may assume that c is a positive real number (for we can take some |λ|=1 for which λc is a positive real number, and argue for λ⋅(s+c) instead). So c≥1+d>1.

From ||s+c||≤1 we have, writing x=(x_0,x_1,...), that:

|x_0 - x_1 + c| ≤ 1
|x_1 - x_2 + c| ≤ 1
...
|x_n - x_{n+1} + c| ≤ 1
...
From the first inequality we see that
x_0 + d ≤ x_0 + c - 1 ≤ x_1 (and also x_1 ≤ x_0 + c + 1).
Only the lower bound interests us, so we shall show only that. From the second inequality, we see that
x_2 ≥ x_1 + c - 1 ≥ x_1 + d ≥ x_0 + 2⋅d.
Continuing, we see at every stage that
x_n ≥ x_{n-1} + c - 1 ≥ x_{n-1} + d ≥ x_0 + n⋅d.

But this means that the sequence (x_0,x_1,...) is unbounded -- contradicting our requirement that x ∈ ℓ∞! The only possible conclusion is that indeed |c|≤1 whenever ||s+c||≤1, i.e. that ||L||≤1.

Now that we've seen L is bounded, we may apply the Hahn-Banach theorem, proving the existence of an extension LIM of L to all of ℓ∞. We even see that -- as expected -- ||LIM||≤1.

Nonuniqueness

The proof of existence is somewhat unsatisfying: The Axiom of Choice (conveniently cloaked inside the Hahn-Banach theorem) was used to extend L from a "tiny" subspace of ℓ∞ to all of it. Indeed, it is this very requirement that causes the most difficulties: the extension LIM above is not unique. Extensions are only unique when the original space is dense in the resulting space -- and S⊕c is very far from being dense in ℓ∞.

Where does this Choice manifest itself? Obviously, for "generic" sequences of ℓ∞ -- the sequences we know are there (possibly even by the Axiom of Choice...), but have no idea what they look like.

On the other hand, above we saw that many values of LIM are determined uniquely: the LIM of any periodic, convergent, or even Cesaro convergent sequence is determined uniquely, and obviously no Choice is involved.

However, there are still some important properties which can be proved. For instance, we can show that if a ≤ x_n ≤ b for all n, then a ≤ LIM_n x_n ≤ b. As a result, we have that

lim inf_n x_n ≤ LIM_n x_n ≤ lim sup_n x_n.
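
One way to see the first of these properties (a sketch of the standard argument, taking real scalars for simplicity; over C one intersects closed discs containing [a,b] instead of using a single interval):

```latex
% Write m = (a+b)/2 and r = (b-a)/2, so a <= x_n <= b for all n means
% |x_n - m| <= r, i.e. ||x - (m,m,m,...)|| <= r.  By linearity, property (1)
% on the constant sequence, and ||LIM|| <= 1:
\[
  \bigl| \mathrm{LIM}\,x - m \bigr|
    = \bigl| \mathrm{LIM}\bigl( x - (m, m, m, \ldots) \bigr) \bigr|
    \le \|\mathrm{LIM}\| \cdot \bigl\| x - (m, m, m, \ldots) \bigr\|
    \le r ,
\]
% hence a <= LIM x <= b.  Applying this to the shifted tail \sigma^N x and using
% property (2), LIM x = LIM(\sigma^N x) lies between inf_{n>=N} x_n and
% sup_{n>=N} x_n for every N; letting N -> infinity gives
\[
  \liminf_{n} x_n \;\le\; \mathrm{LIM}_n\, x_n \;\le\; \limsup_{n} x_n .
\]
```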

The LIMit

Banach limits are -- by their very definition -- the widest possible definition of a limit. Because of properties (1) and (2) above, any concept of limit will be subsumed by the Banach limit: it will agree with the Banach limit wherever it is defined, but the Banach limit may be defined when this other limit is not. Since any concept of (finite) limit necessarily requires that the sequence be bounded, the Banach limit is the biggest (finite) limit there can be.

On the other hand, this all-encompassing nature of Banach limits (sorry) limits their use. Whenever we can give an exact value for a Banach limit, we could probably have made do with a more limited definition of "limit". It's the ability to get a limit -- simultaneously -- for all bounded sequences that sets the Banach limit apart as (a bit) more than just a party trick of mathematical analysis.

Results highly similar to what you can get with a Banach limit are possible with the Alaoglu theorem -- which is also often more powerful.
