Consider a continuous function f:G→R^N on a non-empty subset G⊂R^N, and an autonomous differential equation dx/dt=f(x(t)), hereafter x'=f(x), with an equilibrium point 0∈G, that is, f(0)=0.

The equilibrium 0 is described as being stable in the sense of Lyapunov if ∀ε>0 ∃δ>0 such that

If x:I→G is a maximal solution of the differential equation with 0∈I satisfying ||x(0)||<δ
then [0,∞)⊂I
and ||x(t)||≤ε ∀t≥0.

Say what?

A differential equation describes the way in which a system of variables changes with respect to change in another variable, usually time. But it may be that what is of particular interest is preventing change, that is, examining conditions under which there is no change. In the formalisation above, the assertion that 0 is an equilibrium point (sometimes called a critical point) means that our system will, on attaining a value of zero, stay there.

The question of stability then arises when we seek to examine what happens near, but not at, the equilibrium. Is it only 0 that stays settled, or are there other initial conditions that are somehow drawn to a steady state, either that of 0 or another? If not, can we at least guarantee that nothing too outlandish happens whilst only a short distance away from the equilibrium?

If Lyapunov stability is achieved, then the system, whilst not necessarily settling on any one value, stays bounded and remains defined for all future times. Moreover, if we have a particular bound in mind (the ε), then there is a restriction on the initial condition (being within δ of 0) that will ensure we don't exceed that bound.
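To see the ε–δ definition in action, here is a minimal numerical sketch in Python (my own illustrative example, not from the course notes): the undamped harmonic oscillator x1' = x2, x2' = -x1 is Lyapunov stable at the origin but never settles there, and since ||z(t)|| is conserved along its orbits the choice δ = ε works. The integrator, step size and time horizon are arbitrary choices for the illustration.

  import numpy as np

  # Illustrative system (an assumption of this sketch, not from the write-up):
  # the undamped harmonic oscillator x1' = x2, x2' = -x1.
  def f(z):
      x1, x2 = z
      return np.array([x2, -x1])

  def rk4_step(z, h):
      # One classical Runge-Kutta step for z' = f(z).
      k1 = f(z)
      k2 = f(z + 0.5 * h * k1)
      k3 = f(z + 0.5 * h * k2)
      k4 = f(z + h * k3)
      return z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

  eps = 0.5                          # the bound we want to respect
  delta = 0.5                        # candidate delta; here delta = eps suffices
  z = np.array([0.9 * delta, 0.0])   # initial condition with ||z(0)|| < delta

  h, steps = 0.01, 5000
  max_norm = 0.0
  for _ in range(steps):
      z = rk4_step(z, h)
      max_norm = max(max_norm, np.linalg.norm(z))

  print(f"max ||z(t)|| over [0, {h * steps:.0f}] = {max_norm:.4f}  (eps = {eps})")

The trajectory orbits the origin indefinitely without approaching it, yet its norm never exceeds the prescribed ε: bounded and defined for all forward time, but not 'pulled in'.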

Lyapunov's direct method

With the setup as described in the opening paragraph (autonomous system, continuity of f, 0 an equilibrium point), there is a relatively straightforward test for this form of stability.

Lyapunov Stability Theorem: If there is an open neighbourhood U of 0 contained in G, and there exists a continuously differentiable function V from that neighbourhood to the reals satisfying
  • V(0)=0
  • V(z)>0 ∀ z ∈ U\{0}
  • Vf(z) := <∇V(z), f(z)> ≤ 0 ∀ z ∈ U, where < · , · > denotes the inner product
Then, 0 is a stable (in the sense of Lyapunov) equilibrium.
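As a concrete (and entirely illustrative) check of these three conditions, consider the system x' = -y - x^3, y' = x - y^3 with candidate function V(x,y) = x^2 + y^2. Neither the system nor the use of sympy comes from the course notes; this is just a sketch of how the conditions can be verified symbolically.

  import sympy as sp

  x, y = sp.symbols('x y', real=True)

  # Illustrative system (an assumption of this sketch): x' = -y - x**3, y' = x - y**3.
  f_sys = sp.Matrix([-y - x**3, x - y**3])

  # Candidate Lyapunov function V(x, y) = x^2 + y^2.
  V = x**2 + y**2
  gradV = sp.Matrix([sp.diff(V, x), sp.diff(V, y)])

  # Vf(z) := <grad V(z), f(z)>
  Vf = sp.simplify(gradV.dot(f_sys))

  print("V(0,0) =", V.subs({x: 0, y: 0}))   # condition 1: V(0) = 0
  print("Vf     =", Vf)                     # -2*x**4 - 2*y**4, which is <= 0 everywhere
  # Condition 2 (V > 0 away from the origin) is immediate for x^2 + y^2,
  # so the theorem gives Lyapunov stability of 0 for this example system.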

Proof of the Lyapunov Stability Theorem

We need only consider ε sufficiently small that the closed epsilon-ball about 0 is contained in the set U, since we can satisfy any larger bound by satisfying such an epsilon. Moreover, such an epsilon exists since U is open and contains 0 and thus must contain some open neighbourhood of it. Thus we restrict our attention to

B := { z ∈ R^N : ||z|| ≤ ε } ⊂ U
S := ∂B = { z ∈ R^N : ||z|| = ε }

Now, the boundary S is closed and bounded, that is, compact, and V takes strictly positive values on S (since 0 ∉ S, V is never negative, and the only place it takes the value zero is at 0). V is continuous, and continuous functions achieve their extrema on compact sets. Thus μ, the minimum of V(z) for z in S, is strictly greater than zero.

μ > 0

Since V is continuous and attains the value 0 at 0, there must be some neighbourhood of 0 on which V takes values less than μ; that is, since V(0)=0,

∃δ∈(0,ε) such that ||z|| < δ ⇒ V(z) = V(z) - 0 = |V(z)-V(0)| < μ

Given x:I→G a maximal solution of x'=f(x) with 0∈I and ||x(0)||<δ, we claim that [0,∞) ⊂ I with ||x(t)||≤ε for all t≥0. Were this true, the theorem would hold. Let ω be the supremum of I and suppose (for contradiction) that the bound fails: at some time τ ∈ (0,ω)⊂I, x escapes it, that is, ||x(τ)||>ε.

Since initially ||x(0)|| < δ < ε, and ||x(τ)|| > ε, continuity of t→||x(t)|| and the intermediate value theorem force the existence of a σ∈(0,τ) such that ||x(σ)||=ε. Further, we can choose the earliest such σ, so for all t∈[0,σ), ||x(t)||<ε.

Note by the chain rule that dV(x(t))/dt = <∇V(x(t)), x'(t)> = Vf(x(t)). For t∈[0,σ] we have ||x(t)||≤ε, so x(t) stays in B ⊂ U, and Vf is assumed to be less than or equal to 0 everywhere in U, so in particular

dV(x(t))/dt ≤ 0 ∀t∈[0,σ]

This means, however, that t→V(x(t)) is a non-increasing function on [0,σ], so

V(x(σ)) ≤ V(x(0))

But x(σ) is a point on S since ||x(σ)||=ε, so it is clearly at least the minimum on that set, so we obtain

μ ≤ V(x(σ)) ≤ V(x(0))

But this is no good; ||x(0)|| < δ, so V(x(0)) < μ. Adding that in gives us

μ ≤ V(x(σ)) ≤ V(x(0)) < μ
or simply μ<μ

Which is absurd. So ||x(t)||≤ε for any t∈[0,ω). By a property of maximal intervals of existence and compact sets which would be too much of a detour to prove here (roughly, a maximal solution cannot remain inside a compact subset of G as t approaches a finite end of I), this forces ω to be ∞, so we have the desired result of [0,∞)⊂I. This completes the proof (in suitable handwaving fashion).
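The monotonicity of t→V(x(t)) is the engine of the whole argument, and it is easy to watch numerically. A sketch, reusing the illustrative system from after the theorem statement (again my example, not anything from the course notes):

  import numpy as np

  # Illustrative system from the sketch after the theorem statement:
  #   x1' = -x2 - x1**3,  x2' = x1 - x2**3,  with V(z) = ||z||^2.
  def f(z):
      x1, x2 = z
      return np.array([-x2 - x1**3, x1 - x2**3])

  def V(z):
      return float(z @ z)

  def rk4_step(z, h):
      # One classical Runge-Kutta step for z' = f(z).
      k1 = f(z)
      k2 = f(z + 0.5 * h * k1)
      k3 = f(z + 0.5 * h * k2)
      k4 = f(z + h * k3)
      return z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

  z = np.array([0.4, -0.3])
  h = 0.01
  values = [V(z)]
  for _ in range(2000):
      z = rk4_step(z, h)
      values.append(V(z))

  # t -> V(x(t)) should be non-increasing along the trajectory (up to integration
  # error), which is exactly the inequality V(x(sigma)) <= V(x(0)) the proof exploits.
  print("largest increase of V between steps:", max(np.diff(values)))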

Lyapunov Functions

A function satisfying the criteria of the theorem is known as a Lyapunov function. Given the above theorem, the challenge (as Swap points out, often considerable) is to construct such a function for a given system x'=f(x), but the reward is that finding any one such function suffices to ensure Lyapunov stability.
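One classic source of candidates, for what it is worth (my own illustrative example, not something from the notes above): for mechanical systems the total energy often works. For the undamped pendulum θ' = ω, ω' = -sin θ, the energy V(θ,ω) = ω²/2 + (1 - cos θ) vanishes at the rest position, is positive nearby, and is conserved along solutions, so Vf ≡ 0 ≤ 0 and the theorem applies. A quick symbolic check of that last claim:

  import sympy as sp

  theta, omega = sp.symbols('theta omega', real=True)

  # Illustrative system (an assumption of this sketch): the undamped pendulum
  #   theta' = omega,  omega' = -sin(theta).
  f_pend = sp.Matrix([omega, -sp.sin(theta)])

  # Candidate Lyapunov function: total mechanical energy, zero at the equilibrium.
  V = omega**2 / 2 + (1 - sp.cos(theta))
  gradV = sp.Matrix([sp.diff(V, theta), sp.diff(V, omega)])

  Vf = sp.simplify(gradV.dot(f_pend))
  print("Vf =", Vf)   # prints 0: energy is conserved, so Vf <= 0 and
                      # the rest position is Lyapunov stable.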

Asymptotic Stability

Lyapunov stability is not a very strong stability requirement; it does not mean that solutions 'near' to the equilibrium are 'pulled' to the equilibrium over time. This stronger notion is asymptotic stability and requires an additional notion of attractivity (which does not itself imply stability). However, to tackle this concept requires a deeper notion of invariant sets and local flows, whilst enforcing a local Lipschitz condition on f, so I leave it for another node.


Reference: MA40062 ODEs (University of Bath), my revision notes.