A dynamical system is a system which consists of a set of possible states and a rule which determines future states from past states. It's a good way to use math to model things from the real world. A ball flying through the air makes a nice example. Its future state (where it's going and what it will be doing when it gets there) depends on its current state (the direction it's headed, how fast, etc.). It's the foundation of the idea that if you know how hard somebody threw the ball a minute ago, or how it was moving at any point after that, you can use a set of rules (either Newtonian physics, or your brain's circuitry) to figure out where it will end up, so that you can catch it.
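To make the "Newtonian physics" route concrete, here is a rough sketch in Python. Everything in it (the function name, the throw height, speed, and angle) is made up for illustration, and air resistance is ignored, so treat it as a toy model rather than a real ball-catching rule.

    import math

    def landing_point(x0, y0, speed, angle_deg, g=9.81):
        # Predict where a thrown ball lands, assuming ideal projectile
        # motion: constant gravity, no air resistance.
        vx = speed * math.cos(math.radians(angle_deg))
        vy = speed * math.sin(math.radians(angle_deg))
        # Solve y0 + vy*t - 0.5*g*t^2 = 0 for the time the ball hits the ground.
        t_land = (vy + math.sqrt(vy**2 + 2 * g * y0)) / g
        return x0 + vx * t_land   # horizontal position at landing

    # Example: released 2 m up, at 15 m/s, 30 degrees above horizontal.
    print(landing_point(x0=0.0, y0=2.0, speed=15.0, angle_deg=30.0))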
Mathematically speaking, dynamical systems usually take the form of discrete time maps or differential equations.
A discrete time map is a dynamical system which works in increments. It takes the conditions at some time t and gives the conditions at a later time, t+a. A good example of such a map is the logistic map, which is a population growth model:

n_{t+1} = n_t r (1 - n_t)

(for more on this, follow the hard link)
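As a very rough sketch, one step of that rule could be written in Python like this; the function name and the values of r and the starting population are arbitrary choices for illustration.

    def logistic_step(n, r):
        # One step of the logistic map: takes the population at time t
        # (as a fraction of its maximum) and returns it at time t+1.
        return n * r * (1 - n)

    n = 0.5   # starting population fraction (arbitrary)
    r = 2.8   # growth parameter (arbitrary)
    for t in range(5):
        print(t, n)
        n = logistic_step(n, r)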
Differential equations are the continuous version of a dynamical system. They relate a system's rate of change to its current state. The equation for the position of an object travelling at constant velocity is a simple example:

x = x_0 + vt

where x is the current position, x_0 is the initial position (a constant), and v is the velocity, or the rate of change of x (dx/dt). t is, of course, time. The equation relates the current state, x, to its rate of change, v. (Strictly speaking, the differential equation here is dx/dt = v, and x = x_0 + vt is its solution.)
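Here's a small Python sketch of that relationship: it steps the state forward using its rate of change (a simple Euler step) and compares the result with the exact formula. The numbers are arbitrary.

    # dx/dt = v, integrated with a basic Euler step, compared to the
    # exact solution x = x_0 + v*t. All values are arbitrary examples.
    x0 = 0.0   # initial position
    v = 3.0    # constant velocity (rate of change of x)
    dt = 0.1   # time step
    x = x0
    for step in range(1, 11):
        x = x + v * dt             # advance the state using its rate of change
        t = step * dt
        print(t, x, x0 + v * t)    # numeric estimate vs exact formula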
It's pretty easy to see that given one set of conditions to feed into the equations, we could theoretically predict the behavior of the system forever, by simply feeding our new conditions back into the equation. (This doesn't always work in practice though; see: chaos theory.) The results we would get by repeated application of a map, or the equivalent solving or estimation of a differential equation, are called orbits or trajectories.
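To make "orbit" concrete, here's a small Python sketch, reusing the logistic map from above with arbitrary values, that collects the trajectory produced by repeatedly feeding the new conditions back into the map.

    def orbit(n0, r, steps):
        # Repeatedly apply the logistic map and record every state visited.
        # The resulting list is the orbit (trajectory) starting from n0.
        trajectory = [n0]
        n = n0
        for _ in range(steps):
            n = n * r * (1 - n)
            trajectory.append(n)
        return trajectory

    print(orbit(n0=0.5, r=3.2, steps=10))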