A random walk is a Markov process {X_1, X_2, ...} on the vertices of a graph G=(V,E) which moves at each step from a vertex to one of its neighbours. So the process is completely determined by the probabilities

P{X_{n+1} = v | X_n = u} = p(u,v)
for all {u,v}∈E: if we're at the vertex u, we pick one of its neighbours v for our next step with probability p(u,v), independently of all previous steps, and go there.

If we are equally likely to go to any of the neighbours, the walk is the particularly important simple random walk (which see).
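To make that concrete, here's a minimal sketch in Python (the little graph and its vertex names are just made up for illustration); the simple random walk is the case where each neighbour of u gets probability 1/deg(u):

```python
import random

# A small example graph as an adjacency list (vertex names made up for illustration).
graph = {
    'a': ['b', 'c'],
    'b': ['a', 'c', 'd'],
    'c': ['a', 'b'],
    'd': ['b'],
}

def simple_random_walk(graph, start, steps):
    """From u, each neighbour v is chosen with probability 1/deg(u)."""
    path = [start]
    u = start
    for _ in range(steps):
        # p(u, v) = 1/deg(u); for a general walk, weight this choice by p(u, v)
        # (e.g. with random.choices and a list of weights).
        u = random.choice(graph[u])
        path.append(u)
    return path

print(simple_random_walk(graph, 'a', 10))
```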

Random walks have been studied very extensively. Here are some easy comments.

For a finite graph G, this is a typical finite Markov process. Assume p(u,v)>0 for every edge {u,v}∈E. Then the walk is aperiodic iff G is not a bipartite graph, and the walk is irreducible iff G is connected. So it's an ergodic process iff χ(G)≠2 (G is not bipartite) and G is connected.
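A toy illustration of the bipartite point (my own example, with two arbitrary little graphs): on a 4-cycle the walk can only be back at its start at even times, so it has period 2, whereas on a triangle returns at times 2 and 3 are both possible:

```python
import random

def return_times(graph, start, steps, trials):
    """Collect the times (up to `steps`) at which the walk is back at its start."""
    times = set()
    for _ in range(trials):
        u = start
        for t in range(1, steps + 1):
            u = random.choice(graph[u])
            if u == start:
                times.add(t)
    return sorted(times)

square   = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}   # a 4-cycle: bipartite
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}              # a triangle: not bipartite

print("4-cycle  return times:", return_times(square, 0, 10, 2000))    # only even times: period 2
print("triangle return times:", return_times(triangle, 0, 10, 2000))  # 2 and 3 both occur: aperiodic
```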

For an infinite graph G, the process is perhaps less familiar. Notions of transience and recurrence are all easy to understand; long-term behaviour is generally understood, although some surprises exist (and more are expected in the less understood regions).

Hm, okay, well here are some even easier comments. A random walk is a really basic idea that doesn't require any understanding of the maths, and which comes up so commonly that it needs explanation in non-technical language.

The usual illustration is of a drunk weaving to and fro. Each step he takes is in a random direction. He might stagger three paces up the street, one down, two up, one down, and so on.

The important thing is that each step is random (and independent). The direction he takes this time has nothing to do with any of the places he has been or the directions he has gone before.

The question is: does he eventually get further and further away from where he started? Or does he tend to hover around the same point?

It would seem intuitively that the drunk should on average oscillate around the starting point, because the movements in each direction will on average cancel each other out.

This is true, up to a point. But imagine that by chance he staggers twelve steps in one direction. Okay, he's now at point X. Remember his future movements are independent of his past ones. So if he hovers on average around his starting point, now his starting point is point X, not his old location twelve steps away. And this is a completely general aspect of a random walk: wherever you are, however you got there, that is now your starting point, and that's now where averages plus and minus should be counted from.

Same thing with coin tossing. On average you get about the same number of heads and tails. But if you've had a run such that you've now got 5 more H than T, then on average this preponderance will continue. It becomes less and less important because in the long run the proportion of heads among all your tosses approaches 0.5 ever more closely. But it can do this while the absolute difference between the numbers of heads and tails becomes greater (from time to time). This factor is often forgotten in gambling, much to the discomfiture of the gambler.
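If you want to see this for yourself, here's a rough simulation sketch (the seed and the snapshot points are arbitrary choices of mine): the proportion of heads creeps towards 0.5 while the absolute lead |H - T| typically keeps growing, roughly like the square root of the number of tosses:

```python
import random

# One long run of fair coin tosses: the proportion of heads settles towards 0.5
# even while the absolute lead |H - T| drifts about (typically like sqrt(n)).
random.seed(1)                       # arbitrary fixed seed, so the run is repeatable
heads = tails = 0
for n in range(1, 1_000_001):
    if random.random() < 0.5:
        heads += 1
    else:
        tails += 1
    if n in (100, 10_000, 1_000_000):
        print(f"after {n:>9} tosses: proportion of heads = {heads / n:.4f}, lead |H-T| = {abs(heads - tails)}")
```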

In fact, in the long run, every point will be visited. It will take longer and longer, the further away it is from wherever you are now, but the probability is 1 that you'll eventually hit any given value. You will eventually be one million steps north of where you started, or have tossed one million more tails than heads.

And you'll come back to it over and over again: this will occur infinitely many times, again with probability 1. It just takes unbearably long to recur: in fact the expected waiting time between occurrences of any given situation is infinite.

This is very easy to program and fascinating to watch for longer than you might think. I programmed a two-dimensional random walk with coloured points and used it as a screen saver, and like any good screen saver it could get you involved. I'd be rooting for red to move across to the left and blot out that last little enclave of aqua, while aqua and yellow were trying to strangle each other in the far upper right.

With coin tosses, program it to display only when the difference exceeds a previous limit. So at first it goes wild, then it settles down with, say, 34 more heads than tails. Most of the time the difference is less, occasionally you see it flip up to 35, then 36, as the random walk takes it higher up in that direction. But after a long time you'll see it suddenly display, say, -27, meaning it's random walked right across to the opposite end and exceeded the other limit. This continues, flipping to and fro, ever more slowly.
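Here's roughly what that program looks like, as a sketch (the cutoff of a million tosses and the names are just my choices): toss a fair coin and print only when the running difference, heads minus tails, sets a new record in either direction:

```python
import random

def show_records(tosses=1_000_000):
    """Print the running difference H - T only when it sets a new record either way."""
    diff = 0
    high = low = 0                          # largest positive and negative leads seen so far
    for n in range(1, tosses + 1):
        diff += 1 if random.random() < 0.5 else -1
        if diff > high or diff < low:
            high, low = max(high, diff), min(low, diff)
            print(f"toss {n:>9}: new record difference {diff:+d}")

show_records()
```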

The probability of 1 applies both one-dimensionally along a line and two-dimensionally around a plane. Surprisingly, it's no longer true in three dimensions: a random walk in space has only about a 0.34 probability of ever returning to its starting point. That the walk is recurrent in one and two dimensions but not in three was proved in 1921 by George Pólya.
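You can get a rough feel for that number by simulation (a sketch only: the step cutoff is arbitrary, and walks that haven't come home by then are counted as never returning, so the estimate comes out a little low):

```python
import random

# Rough Monte Carlo estimate of the chance that a simple random walk on the
# 3-D integer lattice ever returns to its starting point (true value ~0.34).
STEPS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def returned(max_steps=2_000):
    x = y = z = 0
    for _ in range(max_steps):
        dx, dy, dz = random.choice(STEPS)
        x, y, z = x + dx, y + dy, z + dz
        if x == y == z == 0:
            return True
    return False

trials = 5_000
print(sum(returned() for _ in range(trials)) / trials)   # typically lands around 0.32-0.34
```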

With the probabilities equal, as in the examples I've used, it is, as ariels said, a "simple" random walk, but I think it's important to explain the concept clearly under its simpler name of "random walk", because that's the one that gets bandied about so much.
