This writeup deals with a Monte Carlo simulation in the field of Solid State Physics, in particular, surface diffusion. However, the more abstract elements should transfer well to other fields.

At a given temperature and configuration, each atom on the surface of a metal has a known chance of jumping into a neighboring lattice site. It also has some chance of knocking into another atom so that both move at the same time. Complications like these can lead to interesting dynamics, which experiments can see parts of but cannot resolve in full detail. For example, one can take stunning visual profiles of the surface with an STM, but not quickly enough to see anything except the final steady-state outcome; or one can bounce ions off the surface and determine the density of the various species of atom even while the surface is still changing, but without knowing where in particular they are.

With a somewhat simplified model, one can write a Monte Carlo simulation of the time development of this system using any of several algorithms. I have worked with two of them in my thesis work at Haverford:

The Metropolis Algorithm

is named after Nicholas Metropolis, who invented it in the 1950s. The people working with him included the (in?)famous Edward Teller. The algorithm is as follows (I use the term 'move' to refer to whatever simple transformation of the system the implementation deals with):

  1. Pick a random move (in our case, this meant picking an atom and a random direction; if that direction leads into another atom, then pick another random direction for that second atom to move as well).
  2. Figure out how probable this move is, based on the presence of other nearby atoms and the temperature.
  3. Get a random number on (0,1), and if the random number is lower than the probability, carry out the move.
  4. Repeat, and advance the time by 1/(a constant prefactor).

The Metropolis algorithm usually ends up rejecting the moves it proposes. There are ways around this, but unless done carefully they can discount some of the more probable moves, and no matter what they will make the statistics more 'grainy' (not a technical term).
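
To make the four steps concrete, here is a minimal Python sketch of a Metropolis loop on a square lattice. The bond-counting energy model, the parameter values, and the attempt frequency NU0 are all assumptions made purely for illustration (and the concerted two-atom moves mentioned in step 1 are omitted); this is a sketch of the algorithm, not the actual thesis model.

    import math
    import random

    # Minimal Metropolis sketch for surface diffusion on a square lattice.
    # Assumptions (not from the writeup above): energy = -E_BOND * (number of
    # occupied neighbors), attempt frequency NU0, periodic boundaries.
    # All parameter values are illustrative.

    L = 20            # lattice is L x L sites
    E_BOND = 0.3      # assumed bond energy, eV
    KT = 0.025        # assumed k_B * T, eV
    NU0 = 1.0e13      # assumed attempt frequency (the constant prefactor), Hz

    occupied = set()
    for _ in range(80):                       # sprinkle some adatoms around
        occupied.add((random.randrange(L), random.randrange(L)))

    def neighbors(site):
        x, y = site
        return [((x + 1) % L, y), ((x - 1) % L, y),
                (x, (y + 1) % L), (x, (y - 1) % L)]

    def bonds(site, config):
        return sum(1 for n in neighbors(site) if n in config)

    def metropolis_step(config, t):
        # 1. pick a random move: an atom and a direction
        atom = random.choice(tuple(config))
        target = random.choice(neighbors(atom))
        if target in config:                  # concerted moves omitted in this sketch
            return config, t + 1.0 / NU0
        # 2. probability of the move from the change in bond energy (assumed form)
        trial = (config - {atom}) | {target}
        dE = -E_BOND * (bonds(target, trial) - bonds(atom, config))
        p = min(1.0, math.exp(-dE / KT))
        # 3. accept only if a uniform random number on (0,1) falls below p
        if random.random() < p:
            config = trial
        # 4. advance time by 1/(constant prefactor), accepted or not
        return config, t + 1.0 / NU0

    t = 0.0
    for _ in range(10000):
        occupied, t = metropolis_step(occupied, t)
    print(f"simulated {t:.3e} s, {len(occupied)} adatoms on the surface")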

The BKL algorithm

is named after the three people who invented it in the 1970s: Bortz, Kalos, and Lebowitz. The algorithm is as follows:

  1. Figure out how likely every single possible move is.
  2. Pile all of the probabilities on top of each other. By this I mean assign each possible move an interval of the real numbers such that, in (0,Z), where Z is the sum of the probabilities, each number has exactly one move associated with it.
  3. Pick a random number in between zero and Z.
  4. Carry out the move that this number corresponds to.
  5. Then figure out what new moves are possible, what old moves are no longer possible, and how all the other moves have changed.
  6. Repeat, advancing the time by 1/(a constant prefactor x Z before the move).
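
Here is a comparable Python sketch of a BKL step, using the same assumed bond-counting model and illustrative parameters as the Metropolis sketch above. For clarity it rebuilds the entire move list on every step; the whole point of step 5 is that a real implementation only updates the moves affected by the last jump, which is where the mind-bending bookkeeping comes in.

    import bisect
    import math
    import random

    # Minimal BKL (rejection-free) sketch on an assumed square-lattice,
    # bond-counting model.  Parameter values are illustrative only.

    L = 20
    E_BOND = 0.3      # assumed bond energy, eV
    KT = 0.025        # assumed k_B * T, eV
    NU0 = 1.0e13      # assumed attempt frequency (the constant prefactor), Hz

    occupied = set()
    for _ in range(80):
        occupied.add((random.randrange(L), random.randrange(L)))

    def neighbors(site):
        x, y = site
        return [((x + 1) % L, y), ((x - 1) % L, y),
                (x, (y + 1) % L), (x, (y - 1) % L)]

    def bonds(site, config):
        return sum(1 for n in neighbors(site) if n in config)

    def possible_moves(config):
        # 1. every single-atom hop into an empty neighboring site, with its probability
        moves = []
        for atom in config:
            for target in neighbors(atom):
                if target not in config:
                    trial = (config - {atom}) | {target}
                    dE = -E_BOND * (bonds(target, trial) - bonds(atom, config))
                    moves.append((atom, target, min(1.0, math.exp(-dE / KT))))
        return moves

    def bkl_step(config, t):
        moves = possible_moves(config)
        # 2. pile the probabilities on top of each other: cumulative sums up to Z
        cumulative, total = [], 0.0
        for _, _, p in moves:
            total += p
            cumulative.append(total)
        # 3. pick a random number between zero and Z
        r = random.uniform(0.0, total)
        # 4. carry out the move that this number corresponds to
        atom, target, _ = moves[bisect.bisect_left(cumulative, r)]
        config = (config - {atom}) | {target}
        # 5.-6. (naive version: the move list is simply rebuilt next call)
        # advance time by 1/(prefactor * Z before the move); nothing is ever rejected
        return config, t + 1.0 / (NU0 * total)

    t = 0.0
    for _ in range(1000):
        occupied, t = bkl_step(occupied, t)
    print(f"simulated {t:.3e} s with 1000 accepted moves")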

Overview:

The Metropolis algorithm is much more widely used, since each step is much faster and since the computer programming required to make it work is much less mind-bending. However, if the probability of an arbitrary individual move is very low (if, say, the temperature is low, so the atoms rarely have enough energy to overcome their barriers), then the Metropolis algorithm will spend almost all of its time throwing moves out... enough to slow it down beyond the time the BKL algorithm would spend figuring out the consequences of its actions.


@waterhouse, below: Your distinction counts Monte Carlo evaluations as Monte Carlo simulations. The distinction is important. Monte Carlo evaluations are experimental in nature: attempts to determine the probability of the various outcomes by measuring repeatedly. You start out with no information, and assume that the information you get is typical (the more attempts, the more reliable). This is actually the quintessential experimental situation, and 'Monte Carlo' would not enter into the thought process of the experimenter.
Monte Carlo simulations are theoretical in nature: attempts to determine the behavior of a system based on simple assumptions stated in probabilistic terms. You start out with all of the information, but you stochastically synthesize it into a usable form.
All (or nearly all) active experimentation is Monte Carlo, while non-Monte Carlo theory is common.