Quantum mechanics tells us that when we measure an observable quantity of a system, we only ever see certain values -- the eigenvalues of a Quantum Operator, and that afterwards, the system will behave as if it were in the eigenstate corresponding to that eigenvalue.

It turns out that, at least in simple systems, the process of repeatedly measuring a value can inhibit the transition of that system to any other state. In effect, every measurement resets the clock for the transition. To see this effect, we need only make our measurements much faster than the typical time for a transition to occur.
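To make this a bit more concrete, here is a small Python sketch of my own (not taken from any of the sources below): an idealized two-level system, a spin say, driven between its two states at a made-up Rabi frequency and measured ideally n times over a fixed total time. The more often you look, the more likely the spin is still where it started.

import numpy as np

# Toy model of the Zeno effect: between measurements a two-level system (a
# spin, say) oscillates between its two states at Rabi frequency OMEGA, and an
# ideal measurement finds it still in the initial state with probability
# cos^2(OMEGA * tau / 2), resetting the evolution each time it does.

OMEGA = 2 * np.pi      # assumed Rabi frequency (arbitrary units)
T_TOTAL = 0.5          # assumed total evolution time

def survival_probability(n_measurements):
    """Probability of always being found in the initial state over T_TOTAL."""
    tau = T_TOTAL / n_measurements              # interval between measurements
    p_stay_once = np.cos(OMEGA * tau / 2) ** 2  # chance of surviving one interval
    return p_stay_once ** n_measurements        # chance of surviving all of them

for n in (1, 2, 10, 100, 1000):
    print(f"{n:5d} measurements -> survival probability {survival_probability(n):.4f}")

With a single measurement at the end the spin has certainly flipped; with a thousand measurements along the way it almost certainly has not.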

This works very well for simple systems such as an electron spin or a photon polarization. However, for more unstable and complicated systems, like a radioactive nucleus, the energy perturbations added by such repeated measurements can have the opposite effect, called the anti-Zeno effect, and greatly speed up the decay.

The Zeno effect is named after the Greek philosopher Zeno, who lived in the fifth century B.C. Zeno presented several paradoxes regarding infinity, the best known being Achilles and the Tortoise and the flying arrow.

Anyhow, the quantum Zeno effect was proposed by George Sudarshan and Baidyanath Misra of the University of Texas in 1977. They suggested, just like Norton_I says in the previous writeup, that continuous measurements on a quantum system would slow down quantum decay. This is often referred to as "a watched pot never boils" (see FFalcons writeup in that node).

In 2000, Abraham Kofman and Gershon Kurizki of the Weizmann Institute in Israel suggested the opposite effect, the anti-Zeno effect, or boiling the pot by watching. They argued that the time frame in which the Zeno effect is valid is very short. If you do not measure the system often enough, the effect will instead be to speed up the decay.

Both these theories can co-exist, and a recent article in Nature Science Update reports on how Mark Raizen and colleagues at the University of Texas have managed to demonstrate both of them, for systems as complicated as atoms. Previously only the Zeno effect had been shown experimentally, and only in single-particle systems. Raizen used trapped sodium atoms, which decay by tunneling out of the trap. By measuring the system every millionth of a second, the decay slowed considerably. When instead measuring the system every five millionths of a second, the decay increased above its natural rate. Thus, they seem to have demonstrated both Zeno effects at a never before seen level of complexity.

The rest of this writeup is background only. I'll try to explain in a little more detail what these two effects are, in terms of quantum mechanics. The quantum Zeno effect is due to what is called the collapse of the wave function. As you may or may not know, the quantum mechanical wave function of a particle is a probability amplitude, whose square gives the probability of finding the particle at a certain location. When the particle is measured, its wave function is said to collapse into a single, defined state. As soon as the measurement is done, a new wave function governs the particle.
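If that sounds abstract, here is a tiny toy illustration in Python (again my own, not from the sources): a particle that can sit at one of four positions, with the Born rule giving the measurement probabilities and the collapse done by a random draw.

import numpy as np

rng = np.random.default_rng(0)

# Toy wave function: complex amplitudes for a particle at four positions.
amplitudes = np.array([0.5 + 0.0j, 0.5j, -0.5 + 0.0j, 0.5 + 0.0j])
probabilities = np.abs(amplitudes) ** 2       # Born rule: |psi|^2
assert np.isclose(probabilities.sum(), 1.0)   # the state is normalised

# Measuring the position picks one outcome with those probabilities ...
outcome = rng.choice(len(amplitudes), p=probabilities)

# ... and the wave function "collapses": afterwards all the amplitude sits on
# the measured position, and this new state governs the particle from then on.
collapsed = np.zeros_like(amplitudes)
collapsed[outcome] = 1.0
print("measured position:", outcome)
print("collapsed state:  ", collapsed)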

Anyhow, the key is that if you measure the location - or any other quantum characteristic - often enough, you will affect the natural speed of events, such as decay. I'll use a common example: the event where an excited particle returns to its ground state. (See my laser writeup for a short piece on energy states.)

The probability for a particle to go from a higher energy state, called 2, to a lower one, called 1, after a short time t is

P2→1(t) = a·t²

where P is the probability, t is the time you wait between measurements, and a is some constant. The probability for the particle to stay in the higher state is thus

P2→2(t) = 1 - a·t²

Now consider that you wait n times longer before measuring, i.e. you wait n·t. Then the equation becomes

P2→2(n·t) = 1 - a·(n·t)² = 1 - a·n²·t²    (1)

This equation shows that if you never cease measuring the particle, i.e. n ⇒ 0, then the probability P ⇒ 1, which makes sense: the particle will never decay if we measure it constantly.

If you instead measure the particle twice, at intervals of t, the probability that it is still in state 2 becomes the product of the two individual probabilities, or

P2→2(2·t) = (1 - a·t²)²

By using the binomial approximation, this can be written

P2→2(2·t) ≈ 1 - 2·a·t²

which in turn can be generalized, by the same approximation, to n measurements at intervals of t:

P2→2(n·t) ≈ 1 - n·a·t²    (2)

Comparing equations (1) and (2), we see that the more often we measure, the greater the probability that the particle remains in state 2: in equation (1) we measure once after waiting the whole time n·t, while in (2) we measure n times, at intervals of t, over that same total time n·t. Since n² ≥ n, the survival probability in (2) is always at least as large.
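Here is a quick numerical check of that comparison, with the constant a and the interval t picked arbitrarily just for illustration; the "repeated" column is the exact product (1 - a·t²)ⁿ, shown next to its approximation (2).

# Numerical check of equations (1) and (2); a and t are arbitrary choices.
a = 0.01
t = 0.1

for n in (1, 5, 10, 50):
    p_once = 1 - a * (n * t) ** 2        # equation (1): one measurement after n*t
    p_repeated = (1 - a * t ** 2) ** n   # exact: n measurements at intervals of t
    p_approx = 1 - n * a * t ** 2        # equation (2): binomial approximation
    print(f"n = {n:3d}   once: {p_once:.4f}   repeated: {p_repeated:.6f}   approx: {p_approx:.6f}")

Over the same total time n·t, the survival probability with repeated measurements stays close to 1, while with a single measurement it falls off quadratically.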

For the anti-Zeno effect, the theoretical argument is that decay events have some sort of memory time, and that the measurement intervals for which the Zeno effect holds are incredibly small. So small, in fact, that the energies needed for such rapid measurements could well destroy the measured particle. This is due to the uncertainty principle, which makes the variations in energy large over small times. Researchers say that if the intervals are not short enough, then the effect could well be the reverse. How this anti-Zeno theory will be affected by the new results mentioned further up remains to be seen.
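To get a feel for the numbers, here is a back-of-the-envelope sketch of that energy-time trade-off, ΔE ~ ħ/Δt; the intervals below are just examples, not taken from any experiment.

# Energy-time uncertainty: the energy spread grows as the measurement interval
# shrinks. The intervals below are illustrative only.
HBAR = 1.054571817e-34   # reduced Planck constant, in J*s
EV = 1.602176634e-19     # one electronvolt, in J

for dt in (1e-6, 1e-9, 1e-12, 1e-15):
    dE = HBAR / dt
    print(f"interval {dt:.0e} s  ->  energy spread of order {dE / EV:.2e} eV")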

Source: Scientific American and Andrew Hamilton at Dalhousie University for formulae.
