An algorithm for quantum computers capable of factoring and extracting discrete logarithms in polynomial time. The existence of this algorithm means that if anyone finds a way to make quantum computers practical, public key cryptography schemes like RSA and ElGamal will become totally worthless. Discovered by Peter Shor and presented in 1994; the expanded journal version appeared in 1997 as **Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer**.

The algorithm performs integer factorization by finding the period of the function *f*(*a*) = *x*^{a} (mod *N*), where *N* is the number to be factored and *x* is chosen relatively prime to *N*. This sequence repeatedly cycles through the elements of the cyclic group of powers of *x* modulo *N*, whose cardinality (and hence the period of the sequence) is the smallest positive exponent *r* for which *x*^{r} is congruent to 1 modulo *N*. For a very large *N*, this period would take an enormous amount of classical work to find, but a quantum computer can exploit quantum parallelism to obtain it.
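To make the definition concrete, here is a brute-force classical version of the period search (the function name `order` is my own; a quantum computer exists precisely so that this loop is unnecessary for large *N*):

```python
def order(x, N):
    """Return the smallest r >= 1 with x**r congruent to 1 (mod N).

    Assumes gcd(x, N) == 1, so such an r exists.  This brute-force
    search is exponential in the number of digits of N -- the whole
    point of Shor's algorithm is to find r without it.
    """
    value, r = x % N, 1
    while value != 1:
        value = (value * x) % N
        r += 1
    return r
```

For the example worked through later in this write-up, `order(3, 91)` returns 6.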

The mathematical details of the period-finding step are too complicated to display here (unless someday browsers that support MathML become more common), but in outline: the sequence is loaded into a quantum register, a "quantum" discrete Fourier transform is performed, and the contents of the register are measured. In a simplified view, each measurement yields a random integer multiple of the period, so only a few repeated measurements need be made before two of them have a greatest common divisor that is exactly the period (that is, their multipliers are relatively prime).

Say now we have obtained the period of the sequence by the above method and know the value of *r*. Provided *r* is even, we can rewrite *x*^{r} ≡ 1 (mod *N*), i.e. *x*^{r} - 1 ≡ 0 (mod *N*), as a difference of two squares:

(*x*^{r/2} + 1)(*x*^{r/2} - 1) ≡ 0 (mod *N*).

This says that *N* divides the product of the sum and difference on the left side, so each of those two terms may share a factor with *N* (the exception is *x*^{r/2} ≡ -1 (mod *N*), which yields only trivial divisors and forces a retry with a different *x*). Now, we can use Euclid's Algorithm to calculate the greatest common divisor of each term with *N*; any non-trivial common divisor is a factor of *N*. All of this computation, including the period-finding step, lets a quantum computer perform factorizations in roughly O(n^{2}) time (up to logarithmic factors), where *n* is the number of digits of *N*.
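Assuming the period has already been found (the quantum step), the remaining classical work can be sketched as follows; `split` is a hypothetical helper name:

```python
from math import gcd

def split(N, x, r):
    """Try to find a non-trivial factor of N from the period r of x**a mod N.

    Returns None when r is odd or x**(r/2) is congruent to -1 (mod N);
    in those cases Shor's algorithm simply retries with another random x.
    """
    if r % 2 == 1:
        return None
    y = pow(x, r // 2, N)          # x**(r/2) mod N
    if y == N - 1:                 # the -1 case gives only trivial GCDs
        return None
    for d in (gcd(y - 1, N), gcd(y + 1, N)):
        if 1 < d < N:
            return d
    return None
```

With the numbers from the worked example below, `split(91, 3, 6)` computes 3^{3} mod 91 = 27 and returns the GCD of 26 and 91, which is 13; the other GCD, of 28 and 91, is the cofactor 7.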

For example, let us try to factor *N* = 91. If we choose *x* = 3, the sequence 3^{a} (mod 91) looks like:

1, 3, 9, 27, 81, 61, 1, 3, 9, 27, 81, 61, 1, ...

which evidently has a period of *r* = 6 (for big numbers, you'll really need the quantum computer method to find it, though). So 3^{6} ≡ 1 (mod 91) can be rewritten as (3^{3} + 1)(3^{3} - 1) ≡ 0 (mod 91), or 28⋅26 ≡ 0 (mod 91). This means that 28 and 26 each share a non-trivial factor with 91: the GCD of 28 and 91 is 7, and the GCD of 26 and 91 is 13. Indeed, 91 = 7⋅13.
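The arithmetic of this example is quick to double-check:

```python
from math import gcd

# Replaying the worked example: N = 91, x = 3, period r = 6.
N, x, r = 91, 3, 6
assert pow(x, r, N) == 1            # 3**6 = 729 = 8 * 91 + 1
y = pow(x, r // 2, N)               # 3**3 mod 91 = 27
print(gcd(y + 1, N), gcd(y - 1, N))  # GCDs of 28 and 26 with 91 -> 7 13
```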

Just today (December 20, 2001), a team of scientists from IBM and Stanford University announced that they have built a nuclear magnetic resonance (NMR) quantum computer with 7 qubits that was actually able to implement this algorithm to factor the number 15.