A fast algorithm for multiplying polynomials. The naïve algorithm multiplies term by term, yielding a time complexity of O(m*n) (where m and n are the numbers of terms in the two polynomials, and arithmetic operations are assumed to take unit time).
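As a baseline, the naïve term-by-term method can be sketched in Python (coefficient lists, lowest degree first; the function name is just for illustration):

```python
def naive_mul(p, q):
    """Multiply polynomials given as coefficient lists, lowest degree first.

    Every term of p meets every term of q, hence O(m*n) coefficient products.
    """
    result = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            result[i + j] += a * b  # x^i * x^j contributes to the x^(i+j) coefficient
    return result
```

For example, `naive_mul([1, 2], [3, 4])` computes (1 + 2x)(3 + 4x) = 3 + 10x + 8x², i.e. `[3, 10, 8]`.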

But we can do better here as well! For simplicity, I show the case where n=m=2^{k+1}. We wish to multiply P(x) by Q(x), so we split each polynomial into its lower and upper 2^k terms and write

P(x) = P_{1}(x) + x^{2^k} ⋅ P_{2}(x)

Q(x) = Q_{1}(x) + x^{2^k} ⋅ Q_{2}(x)

and we wish to calculate

P(x)*Q(x) = P_{1}(x)*Q_{1}(x) +
x^{2^k} ⋅ (P_{1}(x)*Q_{2}(x) + P_{2}(x)*Q_{1}(x)) +
x^{2^{k+1}} ⋅ P_{2}(x)*Q_{2}(x).

So define three products:

- A(x) = P_{1}(x)*Q_{1}(x)
- B(x) = P_{2}(x)*Q_{2}(x)
- C(x) = (P_{1}(x)+P_{2}(x))*(Q_{1}(x)+Q_{2}(x))

Then

P(x)*Q(x) = A(x) + x^{2^k} ⋅ (C(x)-A(x)-B(x)) +
x^{2^{k+1}} ⋅ B(x).

Compute each of the three products by the same algorithm, recursing until some small value of k is reached (say, k=1). The net effect is to replace the 4 half-size multiplications required by the naïve algorithm with just 3.
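The whole scheme can be sketched as follows (a minimal illustration, assuming coefficient lists of equal power-of-two length, lowest degree first; helper names are mine):

```python
def poly_add(p, q):
    """Add two coefficient lists of the same length."""
    return [a + b for a, b in zip(p, q)]

def karatsuba_mul(p, q):
    """Multiply two coefficient lists of the same power-of-two length
    using 3 recursive half-size multiplications instead of 4."""
    n = len(p)
    if n <= 2:  # small base case: multiply directly
        result = [0] * (2 * n - 1)
        for i, a in enumerate(p):
            for j, b in enumerate(q):
                result[i + j] += a * b
        return result
    half = n // 2
    p1, p2 = p[:half], p[half:]  # P = P1 + x^half * P2
    q1, q2 = q[:half], q[half:]  # Q = Q1 + x^half * Q2
    a = karatsuba_mul(p1, q1)                               # A = P1*Q1
    b = karatsuba_mul(p2, q2)                               # B = P2*Q2
    c = karatsuba_mul(poly_add(p1, p2), poly_add(q1, q2))   # C = (P1+P2)*(Q1+Q2)
    mid = [ci - ai - bi for ci, ai, bi in zip(c, a, b)]     # C - A - B = P1*Q2 + P2*Q1
    # Recombine: A + x^half * (C - A - B) + x^(2*half) * B
    result = [0] * (2 * n - 1)
    for i, v in enumerate(a):
        result[i] += v
    for i, v in enumerate(mid):
        result[half + i] += v
    for i, v in enumerate(b):
        result[2 * half + i] += v
    return result
```

For instance, `karatsuba_mul([1, 2, 3, 4], [5, 6, 7, 8])` returns `[5, 16, 34, 60, 61, 52, 32]`, the coefficients of (1 + 2x + 3x² + 4x³)(5 + 6x + 7x² + 8x³).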

A calculation similar to that for the Strassen matrix-multiplication algorithm shows that the algorithm runs in time O(3^{k}) = O(n^{log_2 3}) ≈ O(n^{1.585}).

Similar algorithms exist for matrix multiplication and complex number multiplication.
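For complex numbers, the analogous three-for-four trick (often attributed to Gauss) computes (a + bi)(c + di) with three real multiplications instead of four. A sketch:

```python
def complex_mul_3(a, b, c, d):
    """(a + b*i) * (c + d*i) using 3 real multiplications instead of 4."""
    t1 = a * c
    t2 = b * d
    t3 = (a + b) * (c + d)
    real = t1 - t2       # ac - bd
    imag = t3 - t1 - t2  # (a+b)(c+d) - ac - bd = ad + bc
    return real, imag
```

For example, `complex_mul_3(2, 3, 4, 5)` returns `(-7, 22)`, since (2 + 3i)(4 + 5i) = -7 + 22i.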