An iteration method for converging on a root of a differentiable function. If the current point is some xn, here's how you get to the next point xn+1: take the point (xn,f(xn)) on the graph of the function, and extend the tangent to the graph at that point until it hits the X axis; that's your next point. So, if the graph of f(x) is a straight line, you'll reach the root in one iteration.
In symbols, you get this formula:
xn+1 = xn - f(xn)/f'(xn)
Obviously, we don't want f'(x) to come anywhere near 0 where we're iterating, or we'll divide by 0 (or by a very small number, which can be worse). But under "good" conditions (continuous f'(x), with non-zero derivative at the root), the method converges when x1 is "close enough" to the root, and in fact roughly doubles the number of accurate digits at every iteration (quadratic convergence)! A similar situation exists for complex functions.
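The scalar iteration can be sketched in a few lines of Python; the function names, tolerance, and the choice of f(x) = x^2 - 2 are illustrative, not from the text above:

```python
# A minimal sketch of Newton-Raphson for a scalar function.
def newton(f, fprime, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / fprime(x)  # x_{n+1} = x_n - f(x_n)/f'(x_n)
    return x

# Example: the positive root of f(x) = x^2 - 2, i.e. sqrt(2).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x=1.0)
print(root)
```

Starting from x1 = 1.0, the iterates 1.5, 1.41666..., 1.41421568... show the doubling of accurate digits described above.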
You can get similar iterations in the multivariate case; there the formula becomes the vector formula
xn+1 = xn - (f'(xn))^-1 ⋅ f(xn)
where f(xn) is a vector, and f'(xn) is a matrix (the Jacobian); the condition in the scalar case that f'(x) be far from 0 is replaced by the condition that f'(x) be well conditioned, so that f'(x) can be inverted (and accuracy not lost).
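Here's a hedged sketch of the multivariate case using NumPy; the system (f1 = x^2 + y^2 - 4, f2 = x - y, with a root at x = y = sqrt(2)) and the starting point are made up for illustration. Note that in practice you solve the linear system f'(xn) ⋅ step = f(xn) rather than explicitly inverting the Jacobian, which is cheaper and numerically safer:

```python
import numpy as np

# Illustrative system: f1 = x^2 + y^2 - 4, f2 = x - y.
def f(v):
    x, y = v
    return np.array([x * x + y * y - 4, x - y])

def jacobian(v):
    x, y = v
    return np.array([[2 * x, 2 * y],
                     [1.0,   -1.0]])

v = np.array([1.0, 0.5])  # arbitrary starting guess
for _ in range(20):
    # Solve J(v) * step = f(v) instead of computing (f'(xn))^-1 directly.
    step = np.linalg.solve(jacobian(v), f(v))
    v = v - step
print(v)
```

If the Jacobian were ill conditioned, np.linalg.solve would lose accuracy (or fail outright on a singular matrix), which is exactly the well-conditioning caveat above.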
However, analysing convergence in the multivariate case is very hard (but when it works, it works great!).
A nice example of non-convergence of the Newton-Raphson method is given by taking f(x)=cbrt(x) (cube root) and working out the iteration (it comes out as xn+1 = -2*xn, which diverges very rapidly).
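You can watch the divergence numerically; since Python's ** doesn't handle cube roots of negative numbers cleanly, this sketch uses the closed-form step xn+1 = -2*xn derived above:

```python
# For f(x) = x^(1/3), the Newton step x - f(x)/f'(x) simplifies to
# x - 3x = -2x, so each iterate doubles in magnitude and flips sign.
x = 0.001  # start very close to the root at 0
for n in range(5):
    print(n, x)
    x = -2 * x  # closed-form Newton step for the cube root
```

Even starting at 0.001, five steps later the iterate is at -0.032, and it keeps doubling away from the root forever.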
The Newton-Raphson method requires you to know the derivative of the function. Sometimes this is inconvenient, imprecise (differentiation amplifies error), impractical (e.g. when looking for the zero of a function determined empirically rather than analytically) or impossible (if you're writing a general root finding function, and your chosen API doesn't let you ask also for the derivative of the function). The closest methods which don't use derivatives are the secant method, which replaces f'(xn) with the slope of the secant line through the last two iterates, and the related "regula falsi" (false position) method.
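The secant variant can be sketched as follows; again the names, tolerance, and test function are illustrative assumptions:

```python
# A sketch of the secant method: the slope of the line through the last
# two iterates stands in for the derivative, so no f'(x) is needed.
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            return x1
        # Newton step with f'(x) ~ (f(x1) - f(x0)) / (x1 - x0)
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

# Same test problem as before: sqrt(2) as the root of x^2 - 2.
root = secant(lambda x: x * x - 2, 1.0, 2.0)
print(root)
```

The trade-off is a slightly slower convergence rate than Newton's doubling of digits (superlinear, with exponent about 1.618), in exchange for never evaluating a derivative.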