There is an indirect proof that a function has only one root in a given interval; it relies on the Mean Value Theorem. Allow me to illustrate this with an example:

Let: f(x) = x^2 + 2x cos(x) - 1
a) Show that f'(x) > 0 on the interval (0,1).
f'(x) = 2x + 2cos(x) - 2x sin(x) = 2x(1 - sin(x)) + 2cos(x)
For 0 < x < 1 we have 0 < sin(x) < 1, so 0 < 1 - sin(x) < 1 and hence 2x(1 - sin(x)) > 0.
Also, since 0 < x < 1 < pi/2, we have cos(x) > 0.
Therefore f'(x) > 0 for all x in (0,1).
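If you would like a quick numerical sanity check of part a), here is a minimal Python sketch (the grid of sample points is an arbitrary choice of mine, and sampling is evidence rather than proof); it evaluates f'(x) across (0,1) and confirms every sampled value is positive:

    import math

    def fprime(x):
        # f'(x) = 2x(1 - sin(x)) + 2cos(x), from part a)
        return 2 * x * (1 - math.sin(x)) + 2 * math.cos(x)

    # Sample the open interval (0,1) on a fine grid; every value
    # should be strictly positive if the argument above is correct.
    samples = [i / 1000 for i in range(1, 1000)]
    assert all(fprime(x) > 0 for x in samples)
    print(min(fprime(x) for x in samples))  # roughly 1.4, near x = 1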

b) Using the Intermediate Value Theorem, show that f(x) has a root in the interval (0,1).
f(0) = -1 < 0
f(1) = 1 + 2cos(1) - 1 = 2cos(1) > 0
f(x) is continuous on [0,1] and its value goes from negative to positive; thus by the Intermediate Value Theorem it must have at least one root in (0,1).
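The root the theorem promises can be located with a bisection sketch. This is plain Python assuming nothing beyond the definition of f above; the iteration count is an arbitrary choice:

    import math

    def f(x):
        return x**2 + 2 * x * math.cos(x) - 1

    # f(0) = -1 < 0 and f(1) = 2cos(1) > 0, so a sign change is trapped
    # in [0,1]; halving the bracket keeps the sign change inside it.
    lo, hi = 0.0, 1.0
    for _ in range(50):
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    print((lo + hi) / 2)  # roughly 0.444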

c) Using the Mean Value Theorem, show that f(x) = 0 has only one root in the interval (0,1).
We shall prove this by contradiction. (See: Proof By Contradiction.)
Assume that f(x) has two roots c1 and c2 in the given interval, with c1 < c2.
f(c1) = f(c2) = 0
By the Mean Value Theorem there is some c in (c1, c2) such that:
f'(c)(c2 - c1) = f(c2) - f(c1) = 0
Since c2 - c1 > 0, this forces f'(c) = 0.
But c lies in (c1, c2), which is contained in (0,1), and in part a) it was shown that f'(x) > 0 throughout (0,1).
We have a contradiction, so the assumption that there are two roots must be false: f(x) = 0 has exactly one root in (0,1).
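The uniqueness conclusion can also be checked empirically by counting sign changes of f on a fine grid over (0,1). Again this is only a sketch (the grid resolution is an arbitrary choice), but since f is strictly increasing there, exactly one crossing should turn up:

    import math

    def f(x):
        return x**2 + 2 * x * math.cos(x) - 1

    # Count negative-to-nonnegative crossings between consecutive grid
    # points; a strictly increasing f should produce exactly one.
    xs = [i / 10000 for i in range(1, 10001)]
    crossings = sum(1 for a, b in zip(xs, xs[1:]) if f(a) < 0 <= f(b))
    print(crossings)  # 1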

See also: Intermediate Value Theorem, Mean Value Theorem

There is actually a very simple way to understand this physically. If a function is everywhere differentiable, then the only way its graph can turn around is if its derivative becomes zero and then changes sign.
This means that if a differentiable function crosses the x-axis once, then unless its derivative becomes zero and changes sign, it cannot turn back for another crossing. If the derivative is always positive the function keeps increasing, and if the derivative is always negative the function keeps decreasing. Thus if the derivative maintains its sign, the function cannot have two roots.
