Section 7.5 Newton's method

Our final application of differentiation is a very cool application of linear approximations. We study how repeated applications of linear approximations give rise to a very efficient numerical algorithm for finding roots of functions, known as “Newton's method”. We have already seen in Section 6.1 another method for finding roots of a function, namely the bisection method, which resulted from repeated applications of the Intermediate Value Theorem. It turns out that Newton's method is a much more efficient algorithm for finding roots of a function (however, Newton's method may sometimes fail for some choices of starting point). In fact, one can use Newton's method to calculate square roots, such as \(\sqrt{42}\text{,}\) to fairly good accuracy, in your head!

Subsection 7.5.1 Instructional video

Subsection 7.5.2 Key concepts

Concept 7.5.1. Newton's method.

To find a root \(r\) of a differentiable function \(f(x)\) near an \(x\)-value of \(x_1\text{,}\) we apply Newton's Method. The idea is to replace the function by its linearization at \(x_1\text{,}\) and then find the \(x\)-intercept \(x_2\) of the linearization, which is a first approximation of the root of \(f(x)\text{.}\) We then repeat the process at \(x_2\text{,}\) and so on, until we reach a desired numerical precision for the root of \(f(x)\text{.}\) More precisely:

  1. First we calculate \(x_2\text{,}\) the \(x\)-intercept of the tangent line of \(f\) at \((x_1,f(x_1))\) (the linearization of \(f\) at \(x_1\)). Setting the linearization \(y=f(x_1)+f'(x_1)(x-x_1)\) equal to zero and solving for \(x\text{,}\) we get
    \begin{equation*} x_2=x_1-\frac{f(x_1)}{f'(x_1)}. \end{equation*}
  2. Then we find \(x_3\text{,}\) the \(x\)-intercept of the tangent line of \(f\) at \((x_2,f(x_2))\text{,}\) and get
    \begin{equation*} x_3=x_2-\frac{f(x_2)}{f'(x_2)}. \end{equation*}
  3. We can repeat this process as many times as desired. At each step, the \(x\)-intercept of the tangent line of \(f\) at \((x_n, f(x_n))\) is given by
    \begin{equation*} x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}. \end{equation*}
  4. If we want an answer valid to \(k\) decimal places, we stop the process when two successive values \(x_n\) and \(x_{n+1}\) have the exact same first \(k\) decimals. We conclude that \(x_{n+1}\) is then a root of \(f(x)\text{,}\) up to a precision of \(k\) decimal places (a worked example follows this list).
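
For instance, returning to \(\sqrt{42}\) from the introduction, one convenient choice is \(f(x)=x^2-42\text{,}\) whose positive root is \(\sqrt{42}\text{,}\) with starting value \(x_1=6\text{,}\) since \(6^2=36\) is close to \(42\text{.}\) Then
\begin{equation*} x_2=x_1-\frac{f(x_1)}{f'(x_1)}=6-\frac{6^2-42}{2\cdot 6}=6+\frac{6}{12}=6.5, \end{equation*}
and
\begin{equation*} x_3=6.5-\frac{6.5^2-42}{2\cdot 6.5}=6.5-\frac{0.25}{13}\approx 6.48077, \end{equation*}
which already agrees with \(\sqrt{42}\approx 6.48074\) to three decimal places, and which can be carried out mentally.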

Note that Newton's method may sometimes converge very slowly. It can also fail entirely, for a number of reasons: for instance, if the starting point \(x_1\) is not chosen appropriately, or if \(f'(x_n)=0\) at some step, so that the tangent line is horizontal and has no \(x\)-intercept.
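
To make the iteration and the stopping rule concrete, here is a minimal Python sketch (the name newtons_method and its parameters are illustrative choices, not part of the text; it approximates the “same first \(k\) decimals” test by rounding, and it gives up after a fixed number of steps since, as noted above, the method may fail to converge).

    def newtons_method(f, fprime, x1, k=6, max_steps=50):
        # Approximate a root of f near x1 by iterating x_{n+1} = x_n - f(x_n)/f'(x_n).
        x = x1
        for _ in range(max_steps):
            slope = fprime(x)
            if slope == 0:
                raise ZeroDivisionError("f'(x_n) = 0: the tangent line is horizontal")
            x_next = x - f(x) / slope
            # Stop when two successive values agree to k decimal places;
            # rounding is used here as a close stand-in for "same first k decimals".
            if round(x_next, k) == round(x, k):
                return x_next
            x = x_next
        raise RuntimeError("no convergence; try a different starting point x1")

    # Example: the positive root of f(x) = x^2 - 42 is sqrt(42), roughly 6.4807407.
    print(newtons_method(lambda x: x**2 - 42, lambda x: 2 * x, x1=6))

With these choices the sketch returns roughly 6.4807407, matching the hand computation above.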