Newton's Method for Finding Roots


Description

Read this section. Work through practice problems 1-6.

Newton's Method for Finding Roots

Newton's method is a process which can find roots of functions whose graphs cross or just kiss the x–axis. Although this method is a bit harder to apply than the Bisection Algorithm, it often finds roots that the Bisection Algorithm misses, and it usually finds them faster.



Source: Dale Hoffman, https://s3.amazonaws.com/saylordotorg-resources/wwwresources/site/wp-content/uploads/2012/12/MA005-3.8-Newtons-Method.pdf
This work is licensed under a Creative Commons Attribution 3.0 License.

Off on a Tangent

The basic idea of Newton's Method is remarkably simple and graphic (Fig. 1):



at a point (\mathrm{x}, \mathrm{f}(\mathrm{x})) on the graph of \mathrm{f}, the tangent line to the graph of \mathrm{f} "points toward" a root of \mathrm{f}, a place where the graph touches the \mathrm{x}-axis.

If we want to find a root of f, all we need to do is pick a starting value x_{0}, go up or down to the point \left(\mathrm{x}_{0}, \mathrm{f}\left(\mathrm{x}_{0}\right)\right) on the graph of \mathrm{f}, build a tangent line there, and follow the tangent line to where it crosses the \mathrm{x}-axis, say at \mathrm{x}_{1}.

If x_{1} is a root of f, then we are done. If x_{1} is not a root of f, then x_{1} is usually closer to the root than \mathrm{x}_{0} was, and we can repeat the process, using \mathrm{x}_{1} as our new starting point. Newton's method is an iterative procedure, that is, the output from one application of the method becomes the starting point for the next application.

Let's start with a differentiable function f(x)=x^{2}-5, (Fig. 2) whose roots we already know, \mathrm{x}=\pm \sqrt{5} \approx \pm 2.236067977, and illustrate how Newton's method works. First we pick some value for x_{0}, say x_{0}=4 for this example, and move to the point \left(x_{0}, f\left(x_{0}\right)\right)= (4,11) on the graph of \mathrm{f}.


At (4,11), the graph of \mathrm{f} "points to" a location on the \mathrm{x}-axis which is closer to the root of \mathrm{f} (Fig. 3). We can calculate this location on the \mathrm{x}-axis by finding the equation of the line tangent to the graph of \mathrm{f} at the point (4,11) and then finding where this tangent line intersects the \mathrm{x}-axis:


At the point (4,11), the line tangent to \mathrm{f} has slope \mathrm{m}=\mathrm{f}^{\prime}(4)=2(4)=8, so the equation of the tangent line is \mathrm{y} -11=8(x-4). Setting y=0, we can find where the tangent line crosses the x-axis:

0-11=8(x-4), so x=4-\frac{11}{8}=\frac{21}{8}=2.625

Call this new value x_{1}: x_{1}=2.625.

The point \mathrm{x}_{1}=2.625 is closer to the actual root, but it certainly does not equal the actual root. If Newton's method stopped after one step with the estimate of 2.625, it would not be very useful. Instead, we can use this new value for \mathrm{x}, \mathrm{x}_{1}=2.625, to repeat the procedure (Fig. 4):


(i) move to the point \left(\mathrm{x}_{1}, \mathrm{f}\left(\mathrm{x}_{1}\right)\right)=(2.625,1.890625) on the graph,

(ii) find the equation of the tangent line at the point \left(\mathrm{x}_{1}, \mathrm{f}\left(\mathrm{x}_{1}\right)\right):

y-1.890625=5.25(x-2.625)

(iii) find the new value where the tangent line intersects the \mathrm{x}-axis and call it \mathrm{x}_{2}: \quad \mathrm{x}_{2}=2.264880952.

When we continue repeating this process, (Fig. 5) using each new estimate for the root of f(x)=x^{2}-5 as the beginning point for calculating the next estimate, we get:

\begin{array}{lll}
\text{Beginning estimate:} & \mathrm{x}_{0}=4 & \text{(0 correct digits)} \\
\text{after 1 iteration:} & \mathrm{x}_{1}=\underline{2.625} & \text{(1 correct digit)} \\
\text{after 2 iterations:} & \mathrm{x}_{2}=\underline{2.264880952} & \text{(2 correct digits)} \\
\text{after 3 iterations:} & \mathrm{x}_{3}=\underline{2.236251252} & \text{(4 correct digits)} \\
\text{after 4 iterations:} & \mathrm{x}_{4}=\underline{2.236067985} & \text{(8 correct digits)}
\end{array}


It only took 4 iterations to get an approximation of \sqrt{5} which is within 0.000000008 of the exact value. One more iteration gives an approximation \mathrm{x}_{5} which has 16 correct digits. If we start with \mathrm{x}_{0}=-2 (or any negative number), then the values of \mathrm{x}_{\mathrm{n}} approach -\sqrt{5} \approx -2.236067977.
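These iterations are easy to carry out on a computer. Here is a short Python sketch (the function names, loop bound, and print formatting are our own choices) that reproduces the iterates in the table above for \mathrm{f}(\mathrm{x})=\mathrm{x}^{2}-5 with \mathrm{x}_{0}=4:

```python
# Newton's method iterates for f(x) = x^2 - 5, starting at x0 = 4.
def f(x):
    return x**2 - 5

def fprime(x):
    return 2 * x

x = 4.0
for n in range(1, 6):
    x = x - f(x) / fprime(x)   # follow the tangent line to the x-axis
    print(f"x_{n} = {x:.9f}")  # the estimates rapidly approach sqrt(5) = 2.236067977...
```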

Fig. 6 shows the process for Newton's Method, starting with \mathrm{x}_{0} and graphically finding the locations on the \mathrm{x}-axis of \mathrm{x}_{1}, \mathrm{x}_{2}, and \mathrm{x}_{3}.


Practice 1: Find where the tangent line to \mathrm{f}(\mathrm{x})=\mathrm{x}^{3}+3 \mathrm{x}-1 at (1, \, 3) intersects the \mathrm{x}-axis.

Practice 2: A starting point and a graph of \mathrm{f} are given in Fig. 7. Label the approximate locations of the next two points on the \mathrm{x}-axis which are found by Newton's method.


The Algorithm for Newton's Method

Rather than deal with each particular function and starting point, let's find a pattern for a general function \mathrm{f}. For the starting point \mathrm{x}_{0}, the slope of the tangent line at the point \left(\mathrm{x}_{0}, \mathrm{f}\left(\mathrm{x}_{0}\right)\right) is \mathrm{f}^{\prime}\left(\mathrm{x}_{0}\right), so the equation of the tangent line is y-f\left(x_{0}\right)=f^{\prime}\left(x_{0}\right)\left(x-x_{0}\right). This line intersects the x-axis when y=0, so 0-\mathrm{f}\left(\mathrm{x}_{0}\right)=\mathrm{f}^{\prime}\left(\mathrm{x}_{0}\right)\left(\mathrm{x}-\mathrm{x}_{0}\right) and \mathrm{x}_{1}=\mathrm{x}=\mathrm{x}_{0}-\frac{\mathrm{f}\left(\mathrm{x}_{0}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{0}\right)}. Starting with \mathrm{x}_{1} and repeating this process we have \mathrm{x}_{2}=x_{1}-\frac{f\left(x_{1}\right)}{f^{\prime}\left(x_{1}\right)}; starting with x_{2}, we get x_{3}=x_{2}-\frac{f\left(x_{2}\right)}{f^{\prime}\left(x_{2}\right)}; and so on.

In general, if we start with x_{n}, the line tangent to the graph of f at the point \left(x_{n}, f\left(x_{n}\right)\right) intersects the x-axis at the point x_{n+1}=x_{n}-\frac{f\left(x_{n}\right)}{f^{\prime}\left(x_{n}\right)}, our new estimate for the root of f.

Algorithm for Newton's Method:

(1) Pick a starting value \mathrm{x}_{0} (preferably close to a root of \mathrm{f}).

(2) For each estimate \mathrm{x}_{\mathrm{n}}, calculate a new estimate \mathrm{x}_{\mathrm{n}+1}=\mathrm{x}_{\mathrm{n}}-\frac{\mathrm{f}\left(\mathrm{x}_{\mathrm{n}}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{\mathrm{n}}\right)}.

(3) Repeat step (2) until the estimates are "close enough" to a root or until the method "fails".
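The three steps translate directly into a program. The Python sketch below is one reasonable implementation; the tolerance, the iteration cap, and the choice to stop when the step size is small are our own interpretations of "close enough" and "fails":

```python
def newton(f, fprime, x0, tolerance=1e-10, max_iterations=50):
    """Newton's method: iterate x_{n+1} = x_n - f(x_n)/f'(x_n) starting at x0."""
    x = x0
    for _ in range(max_iterations):
        slope = fprime(x)
        if slope == 0:                  # horizontal tangent line: the method fails here
            raise ZeroDivisionError(f"f'({x}) = 0; pick another starting value")
        step = f(x) / slope
        x = x - step
        if abs(step) < tolerance:       # successive estimates are "close enough"
            return x
    raise RuntimeError(f"no convergence after {max_iterations} iterations")

# Example: approximate sqrt(5) as a root of x^2 - 5, starting at x0 = 4.
print(newton(lambda x: x**2 - 5, lambda x: 2 * x, 4))   # about 2.2360679...
```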

When the algorithm for Newton's method is used with \mathrm{f}(\mathrm{x})=\mathrm{x}^{2}-5, the function at the beginning of this section, we have \mathrm{f}^{\prime}(\mathrm{x})=2 \mathrm{x} so

\begin{aligned} x_{n+1} &=x_{n}-\frac{f\left(x_{n}\right)}{f^{\prime}\left(x_{n}\right)}=x_{n}-\frac{x_{n}^{2}-5}{2 x_{n}}=\frac{2 x_{n}^{2}-\left(x_{n}^{2}-5\right)}{2 x_{n}} \\ &=\frac{x_{n}^{2}+5}{2 x_{n}}=\frac{1}{2}\left\{x_{n}+\frac{5}{x_{n}}\right\} \end{aligned}

The new approximation, x_{n+1}, is the average of the previous approximation, x_{n}, and \mathrm{5} divided by the previous approximation, 5 / \mathrm{x}_{\mathrm{n}}. Problem 16 asks you to show that this pattern, called Heron's method, approximates the square root of any positive number. Just replace the "\mathrm{5}" with the number whose square root you want.
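Heron's method is simple enough to state in a few lines of code. The sketch below assumes a positive number a and a positive starting guess; the names and the fixed iteration count are our own:

```python
def heron_sqrt(a, x0, iterations=6):
    """Approximate sqrt(a), a > 0, by repeatedly averaging x and a/x (Heron's method)."""
    x = x0
    for _ in range(iterations):
        x = 0.5 * (x + a / x)    # x_{n+1} = (x_n + a/x_n) / 2
    return x

print(heron_sqrt(5, 4))          # about 2.2360679..., the value found above
```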


Example 1: Use Newton's method to approximate the root(s) of \mathrm{f}(\mathrm{x})=2 \mathrm{x}+\mathrm{x} \sin (\mathrm{x}+3)-5. Solution: \mathrm{f}^{\prime}(\mathrm{x})=2+\mathrm{x} \cos (\mathrm{x}+3)+\sin (\mathrm{x}+3) so

x_{n+1}=x_{n}-\frac{f\left(x_{n}\right)}{f^{\prime}\left(x_{n}\right)}=x_{n}-\frac{2 x_{n}+x_{n} \sin \left(x_{n}+3\right)-5}{2+x_{n} \cos \left(x_{n}+3\right)+\sin \left(x_{n}+3\right)}

The graph of \mathrm{f}(\mathrm{x}) for -4 \leq \mathrm{x} \leq 6 (Fig. 8) indicates only one root of \mathrm{f}, and that root is near \mathrm{x}=3, so pick \mathrm{x}_{0}=3. Then Newton's method yields the values \mathrm{x}_{0}=3, \mathrm{x}_{1}=\underline{2.96484457}, \mathrm{x}_{2}=\underline{2.96446277}, \mathrm{x}_{3}=\underline{2.96446273} \quad (the underlined digits agree with the exact root).
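The same iteration formula can be checked numerically. This Python sketch (using the standard math module; the loop bound and formatting are our choices) reproduces the values above:

```python
import math

def f(x):
    return 2 * x + x * math.sin(x + 3) - 5

def fprime(x):
    return 2 + x * math.cos(x + 3) + math.sin(x + 3)

x = 3.0                          # starting value suggested by the graph
for n in range(1, 4):
    x = x - f(x) / fprime(x)
    print(f"x_{n} = {x:.8f}")    # compare with x_1, x_2, x_3 above
```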


If we had picked \mathrm{x}_{0}=4, Newton's method would have required \mathrm{4} iterations to get \mathrm{9} digits of accuracy. If \mathrm{x}_{0}=5, then \mathrm{7} iterations are needed to get \mathrm{9} digits of accuracy. If we pick x_{0}=5.1, then the values of x_{n} are not close to the actual root after even \mathrm{100} iterations, \mathrm{x}_{100} \approx \, -49.183. Picking a good value for \mathrm{x}_{0} can result in values of \mathrm{x}_{\mathrm{n}} which get close to the root quickly. Picking a poor value for \mathrm{x}_{0} can result in \mathrm{x}_{\mathrm{n}} values which take longer to get close to the root or which don't approach the root at all.

Note: An examination of the graph of the function can help you pick a "good" \mathbf{x}_{0}.


Practice 3: Put \mathrm{x}_{0}=3 and use Newton's method to find the first two iterates, x_{1} and x_{2}, for the function f(x)=x^{3}-3 x^{2}+x-1


Example 2: The function in Fig. 9 has roots at \mathrm{x}=3 and x=7. If we pick x_{0}=1 and apply Newton's method, which root do the iterates, the \mathrm{x}_{\mathrm{n}}, approach?


Solution: The iterates of \mathrm{x}_{0}=1 are labeled in Fig. 10. They are approaching the root at \mathrm{7}.


Practice 4: For the function in Fig. 11, which root do the iterates of Newton's method approach if

(a) x_{0}=2?

(b) \mathrm{x}_{0}=3?

(c) \mathrm{x}_{0}=5?

Iteration

We have been emphasizing the geometric nature of Newton's method, but Newton's method is also an example of iterating a function. If \mathrm{N}(\mathrm{x})=\mathrm{x}-\frac{\mathrm{f}(\mathrm{x})}{\mathrm{f}^{\prime}(\mathrm{x})}, the "pattern" in the algorithm, then

\mathrm{x}_{1}=\mathrm{x}_{0}-\frac{\mathrm{f}\left(\mathrm{x}_{0}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{0}\right)}=\mathrm{N}\left(\mathrm{x}_{0}\right),

\mathrm{x}_{2}=\mathrm{x}_{1}-\frac{\mathrm{f}\left(\mathrm{x}_{1}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{1}\right)}=\mathrm{N}\left(\mathrm{x}_{1}\right)=\mathrm{N}\left(\mathrm{N}\left(\mathrm{x}_{0}\right)\right)=\mathrm{N} \circ \mathrm{N}\left(\mathrm{x}_{0}\right),

\mathrm{x}_{3}=\mathrm{N}\left(\mathrm{x}_{2}\right)=\mathrm{N} \circ \mathrm{N} \circ \mathrm{N}\left(\mathrm{x}_{0}\right), and, in general,

\mathrm{x}_{\mathrm{n}}=\mathrm{N}\left(\mathrm{x}_{\mathrm{n}-1}\right)=\mathrm{n}^{\text {th }} iteration of \mathrm{N} starting with \mathrm{x}_{0} .

At each step, we are using the output from the function \mathrm{N} as the next input into \mathrm{N}.
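In code, the iteration function \mathrm{N} can be built once and then applied repeatedly. A minimal sketch, using f(x)=x^{2}-5 as the example:

```python
def make_N(f, fprime):
    """Return the Newton iteration function N(x) = x - f(x)/f'(x)."""
    return lambda x: x - f(x) / fprime(x)

N = make_N(lambda x: x**2 - 5, lambda x: 2 * x)

x1 = N(4)          # N(x0)
x2 = N(N(4))       # N(N(x0))
x3 = N(N(N(4)))    # N(N(N(x0)))
print(x1, x2, x3)  # each output becomes the next input
```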

What Can Go Wrong?

When Newton's method works, it usually works very well and the values of the \mathrm{x}_{\mathrm{n}} approach a root of \mathrm{f} very quickly, often doubling the number of correct digits with each iteration. There are, however, several things which can go wrong.

One obvious problem with Newton's method is that \mathrm{f}^{\prime}\left(\mathrm{x}_{\mathrm{n}}\right) can be 0. Then we are trying to divide by 0 and x_{n+1} is undefined. Geometrically, if f^{\prime}\left(x_{n}\right)=0, then the tangent line to the graph of f at x_{n} is horizontal and does not intersect the \mathrm{x}-axis at one point (Fig. 12). If \mathrm{f}^{\prime}\left(\mathrm{x}_{\mathrm{n}}\right)=0, just pick another starting value \mathrm{x}_{0} and begin again. In practice, a second or third choice of \mathrm{x}_{0} usually succeeds.

There are two other less obvious difficulties that are not as easy to overcome - the values of the iterates \mathrm{x}_{\mathrm{n}} may become locked into an infinitely repeating loop (Fig. 13), or they may actually move farther away from a root (Fig. 14).

  



Example 3: Put x_{0}=1 and use Newton's method to find the first two iterates, \mathrm{x}_{1} and \mathrm{x}_{2}, for the function \mathrm{f}(\mathrm{x}) =x^{3}-3 x^{2}+x-1.

Solution: This is the same function as in the previous Practice problem, but we are using a different starting value for x_{0}. f^{\prime}(x)=3 x^{2}-6 x+1 so

\mathrm{x}_{1}=\mathrm{x}_{0}-\frac{\mathrm{f}\left(\mathrm{x}_{0}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{0}\right)}=1-\frac{\mathrm{f}(1)}{\mathrm{f}^{\prime}(1)}=1-\frac{-2}{-2}=0 and \mathrm{x}_{2} = \mathrm{x}_{1}-\frac{\mathrm{f}\left(\mathrm{x}_{1}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{1}\right)}=0-\frac{\mathrm{f}(0)}{\mathrm{f}^{\prime}(0)}=0-\frac{-1}{1}=1

which is the same as \mathrm{x}_{0}, so \mathrm{x}_{3}=\mathrm{x}_{1}=0 and \mathrm{x}_{4}=\mathrm{x}_{2}=1. The values of \mathrm{x}_{\mathrm{n}} alternate between 1 and 0 and do not approach a root.
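A short computation confirms the loop; this sketch simply iterates the formula from \mathrm{x}_{0}=1 and prints the estimates:

```python
# Newton's method for f(x) = x^3 - 3x^2 + x - 1, starting at x0 = 1.
f = lambda x: x**3 - 3 * x**2 + x - 1
fprime = lambda x: 3 * x**2 - 6 * x + 1

x = 1.0
for n in range(1, 7):
    x = x - f(x) / fprime(x)
    print(f"x_{n} = {x}")    # the values alternate: 0.0, 1.0, 0.0, 1.0, ...
```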

Newton's method behaves badly at only a few starting points for this particular function. For most starting points Newton's method converges to the root of this function.

There are some functions which defeat Newton's method for almost every starting point.


Practice 5: For \mathrm{f}(\mathrm{x})=\sqrt[3]{\mathrm{x}}=\mathrm{x}^{1 / 3} and \mathrm{x}_{0}=1, verify that \mathrm{x}_{1}=-2, \mathrm{x}_{2}=4, \mathrm{x}_{3}=-8. Also try \mathrm{x}_{0}=-3, and verify that the same pattern holds: \mathrm{x}_{\mathrm{n}+1}=-2 \mathrm{x}_{\mathrm{n}}. Graph \mathrm{f} and explain why the Newton's method iterates get farther and farther away from the root at \mathrm{0}.

Newton's method is powerful and quick and very easy to program on a calculator or computer. It usually works so well that many people routinely use it as the first method they apply. If Newton's method fails for their particular function, they simply try some other method.

Chaotic Behavior and Newton's Method

An algorithm leads to chaotic behavior if two starting points which are close together generate iterates which are sometimes far apart and sometimes close together: \left|a_{0}-b_{0}\right| is small but \left|a_{n}-b_{n}\right| is large for lots (infinitely many) of values of \mathrm{n} and \left|\mathrm{a}_{\mathrm{n}}-\mathrm{b}_{\mathrm{n}}\right| is small for lots of values of \mathrm{n}.

The iterates of the next simple algorithm exhibit chaotic behavior. 

A Simple Chaotic Algorithm: Starting with any number between \mathrm{0} and \mathrm{1}, double the number and keep the fractional part of the result: \mathrm{x}_{1} is the fractional part of 2 \mathrm{x}_{0}, \mathrm{x}_{2} is the fractional part of 2 \mathrm{x}_{1}, and in general, x_{n+1}=2 x_{n}-\left[2 x_{n}\right]=2 x_{n}-\operatorname{INT}\left(2 x_{n}\right).

If x_{0}=0.33, then the iterates of the algorithm are 0.66, 0.32 (= the fractional part of 2 \cdot 0.66), 0.64, 0.28, 0.56, \ldots The iterates for two other starting values close to 0.33 are given below, as well as the iterates of 0.470 and 0.471:

\begin{array}{llllll}
\text{start} = \mathbf{x}_{0} & \mathbf{0.32} & \mathbf{0.33} & \mathbf{0.34} & \mathbf{0.470} & \mathbf{0.471} \\
\mathrm{x}_{1} & 0.64 & 0.66 & 0.68 & 0.940 & 0.942 \\
\mathrm{x}_{2} & 0.28 & 0.32 & 0.36 & 0.880 & 0.884 \\
\mathrm{x}_{3} & 0.56 & 0.64 & 0.72 & 0.760 & 0.768 \\
\mathrm{x}_{4} & 0.12 & 0.28 & 0.44 & 0.520 & 0.536 \\
\mathrm{x}_{5} & 0.24 & 0.56 & 0.88 & 0.040 & 0.072 \\
\mathrm{x}_{6} & 0.48 & 0.12 & 0.76 & 0.080 & 0.144 \\
\mathrm{x}_{7} & 0.96 & 0.24 & 0.52 & 0.160 & 0.288 \\
\mathrm{x}_{8} & 0.92 & 0.48 & 0.04 & 0.320 & 0.576 \\
\mathrm{x}_{9} & 0.84 & 0.96 & 0.08 & 0.640 & 0.152
\end{array}
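The doubling algorithm is one line of arithmetic per step, so the table is easy to reproduce. A sketch (the rounding is ours, to keep the display readable; a program that runs many iterations should use exact fractions, since binary floating point slowly drifts away from the exact decimal values):

```python
def doubling_iterates(x0, steps):
    """Iterate "double the number and keep the fractional part," starting at x0."""
    values = []
    x = x0
    for _ in range(steps):
        x = 2 * x - int(2 * x)        # keep only the fractional part of 2x
        values.append(round(x, 3))    # rounded for display
    return values

for start in (0.32, 0.33, 0.34, 0.470, 0.471):
    print(start, doubling_iterates(start, 9))   # compare with the columns above
```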

There are starting values as close together as we want whose iterates are far apart infinitely often. Many physical, biological, and business phenomena exhibit chaotic behavior. Atoms can start out within inches of each other and several weeks later be hundreds of miles apart. The idea that small initial differences can lead to dramatically diverse outcomes is sometimes called the "Butterfly Effect," from the title of a talk ("Predictability: Does the Flap of a Butterfly's Wings in Brazil Set Off a Tornado in Texas?") given by Edward Lorenz, one of the first people to investigate chaos. The "butterfly effect" has important implications about the possibility, or rather the impossibility, of accurate long-range weather forecasting. Chaotic behavior is also an important aspect of studying turbulent air and water flows, the incidence and spread of diseases, and even the fluctuating behavior of the stock market.

Newton's method often exhibits chaotic behavior and, since it is relatively easy to study, is often used as a model for studying the properties of chaotic behavior. If we use Newton's method to approximate the roots of \mathrm{f}(\mathrm{x})=\mathrm{x}^{3}-\mathrm{x} (with roots 0, +1 and -1), then starting points which are very close together can have iterates which converge to different roots. The iterates of .4472 and .4473 converge to the roots \mathrm{0} and +1, respectively. The iterates of the middle point .44725 converge to the root -1, and the iterates of another nearby point, \sqrt{1 / 5} \approx .44721, simply cycle between -\sqrt{1 / 5} and +\sqrt{1 / 5} and do not converge at all.
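This sensitivity is easy to test. The sketch below runs the iteration from the three nearby starting points mentioned above and prints where each one ends up; the iteration count is our own choice:

```python
# Newton's method for f(x) = x^3 - x, whose roots are -1, 0, and +1.
f = lambda x: x**3 - x
fprime = lambda x: 3 * x**2 - 1

for x0 in (0.4472, 0.44725, 0.4473):
    x = x0
    for _ in range(30):
        x = x - f(x) / fprime(x)
    print(f"starting at {x0}: iterates approach {round(x, 6)}")
```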

Practice 6: Find the first 4 Newton's method iterates of \mathrm{x}_{0}=.997 and \mathrm{x}_{0}=1.02 for \mathrm{f}(\mathrm{x})=\mathrm{x}^{2}+1. Try two other starting values very close to \mathrm{1} (but not equal to \mathrm{1}) and find their first \mathrm{4} iterates. Use the graph of \mathrm{f}(\mathrm{x})= \mathrm{x}^{2}+1 to explain how starting points so close together can quickly have iterates so far apart.

Practice Problem Answers

Practice 1: \quad \mathrm{f}(\mathrm{x})=\mathrm{x}^{3}+3 \mathrm{x}-1 so \mathrm{f}^{\prime}(\mathrm{x})=3 \mathrm{x}^{2}+3 and the slope of the tangent line at the point (1,3) is \mathrm{f}^{\prime}(1) =6. Using the point-slope form for the equation of a line, the equation of the tangent line is y-3=6(x-1) or y=6 x-3.

The \mathrm{y}-coordinate of a point on the \mathrm{x}-axis is 0 so we need to put \mathrm{y}=0 and solve the linear equation for \mathrm{x}: 0=6 \mathrm{x}-3 so \mathrm{x}=1 / 2.

The line tangent to the graph of \mathrm{f}(\mathrm{x})=\mathrm{x}^{3}+3 \mathrm{x}-1 at the point (1,3) intersects the \mathrm{x}-axis at the point (\mathbf{1} / \mathbf{2}, \mathbf{0}).


Practice 2: The approximate locations of \mathrm{x}_{1} and \mathrm{x}_{2} are shown in Fig. 20.



Practice 3: \quad f(x)=x^{3}-3 x^{2}+x-1 so f^{\prime}(x)=3 x^{2}-6 x+1. x_{0}=3.

\begin{aligned} &\mathrm{x}_{1}=\mathrm{x}_{0}-\frac{\mathrm{f}\left(\mathrm{x}_{0}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{0}\right)}=3-\frac{\mathrm{f}(3)}{\mathrm{f}^{\prime}(3)}=3-\frac{2}{10}=\mathbf{2.8} \\ &\mathrm{x}_{2}=\mathrm{x}_{1}-\frac{\mathrm{f}\left(\mathrm{x}_{1}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{1}\right)}=2.8-\frac{\mathrm{f}(2.8)}{\mathrm{f}^{\prime}(2.8)}=2.8-\frac{0.232}{7.72} \approx 2.769948187 \\ &\mathrm{x}_{3}=\mathrm{x}_{2}-\frac{\mathrm{f}\left(\mathrm{x}_{2}\right)}{\mathrm{f}^{\prime}\left(\mathrm{x}_{2}\right)} \approx 2.769292663 \end{aligned}


Practice 4: Fig. 21 shows the first iteration of Newton's Method for x_{0}=2,3, and \mathrm{5}.


If \mathrm{x}_{0}=2, the iterates approach the root at \mathrm{a}.

If x_{0}=3, the iterates approach the root at \mathrm{c}.

If x_{0}=5, the iterates approach the root at \mathrm{a}.


Practice 5: \quad f(x)=x^{1 / 3} so f^{\prime}(x)=\frac{1}{3} x^{-2 / 3}.

If x_{0}=1, then \quad x_{1}=1-\frac{f(1)}{f^{\prime}(1)}=1-\frac{1}{1 / 3}=1-3=-2

 x_{2}=-2-\frac{\mathrm{f}(-2)}{\mathrm{f}^{\prime}(-2)}=-2-\frac{(-2)^{1 / 3}}{\frac{1}{3}(-2)^{-2 / 3}}=-2-\frac{-2}{1 / 3}=4

\mathrm{x}_{3}=4-\frac{\mathrm{f}(4)}{\mathrm{f}^{\prime}(4)}=4-\frac{(4)^{1 / 3}}{\frac{1}{3}(4)^{-2 / 3}}=4-\frac{4}{1 / 3}=-8, and so on

If x_{0}=-3, then x_{1}=-3-\frac{f(-3)}{f^{\prime}(-3)}=-3-\frac{(-3)^{1 / 3}}{\frac{1}{3}(-3)^{-2 / 3}}=-3+9=6

x_{2}=6-\frac{f(6)}{f^{\prime}(6)}=6-\frac{6^{1 / 3}}{\frac{1}{3} 6^{-2 / 3}}=6-\frac{6}{1 / 3}=-12

The graph of the cube root \mathrm{f}(\mathrm{x})=\mathrm{x}^{1 / 3} has a shape similar to Fig. 14, and the behavior of the iterates is similar to the pattern in that figure. Unless \mathrm{x}_{0}=0 (the only root of \mathrm{f}) the iterates alternate in sign and double in magnitude with each iteration: they get progressively farther from the root with each iteration.
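The divergence can also be seen numerically. In the sketch below, a small helper computes the real cube root, since a direct x ** (1/3) in Python does not give the real cube root when x is negative; the helper and the loop bound are our own choices:

```python
import math

def cbrt(x):
    """Real cube root, valid for negative x as well."""
    return math.copysign(abs(x) ** (1 / 3), x)

f = lambda x: cbrt(x)                              # f(x) = x^(1/3)
fprime = lambda x: (1 / 3) * abs(x) ** (-2 / 3)    # f'(x) = (1/3) x^(-2/3) > 0 for x != 0

x = 1.0
for n in range(1, 5):
    x = x - f(x) / fprime(x)
    print(f"x_{n} = {x:.4f}")    # approximately -2, 4, -8, 16: doubling away from the root at 0
```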


Practice 6: If \mathrm{x}_{0}=0.997, then \mathrm{x}_{1} \approx-0.003, \mathrm{x}_{2} \approx 166.4, \mathrm{x}_{3} \approx 83.2, \mathrm{x}_{4} \approx 41.6.

If \mathrm{x}_{0}=1.02, then \mathrm{x}_{1} \approx 0.0198, \mathrm{x}_{2} \approx-25.2376, \mathrm{x}_{3} \approx-12.6, \mathrm{x}_{4} \approx-6.26.
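Since f(x)=x^{2}+1 never touches the \mathrm{x}-axis, the iterates have no root to settle on, and a quick computation shows how erratically they move. The sketch below (the loop bound and rounding are ours) simply prints the first four iterates for each starting value:

```python
# Newton's method for f(x) = x^2 + 1, which has no real roots.
f = lambda x: x**2 + 1
fprime = lambda x: 2 * x

for x0 in (0.997, 1.02):
    x = x0
    iterates = []
    for _ in range(4):
        x = x - f(x) / fprime(x)
        iterates.append(round(x, 4))
    print(x0, iterates)    # compare with the approximate values above
```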