Newton's Method for Root Finding

Uses basic calculus to find a root of a function.

Idea: Start with an initial guess x(0) and repeatedly update

    x(k+1) = x(k) - f(x(k)) / f'(x(k)),

where f'(x(k)) is the first derivative of f evaluated at x(k).

Why this update? Taylor's theorem: a function can be approximated near a point using its first derivative there:

    f(x(k) + p) ≈ f(x(k)) + f'(x(k)) p
    =>  f(x(k+1)) ≈ f(x(k)) + f'(x(k)) (x(k+1) - x(k))

Remember, our goal is to find x(k+1) such that f(x(k+1)) = 0. So set this approximation equal to 0 and solve for x(k+1):

    0 = f(x(k)) + f'(x(k)) (x(k+1) - x(k))
    =>  x(k+1) = x(k) - f(x(k)) / f'(x(k))

Aside: A similar idea applies in many other areas. One example is minimizing a function f (finding a point x where f takes its smallest value). In that case we'd set

    x(k+1) = x(k) - c f'(x(k))

so that

    f(x(k+1)) ≈ f(x(k)) - c (f'(x(k)))^2,

which is smaller than f(x(k)) whenever f'(x(k)) ≠ 0 and the step size c > 0 is small enough. This is known as gradient descent, a basic algorithm in optimization. It applies to problems in machine learning/statistics, economics, operations research, and engineering; a short sketch appears at the end of this section. Check out CS/ECE/ISyE 524 for more info.

One can use calculus to show that the error decreases at a rate that is quadratic in the previous error (roughly, |e(k+1)| ≤ C |e(k)|^2 near the root), aka quadratic convergence (much faster than bisection).

Note: This only works for smooth, continuously differentiable functions.

Walk through the Newton code (a sketch is given below).
Note: we cap the number of iterations, since the method may not converge if started at a bad point.

Example: Revisit f(x) = 6 - 2x^2 from the bisection example. Show the plot again, set up the problem, and apply Newton's method. Here f'(x) = -4x, and the roots are x = ±√3 ≈ ±1.732.
Note: Many fewer iterations than bisection!
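Here is a minimal sketch of what the Newton code might look like, including the maximum-iteration cutoff mentioned in the note above; the function name `newton` and the `tol`/`max_iter` parameters are illustrative choices, not fixed by these notes. The bottom of the snippet applies it to f(x) = 6 - 2x^2 from the example.

```python
import math

def newton(f, fprime, x0, tol=1e-8, max_iter=50):
    """Find a root of f via Newton's method, starting from the guess x0.

    Stops when |f(x)| < tol, or after max_iter iterations (the method
    may not converge if started at a bad point).
    """
    x = x0
    for k in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x, k             # converged: f(x) is (nearly) zero
        x = x - fx / fprime(x)      # Newton update: x <- x - f(x)/f'(x)
    return x, max_iter              # hit the iteration cap; may not have converged

# Example from the notes: f(x) = 6 - 2x^2, f'(x) = -4x; roots at x = ±sqrt(3).
f      = lambda x: 6 - 2 * x**2
fprime = lambda x: -4 * x

root, iters = newton(f, fprime, x0=1.0)
print(root, iters, math.sqrt(3))    # root ≈ 1.7320508, after only a few iterations
```

Starting from x0 = 1.0, the iterates go 1 → 2 → 1.75 → 1.73214... and reach √3 to eight digits in about five steps, whereas bisection typically needs dozens of interval halvings for the same accuracy.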
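And for the gradient descent aside, a comparable sketch; everything here (the name `gradient_descent`, the fixed step size `c`, and the test function f(x) = (x - 3)^2) is an illustrative assumption rather than something from the notes.

```python
def gradient_descent(fprime, x0, c=0.1, tol=1e-8, max_iter=200):
    """Minimize a smooth function f by repeatedly stepping against
    its derivative: x <- x - c * f'(x)."""
    x = x0
    for k in range(max_iter):
        g = fprime(x)
        if abs(g) < tol:    # derivative (nearly) zero: at a critical point
            return x, k
        x = x - c * g       # gradient step: decreases f for small enough c
    return x, max_iter

# Hypothetical test problem: minimize f(x) = (x - 3)^2, so f'(x) = 2(x - 3);
# the minimizer is x = 3.
xmin, iters = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(xmin, iters)          # xmin ≈ 3.0
```

Note the contrast with Newton above: with a fixed step size the error shrinks by a constant factor each step (linear convergence), which is why choosing or adapting the step size matters so much in practice.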