📗 Non-linear equations in the form \(f\left(x\right) = 0\) are solved using iterative methods.
➩ Start with a random guess \(x_{0}\), and compute a sequence \(x_{1}, x_{2}, ...\) with the property that \(x^\star = \lim_{i \to \infty} x_{i}\) satisfies \(f\left(x^\star\right) = 0\).
➩ Gradient descent is an optimization technique that minimizes \(f\left(x\right)\) by iteratively searching for a point where \(f'\left(x\right) = 0\) or \(\nabla f\left(x\right) = 0\).
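📗 A minimal gradient descent sketch in MATLAB (the example function, step size, and iteration count are illustrative choices, not part of the notes):

    % Minimize f(x) = (x - 2)^2 by repeatedly stepping against the derivative.
    fp = @(x) 2 * (x - 2);       % derivative f'(x)
    x = 0;                       % initial guess
    alpha = 0.1;                 % step size (learning rate)
    for i = 1:100
        x = x - alpha * fp(x);   % gradient descent update
    end
    disp(x)                      % approaches the minimizer x = 2, where f'(x) = 0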
📗 An algebraic equation, also called a polynomial equation, is in the form \(a_{0} + a_{1} x + a_{2} x^{2} + ... + a_{n} x^{n} = 0\).
➩ There are in general \(n\) solutions or roots (possibly complex or repeated) to the above equation when \(a_{n} \neq 0\).
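📗 For example, MATLAB's built-in roots function returns all \(n\) roots of a polynomial given its coefficients (listed from the highest degree to the lowest):

    % Roots of x^2 - x - 1 = 0: the golden ratio and its conjugate.
    p = [1, -1, -1];   % coefficients of x^2 - x - 1, highest degree first
    r = roots(p)       % approximately 1.6180 and -0.6180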
📗 Intermediate Value Theorem (IVT) says that given a continuous function \(f\), for any \(u\) between \(f\left(a\right)\) and \(f\left(b\right)\), there exists an \(x \in \left[a, b\right]\) such that \(f\left(x\right) = u\).
➩ IVT implies that if \(f\left(a\right) \geq 0\) and \(f\left(b\right) \leq 0\), then there exists an \(x \in \left[a, b\right]\) such that \(f\left(x\right) = 0\).
➩ Bisection method uses this observation to iteratively shrink the interval \(\left[a, b\right]\) that contains the root by half until \(a\) and \(b\) are close enough.
📗 Bisection method can be used to find a root of \(f\left(x\right) = 0\) in an interval \(x \in \left[x_{0}, x_{1}\right]\), assuming \(f\left(x_{0}\right)\) and \(f\left(x_{1}\right)\) have opposite signs.
➩ Start with \(\left[x_{0}, x_{1}\right]\) and \(x = \dfrac{1}{2} \left(x_{0} + x_{1}\right)\).
➩ If \(f\left(x\right)\) and \(f\left(x_{0}\right)\) have different signs, the solution is between \(x_{0}\) and \(x\): use bisection method on \(\left[x_{0}, x\right]\).
➩ If \(f\left(x\right)\) and \(f\left(x_{1}\right)\) have different signs, the solution is between \(x\) and \(x_{1}\): use bisection method on \(\left[x, x_{1}\right]\).
➩ Stop when \(f\left(x\right) = 0\) or \(x_{0}\) and \(x_{1}\) are close enough.
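📗 A minimal sketch of the bisection method in MATLAB (the example function \(f\left(x\right) = x^{2} - 2\) and the tolerance are illustrative):

    % Find a root of f(x) = x^2 - 2 in [1, 2] by repeatedly halving the interval.
    f = @(x) x .^ 2 - 2;
    x0 = 1; x1 = 2;            % f(x0) and f(x1) have opposite signs
    while x1 - x0 > 1e-10
        x = (x0 + x1) / 2;     % midpoint of the current interval
        if f(x) == 0
            break              % exact root found
        elseif sign(f(x)) ~= sign(f(x0))
            x1 = x;            % root is between x0 and x
        else
            x0 = x;            % root is between x and x1
        end
    end
    disp(x)                    % approximately sqrt(2) = 1.4142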
📗 Newton's method can be used to find a root of \(f\left(x\right) = 0\), given \(f'\left(x\right)\), starting from initial guess \(x_{0}\), preferably close to the solution.
➩ Start with the initial guess \(x = x_{0}\).
➩ Repeat using Newton's formula \(x = x - \dfrac{f\left(x\right)}{f'\left(x\right)}\).
➩ Stop when \(f\left(x\right)\) is close enough to \(0\) (or the number of iterations is too large).
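📗 A minimal sketch of Newton's method in MATLAB, using the stopping rule above (the example function, initial guess, tolerance, and iteration cap are illustrative):

    % Find a root of f(x) = x^2 - 2 with Newton's method.
    f = @(x) x .^ 2 - 2;
    fp = @(x) 2 .* x;             % derivative f'(x)
    x = 1;                        % initial guess
    for i = 1:100                 % cap the number of iterations
        if abs(f(x)) < 1e-10
            break                 % f(x) is close enough to 0
        end
        x = x - f(x) / fp(x);     % Newton update
    end
    disp(x)                       % approximately sqrt(2)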
📗 Newton's method could get stuck when \(f'\left(x\right) = 0\); in that case, restart with a different random initial guess.
📗 Newton's method could also diverge around an unstable root; in that case, a variant of Newton's method needs to be used.
📗 Secant method is used instead of Newton's method when the derivative function is unknown or costly to compute.
➩ Two initial guesses are required, \(x_{0}\) and \(x_{1}\), and the Newton update is replaced by \(x = x - \dfrac{f\left(x\right)}{\dfrac{f\left(x\right) - f\left(x'\right)}{x - x'}} = \dfrac{x' f\left(x\right) - x f\left(x'\right)}{f\left(x\right) - f\left(x'\right)}\), where \(x'\) is the \(x\) from the previous iteration.
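📗 A minimal sketch of the Secant method in MATLAB (same illustrative example function as above):

    % Find a root of f(x) = x^2 - 2 without using the derivative.
    f = @(x) x .^ 2 - 2;
    xp = 1; x = 2;             % two initial guesses x0 and x1
    while abs(f(x)) > 1e-10
        xn = (xp * f(x) - x * f(xp)) / (f(x) - f(xp));   % Secant update
        xp = x;                % becomes x' in the next iteration
        x = xn;
    end
    disp(x)                    % approximately sqrt(2)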
📗 Secant method is not the same as Newton's method with the numerical derivative computed using finite differences, but when \(x\) and \(x'\) are close, a step of the Secant method does approximate a step of Newton's method.
➩ In general, Newton's method usually takes fewer iterations than the Secant method.
➩ If it is costly to evaluate \(f'\left(x\right)\), the Secant method could be faster.
📗 fzero(f, [x0, x1]) searches for the solution of \(f\left(x\right) = 0\) between \(x_{0}\) and \(x_{1}\) (using bisection combined with interpolation steps), assuming \(f\left(x_{0}\right) f\left(x_{1}\right) \leq 0\).
📗 fzero(f, x0) starts at \(x_{0}\), expands a search interval until \(f\) changes sign, and then searches that interval for the solution of \(f\left(x\right) = 0\).
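📗 For example, both call styles find the root \(\sqrt{2}\) of the illustrative function from above:

    f = @(x) x .^ 2 - 2;
    r1 = fzero(f, [1, 2])   % bracketing search: f(1) and f(2) have opposite signs
    r2 = fzero(f, 1.5)      % search starting from a single initial guess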
📗 Both Newton's method and Secant method can be extended to solving a system of non-linear equations \(F\left(x\right) = 0\). The Jacobian matrix is used in place of the derivative. The updates are given by \(x = x - J^{-1}_{F} \left(x\right) F\left(x\right)\).
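📗 A minimal sketch of Newton's method for a system in MATLAB (the example system is illustrative; in practice the linear system \(J_{F} s = F\) is solved with the backslash operator instead of explicitly inverting the Jacobian):

    % Solve x1^2 + x2^2 = 1 and x1 - x2 = 0 (circle intersected with a line).
    F = @(x) [x(1) ^ 2 + x(2) ^ 2 - 1; x(1) - x(2)];
    J = @(x) [2 * x(1), 2 * x(2); 1, -1];   % Jacobian matrix of F
    x = [1; 0];                             % initial guess
    while norm(F(x)) > 1e-10
        x = x - J(x) \ F(x);                % Newton update: x = x - J^(-1) F(x)
    end
    disp(x)                                 % approximately [1/sqrt(2); 1/sqrt(2)]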
📗 Notes and code adapted from the course taught by Professors Beck Hasti and Michael O'Neill.