# Multivariate Root Finding
Imagine a vector function,

\[{\bf f}({\bf x}) = \left ( \begin{array}{c} f_1({\bf x}) \\ \vdots \\ f_n({\bf x}) \end{array} \right )\]

where

\[{\bf x} = (x_1, x_2, \ldots, x_n)\]

We want to find the \({\bf x}\) that zeros \({\bf f}({\bf x})\), i.e.,

\[{\bf f}({\bf x}) = 0\]
We’ll use a generalization of Newton’s method to systems of equations.
## Solution procedure
Note
This is the simplest case of a multivariate root-finding algorithm. More sophisticated methods exist, but this method will give you a feel for what is involved.
Start with an initial guess, \({\bf x}^{(0)}\)
Taylor expand our function seeking a correction \(\delta {\bf x}\). Each component has the form:

\[f_i({\bf x}^{(k)} + \delta {\bf x}) \approx f_i({\bf x}^{(k)}) + \sum_{j=1}^{n} \frac{\partial f_i}{\partial x_j} \delta x_j\]
and we can write the vector form as:

\[{\bf f}({\bf x}^{(k)} + \delta {\bf x}) \approx {\bf f}({\bf x}^{(k)}) + {\bf J}\,\delta {\bf x}\]
where \({\bf J}\) is the Jacobian, \(J_{ij} = \partial f_i / \partial x_j\):

\[{\bf J} = \left ( \begin{array}{ccc} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \frac{\partial f_n}{\partial x_1} & \cdots & \frac{\partial f_n}{\partial x_n} \end{array} \right )\]
Setting \({\bf f}({\bf x}^{(k)} + \delta {\bf x}) = 0\), this can be expressed as the linear system:

\[{\bf J}\,\delta {\bf x} = -{\bf f}({\bf x}^{(k)})\]
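As an aside, forming the Jacobian analytically can be tedious for large systems; a common alternative is to approximate it column-by-column with finite differences. A minimal sketch, assuming NumPy (the function `fd_jacobian` is illustrative, not from the text):

```python
import numpy as np

def fd_jacobian(f, x, eps=1.e-8):
    """Approximate the Jacobian J_{ij} = df_i/dx_j by forward
    finite differences -- one extra f evaluation per column."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x))
    J = np.zeros((len(f0), len(x)))
    for j in range(len(x)):
        dx = np.zeros_like(x)
        # scale the step to the magnitude of x[j] to balance
        # truncation error against roundoff
        dx[j] = eps * max(abs(x[j]), 1.0)
        J[:, j] = (np.asarray(f(x + dx)) - f0) / dx[j]
    return J
```

Forward differences trade accuracy for cost; a centered difference halves the truncation error at the price of a second function evaluation per column.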
Our solution procedure is iterative:
Iterate over \(k\), seeking a new guess at the solution \({\bf x}^{(k+1)}\):
1. Solve the linear system:

   \[{\bf J}\,\delta {\bf x} = - {\bf f}({\bf x}^{(k)})\]

2. Correct our initial guess:

   \[{\bf x}^{(k+1)} = {\bf x}^{(k)} + \delta {\bf x}\]

3. Stop iterating when

   \[\| \delta {\bf x} \| < \epsilon \| {\bf x}^{(k)} \|\]

   where \(\| \cdot \|\) is a suitable vector norm.
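The iteration above can be sketched in a few lines of NumPy (the example system and the function names are illustrative, not from the text):

```python
import numpy as np

def newton_system(f, jac, x0, eps=1.e-10, max_iter=100):
    """Newton's method for a system: repeatedly solve
    J dx = -f(x^(k)) and update x^(k+1) = x^(k) + dx."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # solve the linear system J dx = -f(x^(k))
        dx = np.linalg.solve(jac(x), -f(x))
        # correct the guess
        x = x + dx
        # stop when the correction is small relative to the solution
        if np.linalg.norm(dx) < eps * np.linalg.norm(x):
            break
    return x

# illustrative system: the intersection of the line x + 2y = 2
# with the ellipse x^2 + 4y^2 = 4
def f(x):
    return np.array([x[0] + 2.0*x[1] - 2.0,
                     x[0]**2 + 4.0*x[1]**2 - 4.0])

def jac(x):
    return np.array([[1.0,      2.0],
                     [2.0*x[0], 8.0*x[1]]])

root = newton_system(f, jac, [1.0, 2.0])
```

Each iteration costs one Jacobian evaluation and one linear solve (\(O(n^3)\) via LU decomposition); near a root the convergence is quadratic, so a handful of iterations usually suffices. Note that which root you converge to depends on the initial guess.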