A matrix-free algorithm based on Newton's method for overdetermined nonlinear systems of equations

The need to solve nonlinear systems of equations arises in many fields of research. Several methods have been designed for solving such problems, each with different computational requirements. For overdetermined systems, methods like Levenberg-Marquardt have been used, but they are not suitable when the system of equations is large. Quasi-Newton methods do not necessarily work, since a root may not exist. In this paper we propose a matrix-free, parallel algorithm to minimize the residual of an overdetermined system of nonlinear equations. It is based on the Jacobian-Free Newton-Krylov method, a matrix-free alternative to the classical Newton's method. The proposed algorithm requires the transpose of the Jacobian matrix, so we propose a new way of computing the product of the transpose of the Jacobian matrix with an arbitrary vector. Numerical results are presented to verify the effectiveness of the proposed approximations and the algorithm's capability to minimize the residual.
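
As a rough illustration of the matrix-free idea underlying Jacobian-Free Newton-Krylov methods, the sketch below approximates the Jacobian-vector product J(x)v with a first-order finite difference of the residual function, so no Jacobian is ever formed. The function names, the perturbation size, and the toy overdetermined system are illustrative assumptions only; they are not the transpose-product approximation proposed in the paper.

```python
import numpy as np

def jacobian_vector_product(F, x, v, eps=1e-7):
    """Finite-difference approximation of J(x) @ v using only residual
    evaluations of F, i.e. without forming the Jacobian explicitly."""
    return (F(x + eps * v) - F(x)) / eps

# Toy overdetermined system F: R^2 -> R^3 (illustrative only).
def F(x):
    return np.array([x[0]**2 + x[1] - 1.0,
                     x[0] - x[1]**2,
                     x[0] + x[1] - 0.5])

x = np.array([0.3, 0.4])
v = np.array([1.0, 0.0])
print(jacobian_vector_product(F, x, v))  # approximates the first column of J(x)
```

In a Newton-Krylov solver, a routine of this kind supplies the matrix-vector products required by the inner Krylov iteration, which is what makes the overall method matrix-free.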