A Globally Convergent Inexact Newton Method for Systems of Monotone Equations

We propose an algorithm for solving systems of monotone equations which combines Newton, proximal point, and projection methodologies. An important property of the algorithm is that the whole sequence of iterates always converges globally to a solution of the system, without any additional regularity assumptions. Moreover, under standard assumptions a local superlinear rate of convergence is achieved. Unlike classical globalization strategies for Newton methods, we do not compute the stepsize by a linesearch aimed at decreasing the value of some merit function. Instead, a linesearch in the approximate Newton direction is used to construct a hyperplane that separates the current iterate from the solution set. This step is followed by projecting the current iterate onto that hyperplane, which ensures global convergence of the algorithm. The computational cost of each iteration of our method is of the same order as that of the classical damped Newton method. The crucial advantage is that our method is truly globally convergent; in particular, it cannot get trapped at a stationary point of a merit function. The presented algorithm is motivated by the hybrid projection-proximal point method proposed in [25].
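To make the interplay of the Newton direction, the linesearch, and the hyperplane projection concrete, the following is a minimal Python/NumPy sketch of an iteration of this type. It assumes F is a monotone, continuously differentiable map with Jacobian J; the regularization mu, the parameter names (sigma, beta, rho), and the linesearch acceptance test are illustrative choices for this family of methods, not the exact quantities analyzed in the paper.

```python
import numpy as np


def hyperplane_projection_newton(F, J, x0, sigma=1e-4, beta=0.5, rho=1.0,
                                 tol=1e-8, max_iter=100):
    """Newton direction + linesearch + hyperplane projection for monotone F(x) = 0.

    F -- callable returning F(x) as a 1-D NumPy array
    J -- callable returning the Jacobian F'(x) as a 2-D NumPy array
    The regularization, acceptance test, and parameter values are illustrative
    assumptions, not necessarily the exact choices made in the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break

        # Regularized (damped) Newton direction: (J(x) + mu*I) d = -F(x).
        mu = rho * np.linalg.norm(Fx)
        d = np.linalg.solve(J(x) + mu * np.eye(x.size), -Fx)

        # Backtracking linesearch: shrink t until the trial point y = x + t*d
        # satisfies <F(y), x - y> > 0, so that the hyperplane
        # {z : <F(y), z - y> = 0} separates x from the solution set.
        t = 1.0
        while True:
            y = x + t * d
            Fy = F(y)
            if -np.dot(Fy, d) >= sigma * t * np.dot(d, d):
                break
            t *= beta
            if t < 1e-12:  # numerical safeguard against an endless loop
                break

        # Project the current iterate onto the separating hyperplane.
        denom = np.dot(Fy, Fy)
        if denom == 0.0:  # F(y) = 0: the trial point already solves the system
            return y
        x = x - (np.dot(Fy, x - y) / denom) * Fy
    return x


# Toy usage: F(x) = x + sin(x) is monotone (each component has derivative
# 1 + cos(x) >= 0), with the unique solution x = 0.
if __name__ == "__main__":
    def F(x):
        return x + np.sin(x)

    def J(x):
        return np.eye(x.size) + np.diag(np.cos(x))

    print(hyperplane_projection_newton(F, J, np.array([2.0, -3.0, 0.5])))
```

The separation argument behind the projection step: by monotonicity, any solution x* satisfies <F(y), x* - y> <= 0, while the linesearch enforces <F(y), x - y> > 0, so the hyperplane {z : <F(y), z - y> = 0} strictly separates the current iterate from the solution set and projecting onto it never moves the iterate away from the solutions.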

[1] Jong-Shi Pang et al., An inexact NE/SQP method for solving the nonlinear complementarity problem, Comput. Optim. Appl., 1992.

[2] James M. Ortega et al., Iterative Solution of Nonlinear Equations in Several Variables, Computer Science and Applied Mathematics, 2014.

[3] F. Luque, Asymptotic convergence analysis of the proximal point algorithm, 1984.

[4] T. Ypma, Local Convergence of Inexact Newton Methods, 1984.

[5] Liqun Qi et al., Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations, Math. Oper. Res., 1993.

[6] K. Schittkowski et al., Nonlinear Programming, 2022.

[7] John E. Dennis et al., Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Prentice Hall Series in Computational Mathematics, 1983.

[8] J. M. Martínez et al., Inexact Newton methods for solving nonsmooth equations, 1995.

[9] Michael C. Ferris et al., Finite termination of the proximal point algorithm, Math. Program., 1991.

[10] Michael C. Ferris et al., Smooth methods of multipliers for complementarity problems, Math. Program., 1999.

[11] O. Nelles et al., An Introduction to Optimization, IEEE Antennas and Propagation Magazine, 1996.

[12] L. Armijo, Minimization of functions having Lipschitz continuous first partial derivatives, 1966.

[13] Houyuan Jiang et al., Global and Local Superlinear Convergence Analysis of Newton-Type Methods for Semismooth Equations with Smooth Least Squares, 1998.

[14] Jong-Shi Pang et al., Nonsmooth Equations: Motivation and Algorithms, SIAM J. Optim., 1993.

[15] Houyuan Jiang et al., Semismoothness and Superlinear Convergence in Nonsmooth Optimization and Nonsmooth Equations, 1996.

[16] Benar Fux Svaiter et al., Forcing strong convergence of proximal point iterations in a Hilbert space, Math. Program., 2000.

[17] E. Wagner, International Series of Numerical Mathematics, 1963.

[18] J. M. Martínez et al., Local convergence theory of inexact Newton methods based on structured least change updates, 1990.

[19] R. Dembo, Inexact Newton Methods, 1982.

[20] M. Solodov, A New Projection Method for Variational Inequality Problems, 1999.

[21] Dimitri P. Bertsekas et al., On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators, Math. Program., 1992.

[22] M. V. Solodov et al., A Hybrid Projection-Proximal Point Algorithm, 1998.

[23] M. V. Solodov et al., Newton-Type Methods with Generalized Distances for Constrained Optimization, 1997.

[24] R. Rockafellar, Monotone Operators and the Proximal Point Algorithm, 1976.

[25] Liqun Qi et al., A nonsmooth version of Newton's method, Math. Program., 1993.