A Projected Lagrangian Algorithm for Nonlinear $l_1$ Optimization

The nonlinear $l_1$ problem is an unconstrained optimization problem whose objective function is not differentiable everywhere, and hence cannot be solved efficiently by standard techniques for unconstrained optimization. The problem can be transformed into a nonlinearly constrained optimization problem, but the transformation introduces many extra variables. We show how to construct a method, based on projected Lagrangian methods for constrained optimization, that requires solving a sequence of quadratic programs in the same number of variables as the original problem. Special Lagrange multiplier estimates are used to form an approximation to the Hessian of the Lagrangian function, which appears in the quadratic program. A special line search algorithm is used to obtain a reduction in the $l_1$ objective function at each iteration. Under certain conditions the method is locally quadratically convergent when analytical Hessians are used.
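To make the problem class concrete, the sketch below illustrates the nonlinear $l_1$ problem $\min_x \sum_i |f_i(x)|$ and its smooth reformulation $f_i(x) = u_i - v_i$, $u_i, v_i \ge 0$, on a toy example. This is a minimal first-order sketch, not the paper's projected Lagrangian algorithm: each iteration linearizes the residuals and solves the resulting linear $l_1$ subproblem as an LP (via `scipy.optimize.linprog`), taking a full step. The test function `f`, the helper names, and the iteration limits are all illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy problem (not from the paper):
#   minimize |x0^2 + x1 - 1| + |x0 + x1^2 - 1| + |x0 - x1|
def f(x):
    return np.array([x[0]**2 + x[1] - 1.0,
                     x[0] + x[1]**2 - 1.0,
                     x[0] - x[1]])

def jac(x):
    return np.array([[2.0 * x[0], 1.0],
                     [1.0, 2.0 * x[1]],
                     [1.0, -1.0]])

def l1_gauss_newton(x0, iters=20):
    """Each step solves the linearized l_1 subproblem
         min_d  sum_i |f_i(x) + J_i(x) d|
       via the standard split u - v reformulation: an LP in
       variables z = (d, u, v) with  J d - u + v = -f,  u, v >= 0."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        F, J = f(x), jac(x)
        m = F.size
        # Objective: sum of the nonnegative splitting variables u, v.
        c = np.concatenate([np.zeros(n), np.ones(m), np.ones(m)])
        A_eq = np.hstack([J, -np.eye(m), np.eye(m)])
        b_eq = -F
        bounds = [(None, None)] * n + [(0, None)] * (2 * m)
        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
        d = res.x[:n]
        if np.linalg.norm(d) < 1e-12:   # step negligible: converged
            break
        x = x + d                       # full step (no line search here)
    return x
```

The paper's method differs in two essential ways: it carries second-order (Lagrangian Hessian) information into the subproblem, making it a quadratic rather than linear program, and it uses a special line search on the $l_1$ objective to guarantee descent; the full-step linearized iteration above can fail on problems where those safeguards matter.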
