A Newton Method for Convex Regression, Data Smoothing, and Quadratic Programming with Bounded Constraints

This paper formulates systems of piecewise linear equations, derived from the Karush–Kuhn–Tucker conditions for constrained convex optimization problems, as unconstrained minimization problems in which the objective function is a multivariate quadratic spline. Such formulations provide new ways of developing efficient algorithms for many optimization problems, including the convex regression problem, the least-distance problem, the symmetric monotone linear complementarity problem, and the convex quadratic programming problem with bounded constraints. Theoretical results are presented, together with a description of an algorithm, its implementation, numerical results, and a stability analysis.
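To illustrate the central idea that KKT conditions become a piecewise linear system, consider the bound-constrained convex QP min ½ xᵀQx − cᵀx subject to l ≤ x ≤ u. Its KKT conditions are equivalent to the fixed-point equation F(x) = x − P(x − (Qx − c)) = 0, where P is the (piecewise linear) projection onto the box [l, u]. The sketch below applies a generic semismooth Newton iteration to F; it is a minimal illustration of this class of methods, not the specific algorithm developed in the paper, and the problem data are made up for the example.

```python
import numpy as np

# Hypothetical example data for  min 0.5 x^T Q x - c^T x,  l <= x <= u.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
c = np.array([1.0, 2.0])
l = np.array([0.0, 0.0])
u = np.array([0.5, 0.5])

def F(x):
    # Piecewise linear KKT residual: x minus the projection of a
    # gradient step onto the box [l, u].
    return x - np.clip(x - (Q @ x - c), l, u)

x = np.zeros(2)
for _ in range(50):
    r = F(x)
    if np.linalg.norm(r) < 1e-12:
        break
    g = x - (Q @ x - c)
    # Coordinates where the projection acts as the identity ("free" set).
    free = (g >= l) & (g <= u)
    # One element of the generalized Jacobian of F:
    # row i is Q[i, :] on free coordinates, the unit vector e_i on clamped ones.
    J = np.where(free[:, None], Q, np.eye(len(x)))
    x = x - np.linalg.solve(J, r)

print(x)  # box-constrained minimizer, here [0.125, 0.5]
```

On this data the iteration reaches the minimizer x* = (0.125, 0.5) in a handful of steps: the second component is clamped at its upper bound, and the first solves the reduced equation 4x₁ + 0.5 = 1.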