Compositions of convex functions and fully linear models

Derivative-free optimization (DFO) is the mathematical study of optimization algorithms that do not use derivatives. One branch of DFO focuses on model-based methods, in which an approximation of the objective function guides the optimization algorithm. Convergence proofs for such methods often rely on the assumption that the approximations form fully linear models, an assumption that requires the true objective function to be smooth. However, some recent methods have loosened this assumption and instead work with functions that are compositions of smooth functions with simple convex functions (the max-function or the $$\ell_1$$ norm). In this paper, we examine the error bounds that result from composing a convex lower semi-continuous function with a smooth vector-valued function when it is possible to provide fully linear models for each component of the vector-valued function. We derive error bounds for the resulting function values and subgradient vectors.
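
To fix ideas, the displays below sketch the standard notion of a fully linear model and the composite setting described above; the symbols $$F$$, $$g$$, $$\kappa_f$$, and $$\kappa_g$$ follow common usage in the model-based DFO literature and are illustrative rather than taken verbatim from the paper. A model $$\tilde{f}$$ is a fully linear model of a smooth function $$f$$ on the ball $$B(x;\Delta)$$ if there exist constants $$\kappa_f,\kappa_g>0$$, independent of $$x$$ and $$\Delta$$, such that for all $$y\in B(x;\Delta)$$

$$|f(y)-\tilde{f}(y)|\le \kappa_f\,\Delta^2 \quad\text{and}\quad \|\nabla f(y)-\nabla\tilde{f}(y)\|\le \kappa_g\,\Delta.$$

The composite setting then concerns objectives of the form

$$h(x)=g(F(x)),\qquad F:\mathbb{R}^n\to\mathbb{R}^m \text{ smooth},\quad g:\mathbb{R}^m\to\mathbb{R}\cup\{+\infty\} \text{ convex and lower semi-continuous},$$

where a fully linear model is available for each component $$F_i$$ of $$F$$; taking $$g$$ to be the max-function or the $$\ell_1$$ norm recovers the special cases mentioned above.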
