Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems

Recently, Yu.~Nesterov and H.~Lu identified some special classes of non-smooth and non-Lipschitz convex optimization problems. We consider convex programming problems whose objective function and functional constraints satisfy similar smoothness conditions. We introduce a new concept of an inexact model and propose analogues of switching subgradient schemes for convex programming problems with a relatively Lipschitz-continuous objective function and functional constraints. A class of online convex optimization problems is also considered. The proposed methods are optimal for the class of optimization problems with a relatively Lipschitz-continuous objective and functional constraints.
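The switching subgradient idea underlying the schemes above can be illustrated by a minimal Euclidean sketch: at "productive" steps, where the constraint is (nearly) satisfied, the method steps along a subgradient of the objective; otherwise it steps along a subgradient of the violated constraint, and the approximate solution is the average of the productive iterates. This is the classical Polyak-type special case with a squared-Euclidean prox; the paper's methods use a general Bregman divergence, and the toy problem, step sizes, and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def switching_subgradient(f_sub, g, g_sub, x0, eps, h, n_steps):
    """Switching subgradient scheme (Euclidean sketch).

    f_sub(x): a subgradient of the objective f at x
    g(x), g_sub(x): the constraint value and one of its subgradients at x
    eps: constraint tolerance separating productive / non-productive steps
    h: normalized step length
    Returns the average of the productive iterates.
    """
    x = np.asarray(x0, dtype=float)
    productive = []
    for _ in range(n_steps):
        if g(x) <= eps:                 # productive step: constraint nearly satisfied
            d = f_sub(x)
            productive.append(x.copy())
        else:                           # non-productive step: reduce constraint violation
            d = g_sub(x)
        norm = np.linalg.norm(d)
        if norm > 0.0:
            x = x - (h / norm) * d      # normalized subgradient step
    # Assumes at least one productive step occurred for these parameters.
    return np.mean(productive, axis=0)

# Toy problem: minimize f(x) = x_1 + x_2 subject to g(x) = ||x||^2 - 1 <= 0,
# whose solution is x* = (-1/sqrt(2), -1/sqrt(2)).
f = lambda x: x[0] + x[1]
g = lambda x: float(x @ x) - 1.0
x_hat = switching_subgradient(
    f_sub=lambda x: np.array([1.0, 1.0]),
    g=g,
    g_sub=lambda x: 2.0 * x,
    x0=[2.0, 2.0], eps=0.05, h=0.01, n_steps=5000)
```

The relatively Lipschitz setting of the paper replaces the normalized Euclidean step with a mirror (Bregman proximal) step; the switching rule between productive and non-productive iterations is the same.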
