Generalized Mirror Prox for Monotone Variational Inequalities: Universality and Inexact Oracle

We introduce an inexact oracle model for variational inequalities (VIs) with a monotone operator, propose a numerical method that solves such VIs, and analyze its convergence rate. As a particular case, we consider VIs with a H\"older-continuous operator and show that our algorithm is universal: without knowing the H\"older exponent $\nu$ or the H\"older constant $L_{\nu}$, it achieves the best possible complexity for this class of VIs, namely $O\left( \inf_{\nu\in[0,1]}\left(\frac{L_{\nu}}{\varepsilon} \right)^{\frac{2}{1+\nu}}R^2 \right)$, where $R$ is the size of the feasible set and $\varepsilon$ is the desired accuracy of the solution. We also consider VIs with a strongly monotone operator and extend both our method for VIs with inexact oracle and our universal method to this class of problems. Finally, we show how our method can be applied to convex-concave saddle point problems with H\"older-continuous partial subgradients.
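
For orientation, the sketch below shows one run of the classical Euclidean Mirror Prox (extragradient) scheme that the paper generalizes; it is a minimal illustration, not the paper's generalized inexact-oracle or universal variant. The operator `F`, the projection `proj`, the constant stepsize `gamma`, and the toy problem at the end are placeholders chosen only for illustration, and the adaptive stepsize rule that yields universality is deliberately omitted.

```python
import numpy as np

def euclidean_mirror_prox(F, proj, x0, gamma, num_iters):
    """Classical (Euclidean) Mirror Prox: at each iteration an
    extragradient point y_k is computed from x_k, then x_{k+1} takes a
    projected step using the operator evaluated at y_k.  The ergodic
    average of the y_k's is returned, since the O(1/N) rate for
    monotone Lipschitz operators is stated for this average."""
    x = x0.copy()
    y_sum = np.zeros_like(x0)
    for _ in range(num_iters):
        y = proj(x - gamma * F(x))   # extrapolation (extragradient) step
        x = proj(x - gamma * F(y))   # main step uses the operator at y
        y_sum += y
    return y_sum / num_iters         # ergodic average of the y_k's

# Toy usage (assumed setup): a monotone linear operator F(z) = A z with
# skew-symmetric A, over the Euclidean unit ball; the VI solution is z* = 0.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
F = lambda z: A @ z
proj = lambda z: z / max(1.0, np.linalg.norm(z))
z_bar = euclidean_mirror_prox(F, proj, np.array([1.0, 0.5]),
                              gamma=0.5, num_iters=1000)
```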
