The effect of deterministic noise in subgradient methods

In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may arise from various sources and manifests itself in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example, when the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within a tolerance that is given explicitly in terms of the errors. In the first case, the tolerance is nonzero, but in the second case, the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.
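Below is a minimal sketch, not taken from the paper, of the kind of method analyzed here: a projected subgradient iteration in which every subgradient is computed with a bounded, deterministic error. The problem instance (a polyhedral max-of-affine objective over a box, so the minima form a sharp set), the error model (a fixed perturbation of norm at most eps added to the exact subgradient), and the constant stepsize alpha are all illustrative assumptions; the paper's analysis covers several stepsize rules and more general compact or polyhedral settings.

```python
import numpy as np

# Illustrative instance: minimize f(x) = max_i (a_i^T x + b_i), a polyhedral
# objective with a sharp set of minima, over the box X = [-1, 1]^n.
rng = np.random.default_rng(0)
n, m = 5, 8
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def f(x):
    return np.max(A @ x + b)

def noisy_subgradient(x, eps):
    """Exact subgradient (the row of A attaining the max) plus a fixed,
    deterministic perturbation of Euclidean norm at most eps."""
    i = int(np.argmax(A @ x + b))
    g = A[i]
    e = np.full(n, eps / np.sqrt(n))  # deterministic error direction (assumed)
    return g + e

def project_box(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box constraint set."""
    return np.clip(x, lo, hi)

def projected_subgradient(eps=0.05, steps=500, alpha=0.05):
    x = np.zeros(n)
    best = f(x)
    for k in range(steps):
        g = noisy_subgradient(x, eps)
        # Constant stepsize here; a diminishing rule such as alpha / (k + 1)
        # is another of the stepsize choices the paper considers.
        x = project_box(x - alpha * g)
        best = min(best, f(x))
    return best

print("best value found:", projected_subgradient())
```

The incremental variant mentioned at the end of the abstract would apply the same noisy step component by component, cycling through the terms of a sum of convex functions instead of forming a full subgradient at each iteration.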
