Cyclic stochastic optimization via arbitrary selection procedures for updating parameters

Many algorithms for solving optimization problems, whether stochastic or deterministic, are iterative in nature. Methods that update only a subset of the parameter vector at each iteration are frequently used in practice. Some of these methods update different sets of parameters according to a predetermined pattern, while others select the parameters to update at random. Much work exists on the convergence of such procedures in deterministic optimization; however, very little is known about their convergence when applied to general stochastic optimization problems, which is the setting of this work. We describe the generalized cyclic seesaw algorithm, a general method for selecting which parameters to update at each iteration, and give sufficient conditions for its convergence.
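The update scheme described above can be illustrated as a block-wise stochastic gradient method: each iteration updates only a selected subset of the parameter vector, with the subset chosen either cyclically or at random. The sketch below is a minimal illustration under assumed choices (a quadratic objective with additive gradient noise, a fixed two-block cyclic pattern, and a decaying gain sequence); it is not the paper's specific construction.

```python
import numpy as np

def noisy_grad(theta, rng):
    # Hypothetical stochastic objective: minimize E[||theta - target||^2],
    # observed through a gradient corrupted by additive Gaussian noise.
    target = np.array([1.0, -2.0, 0.5, 3.0])
    return 2.0 * (theta - target) + rng.normal(scale=0.1, size=theta.shape)

def cyclic_seesaw_sgd(theta0, n_iters=5000, rng=None):
    """Sketch of a cyclic stochastic optimization scheme: each iteration
    updates only one block of coordinates with a noisy gradient step."""
    rng = rng or np.random.default_rng(0)
    theta = np.asarray(theta0, dtype=float).copy()
    blocks = [np.array([0, 1]), np.array([2, 3])]  # assumed fixed partition
    for k in range(n_iters):
        a_k = 1.0 / (k + 10)             # decaying (Robbins-Monro-type) gain
        block = blocks[k % len(blocks)]  # deterministic cyclic selection;
                                         # a random selection rule also fits
                                         # the generalized framework
        g = noisy_grad(theta, rng)
        theta[block] -= a_k * g[block]   # update only the selected block
    return theta
```

With the assumed quadratic objective, the iterate approaches the minimizer even though each step touches only half of the coordinates, which is the behavior the convergence conditions in the paper are meant to guarantee in general.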
