Gradient Based Restart FISTA

Fast gradient methods (FGM) are widely used for large-scale convex optimization problems. Recently, it has been shown that restart strategies can guarantee global linear convergence for non-strongly convex optimization problems provided that a quadratic functional growth condition is satisfied [1], [2]. In this context, we propose a novel restart FGM algorithm with global linear convergence. Its main advantages over other linearly convergent restart FGM algorithms are its simplicity and the fact that it requires no prior knowledge of either the optimal value of the objective function or the quadratic functional growth parameter. We present numerical simulations that illustrate the performance of the algorithm.
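The paper's exact restart rule is not reproduced in this abstract, so as an illustrative sketch only, the snippet below shows FISTA combined with the classical gradient-based adaptive restart test of O'Donoghue and Candès [18]: momentum is reset whenever the gradient at the extrapolated point forms an acute angle with the last step. All names and the test problem are assumptions for the example, not the algorithm of this paper.

```python
import numpy as np

def fista_gradient_restart(grad_f, L, x0, iters=500):
    """FISTA with gradient-based adaptive restart (illustrative sketch).

    grad_f : gradient of the smooth objective f
    L      : a Lipschitz constant of grad_f (step size 1/L)
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()          # extrapolated point
    t = 1.0               # momentum parameter
    for _ in range(iters):
        g = grad_f(y)
        x_next = y - g / L                       # gradient step (no prox term here)
        if g @ (x_next - x) > 0:                 # gradient restart test of [18]
            t = 1.0                              # reset momentum
            y = x_next
        else:
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)
            t = t_next
        x = x_next
    return x

# Usage: minimize (1/2)||A x - b||^2, a least-squares problem whose
# gradient has Lipschitz constant ||A^T A||_2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
L = np.linalg.norm(A.T @ A, 2)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
x_hat = fista_gradient_restart(lambda x: A.T @ (A @ x - b), L, np.zeros(20))
```

On this strongly convex test problem the restarted iteration converges linearly, whereas plain FISTA only guarantees an O(1/k^2) rate; the contribution of the paper is a restart condition that retains linear convergence without knowing the growth parameter.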

[1] Daniel Limón, et al. Implementation of Model Predictive Controllers in Programmable Logic Controllers using IEC 61131-3 standard, 2018, 2018 European Control Conference (ECC).

[2] Marc Teboulle, et al. A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems, 2009, SIAM J. Imaging Sci.

[3] Amir Beck, et al. First-Order Methods in Optimization, 2017.

[4] Jeffrey A. Fessler, et al. Adaptive Restart of the Optimized Gradient Method for Convex Optimization, 2017, J. Optim. Theory Appl.

[5] Paul J. Goulart, et al. Tight Global Linear Convergence Rate Bounds for Operator Splitting Methods, 2018, IEEE Transactions on Automatic Control.

[6] Rolf Findeisen, et al. A fast gradient method for embedded linear predictive control, 2011.

[7] Yurii Nesterov, et al. Gradient methods for minimizing composite functions, 2012, Mathematical Programming.

[8] Mazen Alamir, et al. Monitoring control updating period in fast gradient based NMPC, 2012, 2013 European Control Conference (ECC).

[9] Yurii Nesterov, et al. Linear convergence of first order methods for non-strongly convex optimization, 2015, Math. Program.

[10] Teodoro Alamo, et al. Restart FISTA with Global Linear Convergence, 2019, 2019 18th European Control Conference (ECC).

[11] Yurii Nesterov, et al. Introductory Lectures on Convex Optimization - A Basic Course, 2014, Applied Optimization.

[12] Manfred Morari, et al. Computational Complexity Certification for Real-Time MPC With Input Constraints Based on the Fast Gradient Method, 2012, IEEE Transactions on Automatic Control.

[13] Chih-Jen Lin, et al. Iteration complexity of feasible descent methods for convex optimization, 2014, J. Mach. Learn. Res.

[14] Stephen P. Boyd, et al. Monotonicity and restart in fast gradient methods, 2014, 53rd IEEE Conference on Decision and Control.

[15] Patrick L. Combettes, et al. Proximal Splitting Methods in Signal Processing, 2009, Fixed-Point Algorithms for Inverse Problems in Science and Engineering.

[16] Z.-Q. Luo, et al. Error bounds and convergence analysis of feasible descent methods: a general approach, 1993, Ann. Oper. Res.

[17] Y. Nesterov. A method for solving the convex programming problem with convergence rate O(1/k^2), 1983.

[18] Emmanuel J. Candès, et al. Adaptive Restart for Accelerated Gradient Schemes, 2012, Foundations of Computational Mathematics.

[19] Yurii Nesterov, et al. Smooth minimization of non-smooth functions, 2005, Math. Program.

[20] Stephen P. Boyd, et al. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, 2011, Found. Trends Mach. Learn.

[21] Stephen P. Boyd, et al. Proximal Algorithms, 2013, Found. Trends Optim.

[22] Heinz H. Bauschke, et al. Convex Analysis and Monotone Operator Theory in Hilbert Spaces, 2011, CMS Books in Mathematics.