Convergence rate analysis of proximal gradient methods with applications to composite minimization problems

First-order methods such as the proximal gradient method, which are based on forward–backward splitting, have proved very effective for nonsmooth convex minimization problems arising in fields such as machine learning and image processing. In this paper, we propose a few new forward–backward splitting algorithms that require fewer iterations to converge to an optimum. In addition, we derive convergence rates for the proposed formulations and show that they converge significantly faster than the traditional forward–backward algorithm. To demonstrate their practical applicability, we apply them to two real-world problems: regression on high-dimensional datasets and image deblurring. Numerical experiments on several publicly available real datasets verify the theoretical results and demonstrate the superiority of our algorithms over classical first-order methods in terms of accuracy, the number of iterations required to converge, and the rate of convergence.

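For context, the sketch below shows the classical forward–backward (proximal gradient) baseline that the proposed algorithms are compared against, applied to an l1-regularized least-squares (lasso) instance of the composite problem min_x f(x) + g(x). It is an illustrative example only: the problem data, the regularization weight mu, the step size, and the iteration count are assumptions for demonstration, not the paper's proposed variants or experimental settings.

```python
# Minimal sketch of classical forward-backward splitting for
#   min_x  f(x) + g(x),   f(x) = 0.5*||Ax - b||^2,   g(x) = mu*||x||_1.
# Illustrative assumptions throughout (data, mu, iteration count); this is the
# baseline method, not the accelerated variants proposed in the paper.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, mu, num_iters=500):
    """Classical iteration: x_{k+1} = prox_{s*g}(x_k - s*grad f(x_k))."""
    # Step size s = 1/L, where L = ||A||_2^2 is the Lipschitz constant of grad f.
    L = np.linalg.norm(A, 2) ** 2
    s = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                   # forward (gradient) step on f
        x = soft_threshold(x - s * grad, s * mu)   # backward (proximal) step on g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(100)
    x_hat = forward_backward(A, b, mu=0.1)
    print("recovered support size:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```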