An Algorithm for Graph-Fused Lasso Based on Graph Decomposition

This work proposes a new algorithm for solving the graph-fused lasso (GFL), a parameter-estimation method built on the assumption that the signal is locally constant over a predefined graph. The proposed method is based on the alternating direction method of multipliers (ADMM) and on a decomposition of the objective function into two components. ADMM has been widely applied to this problem, but existing approaches such as network lasso split the objective into the loss function and the total variation penalty. In contrast, this work splits the objective so that one component is the loss function plus part of the total variation penalty, and the other component is the remaining total variation penalty. Compared with the network lasso algorithm, the proposed method has a smaller per-iteration computational cost and converges faster in most of the numerical simulations.
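The abstract does not spell out the splitting in detail, so the following is only a minimal sketch of how a two-component ADMM of this kind could look in the simplest special case: a 1-D chain graph whose edge set is partitioned into two matchings (odd edges kept with the loss, even edges in the second block), so that both subproblems decouple into two-node fused-lasso problems with closed-form solutions. The function names, the odd/even edge split, and the parameter values are illustrative assumptions, not the paper's algorithm.

```python
# Sketch of a two-block ADMM for the 1-D graph-fused lasso
#   min_beta (1/2)||y - beta||^2 + lam * sum_i |beta_i - beta_{i+1}|
# with the penalty split between the two ADMM blocks (assumed setup).
import numpy as np

def pairwise_fused_prox(c_i, c_j, weight, lam):
    """Closed-form solution of min_(a,b) (weight/2)[(a-c_i)^2+(b-c_j)^2] + lam*|a-b|."""
    mean = 0.5 * (c_i + c_j)
    diff = c_i - c_j
    # Soft-threshold the difference with threshold 2*lam/weight.
    d = np.sign(diff) * max(abs(diff) - 2.0 * lam / weight, 0.0)
    return mean + 0.5 * d, mean - 0.5 * d

def gfl_chain_admm(y, lam, rho=1.0, n_iter=200):
    """ADMM where block 1 is loss + penalty on E1 and block 2 is the penalty on E2."""
    n = len(y)
    beta = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                                  # scaled dual variable
    e1 = [(i, i + 1) for i in range(0, n - 1, 2)]    # odd edges, kept with the loss
    e2 = [(i, i + 1) for i in range(1, n - 1, 2)]    # even edges, second block
    for _ in range(n_iter):
        # beta-update: loss + penalty on E1 + quadratic coupling; decouples over E1.
        v = z - u
        c = (y + rho * v) / (1.0 + rho)              # per-node quadratic centers
        beta = c.copy()
        for i, j in e1:
            beta[i], beta[j] = pairwise_fused_prox(c[i], c[j], 1.0 + rho, lam)
        # z-update: penalty on E2 + quadratic coupling; decouples over E2.
        w = beta + u
        z = w.copy()
        for i, j in e2:
            z[i], z[j] = pairwise_fused_prox(w[i], w[j], rho, lam)
        # dual update
        u = u + beta - z
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    signal = np.repeat([0.0, 2.0, -1.0], 30)          # piecewise-constant ground truth
    y = signal + 0.3 * rng.standard_normal(signal.size)
    beta_hat = gfl_chain_admm(y, lam=1.0)
    print(np.round(beta_hat[:10], 2))
```

Because every edge in each block touches disjoint node pairs, both ADMM subproblems reduce to independent two-variable proximal steps; on a general graph the paper's decomposition would instead require subgraph solvers (e.g., 1-D or tree total-variation routines) for each component.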
