Sparse Convex Optimization via Adaptively Regularized Hard Thresholding

The goal of Sparse Convex Optimization is to optimize a convex function $f$ under a sparsity constraint $s\leq s^*\gamma$, where $s^*$ is the target sparsity (the number of non-zero entries) of a feasible solution and $\gamma\geq 1$ is an approximation factor. A long line of work has analyzed the sparsity guarantees of algorithms such as LASSO, Orthogonal Matching Pursuit (OMP), and Iterative Hard Thresholding (IHT) in terms of the Restricted Condition Number $\kappa$. The best known algorithms guarantee a solution of value at most $f(x^*)+\epsilon$ with a sparsity bound of $\gamma = O\left(\kappa\min\left\{\log \frac{f(x^0)-f(x^*)}{\epsilon}, \kappa\right\}\right)$, where $x^*$ is the target solution. We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to $\gamma=O(\kappa)$, which has been shown to be tight for a general class of algorithms that includes LASSO, OMP, and IHT. This is achieved without a significant sacrifice in runtime efficiency compared to the fastest known algorithms. We also provide a new analysis of OMP with Replacement (OMPR) for general $f$ under the condition $s > s^* \frac{\kappa^2}{4}$, which yields Compressed Sensing bounds under the Restricted Isometry Property (RIP). Compared to other Compressed Sensing approaches, it has the advantage of providing a strong tradeoff between the RIP condition and the solution sparsity, and of applying to any function $f$ that satisfies the RIP condition.
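As a rough illustration of the hard-thresholding primitive that IHT, OMPR, and ARHT all build on, here is a minimal sketch of plain IHT specialized to a least-squares objective $f(x) = \frac{1}{2}\|Ax-b\|^2$. This is not the paper's ARHT algorithm; the function names, the step size rule, the iteration count, and the synthetic instance are all illustrative assumptions.

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x and zero out the rest."""
    z = np.zeros_like(x)
    if s <= 0:
        return z
    idx = np.argpartition(np.abs(x), -s)[-s:]
    z[idx] = x[idx]
    return z

def iht_least_squares(A, b, s, step=None, iters=200):
    """Plain IHT for f(x) = 0.5 * ||Ax - b||^2 with sparsity level s."""
    n = A.shape[1]
    if step is None:
        # step = 1 / L, where L = ||A||_2^2 is the smoothness constant of f.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # gradient of the least-squares objective
        x = hard_threshold(x - step * grad, s)  # gradient step followed by hard thresholding
    return x

# Tiny usage example on a synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[rng.choice(200, size=5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = iht_least_squares(A, b, s=5)
print(np.linalg.norm(x_hat - x_true))
```

The sketch runs IHT at the target sparsity $s = s^*$; the guarantees discussed above concern the relaxed regime $s \leq s^*\gamma$, where the algorithm is allowed extra non-zero entries in exchange for provable convergence under a bounded restricted condition number.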
