Provable Compressed Sensing With Generative Priors via Langevin Dynamics

Deep generative models have emerged as a powerful class of priors for signals in various inverse problems such as compressed sensing, phase retrieval, and super-resolution. In this work, we assume that the unknown signal lies in the range of a pre-trained generative model. A popular approach to signal recovery is gradient descent in the low-dimensional latent space. While gradient descent has achieved good empirical performance, its theoretical behavior is not well understood. In this paper, we introduce the use of stochastic gradient Langevin dynamics (SGLD) for compressed sensing with a generative prior. Under mild assumptions on the generative model, we prove the convergence of SGLD to the true signal. We also demonstrate empirical performance competitive with standard gradient descent.
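To make the recovery procedure concrete, below is a minimal sketch of latent-space SGLD under linear measurements y = A G(z*), assuming a pre-trained generator G mapping a low-dimensional latent vector to the signal space. The function name `sgld_recover`, the step size `eta`, the inverse temperature `beta`, and the iteration count are illustrative assumptions, not the paper's exact algorithm or tuning.

```python
import torch

def sgld_recover(G, A, y, latent_dim, eta=1e-3, beta=1e4, n_iters=5000):
    """Sketch: recover x = G(z) from measurements y ~ A @ G(z*) via SGLD on z.

    G: pre-trained generator mapping R^latent_dim -> R^n (assumed given).
    A: (m, n) measurement matrix; y: (m,) observed measurements.
    """
    z = torch.zeros(latent_dim, requires_grad=True)  # latent iterate
    for _ in range(n_iters):
        # Squared-error loss f(z) = 0.5 * ||A G(z) - y||^2.
        loss = 0.5 * (A @ G(z) - y).pow(2).sum()
        (grad,) = torch.autograd.grad(loss, z)
        with torch.no_grad():
            # Langevin update: gradient step plus Gaussian noise
            # with variance 2 * eta / beta per coordinate.
            z = z - eta * grad + (2.0 * eta / beta) ** 0.5 * torch.randn_like(z)
        z.requires_grad_(True)
    return G(z).detach()
```

Setting the noise term to zero recovers plain gradient descent in the latent space, which is the baseline the paper compares against; the injected noise is what lets the iterates escape spurious stationary points and underlies the convergence guarantee.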
