GAN-based Projector for Faster Recovery in Compressed Sensing with Convergence Guarantees

A Generative Adversarial Network (GAN) with a generator $G$ trained to model the prior of images has been shown to outperform sparsity-based regularizers in ill-posed inverse problems. In this work, we propose a new method for deploying a GAN-based prior to solve linear inverse problems using projected gradient descent (PGD). Our method learns a network-based projector for use in the PGD algorithm, eliminating the need for expensive computation of the Jacobian of $G$. Experiments show that our approach provides a $30\text{-}40\times$ speed-up over earlier GAN-based recovery methods at similar accuracy in compressed sensing. Our main theoretical result is that if the measurement matrix is moderately conditioned for range($G$) and the projector is $\delta$-approximate, then the algorithm is guaranteed to reach $O(\delta)$ reconstruction error in $O(\log(1/\delta))$ steps in the low-noise regime. Additionally, we propose a fast method to design such measurement matrices for a given $G$. Extensive experiments demonstrate the efficacy of this method, which requires $5\text{-}10\times$ fewer measurements than random Gaussian measurement matrices for comparable recovery performance.
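The PGD recovery loop described above alternates a gradient step on the measurement loss with an application of the learned projector. The sketch below illustrates this iteration structure; it is not the paper's implementation. In particular, the `projector` here is a hypothetical stand-in (an orthogonal projection onto a fixed subspace playing the role of range($G$)), whereas the paper learns this map as a network.

```python
import numpy as np

def pgd_recover(y, A, projector, eta=0.5, n_steps=200):
    """Projected gradient descent for y ~= A x with a plug-in projector.

    Each iteration takes a gradient step on ||y - A x||^2, then applies
    the projector to pull the iterate back onto the model's range.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_steps):
        x = x + eta * A.T @ (y - A @ x)   # gradient step on the data fit
        x = projector(x)                  # project back onto range(G)
    return x

# Toy stand-in for range(G): a k-dimensional subspace, with the exact
# orthogonal projection in place of the learned network projector.
rng = np.random.default_rng(0)
n, m, k = 50, 25, 5
U, _ = np.linalg.qr(rng.standard_normal((n, k)))
projector = lambda x: U @ (U.T @ x)

x_true = U @ rng.standard_normal(k)            # signal in the model's range
A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x_true                                  # noiseless measurements

x_hat = pgd_recover(y, A, projector)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative error: {err:.2e}")
```

With $m > k$ and a well-conditioned measurement operator restricted to the subspace, the iteration contracts geometrically, mirroring the $O(\log(1/\delta))$ convergence claim in the abstract.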
