Exploiting Restricted Boltzmann Machines and Deep Belief Networks in Compressed Sensing

This paper proposes a compressed sensing (CS) scheme that exploits the representational power of restricted Boltzmann machines and deep learning architectures to model the prior distribution of the sparsity pattern of signals belonging to the same class. The learned probability distribution is then used in a maximum a posteriori (MAP) approach for reconstruction. The parameters of the prior distribution are learned from training data. The motivation behind this approach is to model the higher-order statistical dependencies between the coefficients of the sparse representation, with the final goal of improving reconstruction quality. The performance of the proposed method is validated on the Berkeley Segmentation Dataset and the MNIST Database of handwritten digits.
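The idea described above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's actual algorithm): an RBM defines an unnormalized log-prior over binary support vectors via its free energy, and a greedy search combines that prior with a least-squares data-fit term to pick a MAP-style support. The RBM parameters `W`, `b`, `c` are drawn at random here purely for demonstration; in the paper's setting they would be learned from training supports (e.g. by contrastive divergence), and the greedy bit-flip search stands in for a proper MAP optimizer.

```python
import numpy as np

rng = np.random.default_rng(0)
n, h, m = 16, 8, 10  # signal dimension, RBM hidden units, measurements

# Hypothetical RBM parameters; in practice these would be learned from
# training data rather than sampled at random.
W = rng.normal(scale=0.1, size=(n, h))  # visible-hidden weights
b = rng.normal(scale=0.1, size=n)       # visible biases
c = rng.normal(scale=0.1, size=h)       # hidden biases

def rbm_log_prior(s):
    """Unnormalized log p(s) for a binary support vector s: the negative
    RBM free energy, b^T s + sum_j log(1 + exp(c_j + (s^T W)_j)."""
    return s @ b + np.sum(np.logaddexp(0.0, c + s @ W))

# Toy compressed-sensing problem: y = A x0 with a 3-sparse x0.
A = rng.normal(size=(m, n)) / np.sqrt(m)
x0 = np.zeros(n)
true_support = rng.choice(n, size=3, replace=False)
x0[true_support] = rng.normal(size=3)
y = A @ x0

def map_objective(s, lam=1.0):
    """MAP-style score: negative squared residual of the least-squares fit
    restricted to the support s, plus the weighted RBM log-prior."""
    idx = np.flatnonzero(s)
    if idx.size == 0:
        resid = y
    else:
        As = A[:, idx]
        coef, *_ = np.linalg.lstsq(As, y, rcond=None)
        resid = y - As @ coef
    return -np.sum(resid ** 2) + lam * rbm_log_prior(s)

# Greedy support search: repeatedly flip the single bit that most
# increases the objective; stop when no flip improves it.
s = np.zeros(n)
improved = True
while improved:
    improved = False
    current = map_objective(s)
    flips = []
    for i in range(n):
        t = s.copy()
        t[i] = 1 - t[i]
        flips.append(map_objective(t))
    i_best = int(np.argmax(flips))
    if flips[i_best] > current:
        s[i_best] = 1 - s[i_best]
        improved = True

print("estimated support:", sorted(np.flatnonzero(s).tolist()))
```

Because each flip is accepted only if it raises the objective, the final support always scores at least as well as the empty support; with a trained prior, supports that match the learned statistics of the signal class are favored over equally good data fits.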
