Multilevel dictionary learning for sparse representation of images

Adaptive, data-driven dictionaries for sparse approximation outperform predefined dictionaries in applications involving the representation and classification of data. In this paper, we propose a novel algorithm for learning global dictionaries particularly suited to the sparse representation of natural images. The proposed algorithm uses a hierarchical, energy-based learning approach to build a multilevel dictionary: the atoms that contribute the most energy to the representation are learned in the first level, and those that contribute progressively less energy are learned in subsequent levels. The learned multilevel dictionary is compared to a dictionary learned with the K-SVD algorithm. Reconstruction results using a small number of non-zero coefficients demonstrate the advantage of exploiting the energy hierarchy with multilevel dictionaries, pointing to potential applications in low-bit-rate image compression. Superior performance in compressed sensing with optimized sensing matrices and a small number of measurements is also demonstrated.
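The level-by-level structure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's algorithm: it assumes each level performs 1-sparse coding via K-hyperline clustering (fitting one unit-norm atom per cluster of samples) and passes the residual of each level down to the next, so that early levels absorb the highest-energy components. Function names, the number of levels, and the toy data are all hypothetical choices for the sketch.

```python
import numpy as np

def k_hyperline(X, K, iters=20, seed=0):
    """1-sparse dictionary learning (K-hyperline clustering):
    each column of X is approximated by a scaled single atom."""
    rng = np.random.default_rng(seed)
    D = X[:, rng.choice(X.shape[1], K, replace=False)].copy()
    D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12
    for _ in range(iters):
        idx = np.argmax(np.abs(D.T @ X), axis=0)  # best atom per sample
        for k in range(K):
            Xk = X[:, idx == k]
            if Xk.shape[1] == 0:
                continue  # leave unused atoms untouched
            # atom update: principal direction of its assigned samples
            u, _, _ = np.linalg.svd(Xk, full_matrices=False)
            D[:, k] = u[:, 0]
    return D

def multilevel_dictionary(X, K_per_level, levels):
    """Learn one sub-dictionary per level on the residual of the
    previous level; early levels capture the high-energy structure."""
    R, dicts = X.copy(), []
    for _ in range(levels):
        D = k_hyperline(R, K_per_level)
        coeffs = D.T @ R
        idx = np.argmax(np.abs(coeffs), axis=0)
        # 1-sparse approximation at this level (projection onto one atom)
        approx = D[:, idx] * coeffs[idx, np.arange(R.shape[1])]
        R = R - approx  # residual is passed to the next level
        dicts.append(D)
    return dicts, R

# toy usage on random "patch" vectors
X = np.random.default_rng(1).standard_normal((16, 200))
dicts, R = multilevel_dictionary(X, K_per_level=8, levels=3)
print(len(dicts), np.linalg.norm(R) < np.linalg.norm(X))
```

Because each level's approximation is an orthogonal projection onto a single unit-norm atom, the residual energy is non-increasing from level to level, which is the hierarchy the abstract exploits for low bit-rate reconstruction.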
