Learning Overcomplete Signal Sparsifying Transforms

I. TRANSFORM LEARNING

The formulations for learning synthesis [1] and analysis [2], [3] sparsifying dictionaries are typically non-convex and NP-hard, and the approximate algorithms for them remain computationally expensive. As an alternative, we recently introduced an approach for learning square sparsifying transforms W ∈ Rm×n, m = n [4], which are competitive with overcomplete synthesis and analysis dictionaries in image denoising at a fraction of the computational cost. In this work, we extend the learning to the overcomplete case, i.e., m > n. The classical transform model for a signal y ∈ Rn is Wy = x + e, where W ∈ Rm×n is a sparsifying transform, x ∈ Rm is sparse (∥x∥0 ≪ m), and e is a small residual in the transform domain [4]. Given a matrix Y ∈ Rn×N whose columns are training signals, our formulation [4] for learning a square transform W ∈ Rn×n minimizes the sparsification error ∥WY − X∥²_F jointly over W and the sparse code matrix X, together with a regularizer involving −log |det W| and ∥W∥²_F that rules out trivial (e.g., zero) solutions and controls the conditioning of W, subject to the constraint that each column of X has at most s nonzeros.
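A key advantage of the transform model is that, for a fixed W, the sparse coding step is cheap and exact: the best s-sparse x minimizing ∥Wy − x∥2 is obtained by zeroing all but the s largest-magnitude entries of Wy. A minimal NumPy sketch of this step (the function name and the random dimensions are illustrative, not from the paper):

```python
import numpy as np

def transform_sparse_code(W, Y, s):
    """Exact transform-domain sparse coding: for each column y of Y,
    keep the s largest-magnitude entries of W @ y and zero the rest.
    Unlike synthesis sparse coding, which is NP-hard in general, this
    projection is a closed-form solution."""
    Z = W @ Y                                   # transform coefficients, m x N
    X = np.zeros_like(Z)
    idx = np.argsort(-np.abs(Z), axis=0)[:s]    # s largest entries per column
    cols = np.arange(Z.shape[1])
    X[idx, cols] = Z[idx, cols]
    return X

rng = np.random.default_rng(0)
n, m, N, s = 8, 8, 20, 3
W = rng.standard_normal((m, n))   # square transform (m = n)
Y = rng.standard_normal((n, N))   # training signals as columns
X = transform_sparse_code(W, Y, s)
E = W @ Y - X                     # transform-domain residual e
print(np.count_nonzero(X, axis=0))
```

Alternating this thresholding step with an update of W is the basic structure of the learning algorithm; the transform update uses the regularized least-squares objective above.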

[1] Y. Bresler et al., "Learning overcomplete sparsifying transforms for signal processing," in Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing (ICASSP), 2013.
[2] Y. Bresler et al., "Learning sparsifying transforms," IEEE Transactions on Signal Processing, 2013.
[3] R. Gribonval et al., "Noise aware analysis operator learning for approximately cosparse signals," in Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing (ICASSP), 2012.
[4] M. Elad et al., "Image denoising via sparse and redundant representations over learned dictionaries," IEEE Transactions on Image Processing, 2006.
[5] M. Elad et al., "K-SVD dictionary-learning for the analysis sparse model," in Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing (ICASSP), 2012.
[6] A. Bruckstein et al., "K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation," 2005.