Low Rank Activations for Tensor-Based Convolutional Sparse Coding

In this article, we propose to extend the classical Convolutional Sparse Coding (CSC) model to multivariate data by introducing a new tensor CSC model that enforces both sparsity and a low-rank constraint on the activations. The advantages of this model are threefold. First, by using tensor algebra, the model takes into account the underlying structure of the data. Second, it allows for complex atoms while requiring fewer activations to decompose the data, resulting in an improved summary (dictionary) and a better reconstruction of the original multivariate signal. Third, the number of parameters to be estimated is greatly reduced by the low-rank constraint. We derive the associated optimization problem and propose a framework based on alternating optimization to solve it. Finally, we evaluate the model on both synthetic and real data.
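To make the model concrete, the following is a minimal sketch of how a multivariate signal can be reconstructed from sparse, low-rank activations. It assumes the simplest case, rank-1 structure per atom: each atom k contributes a spatial pattern `u[k]` combined, via an outer product, with the convolution of a sparse temporal activation `z[k]` and a temporal atom `v[k]`. All names and shapes here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def reconstruct(u, v, z):
    """Rank-1 multivariate CSC reconstruction (illustrative sketch).

    u : (K, P) array, spatial pattern of each of the K atoms over P channels
    v : (K, L) array, temporal atom of length L for each atom
    z : (K, T - L + 1) array, sparse temporal activations

    Returns X_hat of shape (P, T), the reconstructed multivariate signal.
    """
    K, P = u.shape
    L = v.shape[1]
    T = z.shape[1] + L - 1
    X_hat = np.zeros((P, T))
    for k in range(K):
        # Full 1-D convolution of the sparse activation with the temporal atom.
        temporal = np.convolve(z[k], v[k])
        # Rank-1 contribution: spatial pattern times temporal profile.
        X_hat += np.outer(u[k], temporal)
    return X_hat
```

Under this rank-1 assumption, each atom's contribution is described by only P + L + (T - L + 1) parameters instead of a full P x T activation map, which illustrates the parameter reduction the low-rank constraint provides.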
