Identification of Kronecker-structured dictionaries: An asymptotic analysis

This work derives conditions for the asymptotic recovery of Kronecker-structured dictionaries underlying second-order tensor data. Given second-order tensor observations (equivalently, matrix-valued data samples) generated from a Kronecker-structured dictionary and sparse coefficient tensors, conditions on the dictionary and the coefficient distribution are derived that enable asymptotic recovery of the individual coordinate dictionaries comprising the Kronecker dictionary, within a local neighborhood of the true model. These conditions constitute a first step toward understanding the sample complexity of Kronecker-structured dictionary learning for second- and higher-order tensor data.
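The generative model described above can be sketched as follows. This is an illustrative example, not code from the paper: the dimensions, sparsity level, and variable names are assumed for demonstration. It shows how a matrix-valued observation produced by two coordinate dictionaries is equivalent, after vectorization, to a sparse representation under their Kronecker product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Coordinate dictionaries with unit-norm columns (sizes are illustrative).
m1, p1 = 8, 12   # rows: signal dimension; cols: number of atoms
m2, p2 = 6, 10
A = rng.standard_normal((m1, p1))
B = rng.standard_normal((m2, p2))
A /= np.linalg.norm(A, axis=0)
B /= np.linalg.norm(B, axis=0)

# Sparse coefficient matrix X with s nonzero entries.
s = 4
X = np.zeros((p1, p2))
support = rng.choice(p1 * p2, size=s, replace=False)
X.flat[support] = rng.standard_normal(s)

# Matrix-valued (second-order tensor) observation: Y = A X B^T.
Y = A @ X @ B.T

# Equivalent vectorized model via the Kronecker identity
# vec(A X B^T) = (B kron A) vec(X), using column-major vec.
D = np.kron(B, A)
vecY = D @ X.flatten(order="F")
assert np.allclose(vecY, Y.flatten(order="F"))
```

The dictionary-learning problem the abstract refers to is the inverse task: given many such observations Y, recover the coordinate dictionaries A and B (up to the usual ambiguities) rather than the much larger unstructured Kronecker dictionary D.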
