Hybrid Kronecker Product Decomposition and Approximation

Discovering the underlying low-dimensional structure of high-dimensional data has attracted a significant amount of research recently and has been shown to have a wide range of applications. As an effective dimension-reduction tool, singular value decomposition is often used to analyze high-dimensional matrices, which are traditionally assumed to admit a low-rank matrix approximation. In this paper, we propose a new approach. We assume that a high-dimensional matrix can be approximated by a sum of a small number of Kronecker products of matrices with potentially different configurations, named a hybrid Kronecker outer Product Approximation (hKoPA). It provides an extremely flexible way of dimension reduction compared to the low-rank matrix approximation. Challenges arise in estimating an hKoPA when the configurations of the component Kronecker products are different or unknown. We propose an estimation procedure when the set of configurations is given, and a joint configuration determination and component estimation procedure when the configurations are unknown. Specifically, a least squares backfitting algorithm is used when the configuration is given; when the configuration is unknown, an iterative greedy algorithm is used. Both simulation and real image examples show that the proposed algorithms have promising performance. The hybrid Kronecker product approximation may have potentially wider applications in the low-dimensional representation of high-dimensional data.
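To make the ideas concrete, the following is a minimal NumPy sketch, not the paper's exact procedure. It fits a single Kronecker product Y ≈ A ⊗ B for a given configuration via the classical Van Loan–Pitsianis rearrangement (the Kronecker least squares problem becomes a rank-1 approximation of a rearranged matrix), and then uses that fit inside a simple least squares backfitting loop over several components with different configurations, in the spirit of the abstract. The function names `kopa_fit` and `hkopa_backfit` are illustrative, not from the paper.

```python
import numpy as np

def kopa_fit(Y, m1, n1, m2, n2):
    """Illustrative single-term fit Y ~ A kron B, with A (m1 x n1) and
    B (m2 x n2), so Y must be (m1*m2) x (n1*n2).  Uses the Van Loan-
    Pitsianis rearrangement: each m2 x n2 block of Y becomes one row of
    R, and ||Y - A kron B||_F = ||R - vec(A) vec(B)^T||_F, so the best
    Kronecker term comes from the leading singular pair of R."""
    R = np.empty((m1 * n1, m2 * n2))
    for j in range(n1):
        for i in range(m1):
            block = Y[i * m2:(i + 1) * m2, j * n2:(j + 1) * n2]
            R[j * m1 + i, :] = block.reshape(-1, order="F")
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    A = (np.sqrt(s[0]) * U[:, 0]).reshape(m1, n1, order="F")
    B = (np.sqrt(s[0]) * Vt[0, :]).reshape(m2, n2, order="F")
    return A, B

def hkopa_backfit(Y, configs, n_iter=30):
    """Least squares backfitting sketch for Y ~ sum_k A_k kron B_k with
    a given list of configurations (m1, n1, m2, n2): each pass refits
    one component against the residual left by all the others."""
    comps = [np.zeros_like(Y) for _ in configs]
    for _ in range(n_iter):
        for k, cfg in enumerate(configs):
            resid = Y - sum(c for j, c in enumerate(comps) if j != k)
            A, B = kopa_fit(resid, *cfg)
            comps[k] = np.kron(A, B)
    return comps
```

When Y is exactly one Kronecker product, the rearranged matrix R is rank 1 and `kopa_fit` recovers it exactly (up to the usual scaling ambiguity between A and B, which cancels in A ⊗ B). The backfitting loop is a plain alternating least squares scheme and is only a sketch of the given-configuration case; the configuration-selection step for unknown configurations is not shown.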
