A novel extension of Generalized Low-Rank Approximation of Matrices based on multiple pairs of transformations

Dimensionality reduction is a key step in the learning process and plays an essential role in many applications. The most popular methods in this field, such as SVD, PCA, and LDA, can only be applied to data in vector format. This means that higher-order data such as matrices, or more generally tensors, must first be unfolded into vectors; as a result, the spatial relations among features are ignored and the risk of over-fitting increases. To address these issues, methods such as Generalized Low-Rank Approximation of Matrices (GLRAM) and Multilinear PCA (MPCA) have been proposed in recent years that handle the data in its original format. These methods preserve the spatial relationships among features, reduce the risk of over-fitting, and have lower time and space complexity than vector-based approaches. However, because they have far fewer parameters, the search space of a multilinear method is much smaller than that of a vector-based one. To overcome this drawback of multilinear methods such as GLRAM, we propose a new method that is a general form of GLRAM and, while preserving its merits, has a larger search space. Experimental results confirm the quality of the proposed method. Moreover, applying this approach to other multilinear dimensionality reduction methods such as MPCA and MLDA is straightforward.
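To make the baseline concrete, below is a minimal numpy sketch of the classical GLRAM alternating procedure (Ye, 2004) that the proposed method generalizes. GLRAM seeks a single pair of column-orthonormal transforms (L, R) maximizing Σᵢ ‖Lᵀ Aᵢ R‖²_F and alternates eigen-updates of L and R until a local optimum. The function name `glram` and the parameters `l1`, `l2`, and `n_iter` are illustrative choices, not names from the paper.

```python
import numpy as np

def glram(As, l1, l2, n_iter=20, seed=0):
    """Sketch of single-pair GLRAM: alternating maximization of
    sum_i ||L^T A_i R||_F^2 over column-orthonormal L (r x l1)
    and R (c x l2). `As` is a list of equally sized 2-D arrays."""
    r, c = As[0].shape
    rng = np.random.default_rng(seed)
    # Initialize R with random orthonormal columns.
    R = np.linalg.qr(rng.standard_normal((c, l2)))[0]
    for _ in range(n_iter):
        # Fix R: L spans the top-l1 eigenvectors of sum_i A_i R R^T A_i^T.
        ML = sum(A @ R @ R.T @ A.T for A in As)
        L = np.linalg.eigh(ML)[1][:, -l1:]   # eigh sorts ascending
        # Fix L: R spans the top-l2 eigenvectors of sum_i A_i^T L L^T A_i.
        MR = sum(A.T @ L @ L.T @ A for A in As)
        R = np.linalg.eigh(MR)[1][:, -l2:]
    # Compressed core matrices M_i = L^T A_i R.
    Ms = [L.T @ A @ R for A in As]
    return L, R, Ms
```

Each matrix Aᵢ is then stored as the small core Mᵢ = Lᵀ Aᵢ R and reconstructed as Aᵢ ≈ L Mᵢ Rᵀ. As the title suggests, the proposed extension enlarges this search space by learning multiple such pairs of transformations rather than the single pair used above.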
