Fast Hypergraph Regularized Nonnegative Tensor Ring Factorization Based on Low-Rank Approximation

For high-dimensional data representation, nonnegative tensor ring (NTR) decomposition equipped with manifold learning has become a promising model for exploiting the multi-dimensional structure of tensor data and extracting features from it. However, existing methods such as graph regularized nonnegative tensor ring decomposition (GNTR) only model pairwise similarities between objects. For tensor data with a complex manifold structure, a simple graph cannot accurately capture the similarity relationships. In this paper, in order to effectively exploit the higher-order and complicated similarities among objects, we introduce a hypergraph into the NTR framework to further enhance feature extraction, upon which a hypergraph regularized nonnegative tensor ring decomposition (HGNTR) method is developed. To reduce the computational complexity and suppress noise, we apply a low-rank approximation technique to accelerate HGNTR (called LraHGNTR). Our experimental results show that, compared with other state-of-the-art algorithms, the proposed HGNTR and LraHGNTR achieve higher performance in clustering tasks; in addition, LraHGNTR greatly reduces running time without decreasing accuracy.
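The hypergraph regularizer at the heart of HGNTR can be illustrated with a small, self-contained sketch. The block below is not the authors' implementation: it assumes each sample and its k nearest neighbours form one hyperedge with unit weight, builds the usual hypergraph Laplacian L = Dv - H W De^{-1} H^T, and shows how such a Laplacian would enter a multiplicative update for a single nonnegative factor (a plain matrix factorization stands in for the tensor ring cores). The function names `hypergraph_laplacian` and `regularized_update` and the regularization weight `lam` are illustrative only.

```python
# Illustrative sketch only -- not the HGNTR/LraHGNTR code from the paper.
import numpy as np


def hypergraph_laplacian(X, k=5):
    """Hypergraph Laplacian L = Dv - H W De^{-1} H^T for data X of shape
    (n_features, n_samples); hyperedge j = {sample j and its k nearest
    neighbours}, with all hyperedge weights set to 1 (an assumption)."""
    n = X.shape[1]
    sq = np.sum(X ** 2, axis=0)
    dist = sq[:, None] + sq[None, :] - 2.0 * X.T @ X   # squared distances
    H = np.zeros((n, n))                                # incidence matrix
    for j in range(n):
        idx = np.argsort(dist[:, j])[: k + 1]           # includes j itself
        H[idx, j] = 1.0
    w = np.ones(n)                                      # unit edge weights
    De = H.sum(axis=0)                                  # hyperedge degrees
    Dv = (H * w).sum(axis=1)                            # vertex degrees
    return np.diag(Dv) - H @ np.diag(w / De) @ H.T


def regularized_update(X, U, V, L, lam=0.1, eps=1e-9):
    """One multiplicative update for X ~= U V^T with the added penalty
    lam * tr(V^T L V); L is split element-wise into its positive and
    negative parts so the update keeps U and V nonnegative."""
    Lp, Lm = np.maximum(L, 0.0), np.maximum(-L, 0.0)
    U *= (X @ V) / (U @ (V.T @ V) + eps)
    V *= (X.T @ U + lam * (Lm @ V)) / (V @ (U.T @ U) + lam * (Lp @ V) + eps)
    return U, V
```

In an HGNTR-style algorithm, V would correspond to the unfolded tensor ring core associated with the sample mode, and the same Laplacian term would appear in its multiplicative update. Roughly speaking, the LraHGNTR acceleration additionally replaces X with a low-rank approximation before the iterations begin, so the expensive matrix products in the updates operate on much smaller factors.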
