Nonnegatively Constrained Tensor Network for Classification Problems

In the era of big data, data are often multimodal and high-dimensional, and their processing and analysis are frequently performed in distributed computing systems, including cloud computing, fog computing, and ubiquitous network computing. However, the curse of dimensionality strongly limits the use of fundamental machine learning methods, and new state-of-the-art models for multi-way data are therefore needed. In this study, we discuss the possibility of using one specific topology of tensor networks for solving multi-label classification problems, under the assumption that the data are gathered in the form of nonnegatively constrained multi-way arrays and can be represented by a tensor-train model. We propose a new computational algorithm for extracting low-rank and nonnegative 2D features, and we show how to apply it to classification problems. Computational experiments demonstrate that our approach outperforms fundamental and state-of-the-art methods for dimensionality reduction and classification.

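To make the pipeline concrete, below is a minimal sketch of the general idea, not the authors' algorithm: a nonnegative tensor-train-style decomposition is approximated here by a sequence of plain NMF steps on the successive unfoldings (scikit-learn's NMF is a stand-in for the paper's update rules), and the final TT factor is used as a 2D nonnegative feature matrix for an off-the-shelf classifier. The tensor shapes, TT ranks, synthetic data, and choice of classifier are all assumptions made for illustration.

```python
# Sketch: sequential nonnegative factorization of TT unfoldings, then classification.
# NOT the paper's algorithm; scikit-learn NMF stands in for the proposed updates.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic nonnegative 4-way data: three feature modes I1 x I2 x I3, N samples in the last mode.
I1, I2, I3, N = 8, 8, 8, 200
labels = rng.integers(0, 2, size=N)                      # two classes (hypothetical data)
T = rng.random((I1, I2, I3, N)) + 0.5 * labels           # class-dependent nonnegative offset

def nn_factor(M, rank):
    """Nonnegative low-rank factorization M ~= W @ H via NMF."""
    model = NMF(n_components=rank, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(M)
    return W, model.components_

# TT-style sweep: factorize one unfolding at a time, carrying the residual factor forward.
r1, r2, r3 = 4, 4, 4
G1, R1 = nn_factor(T.reshape(I1, -1), r1)                # core 1: I1 x r1
G2, R2 = nn_factor(R1.reshape(r1 * I2, -1), r2)          # core 2: (r1*I2) x r2
G3, F  = nn_factor(R2.reshape(r2 * I3, -1), r3)          # core 3, and features F: r3 x N

# Columns of F are low-dimensional nonnegative features, one per sample.
X_tr, X_te, y_tr, y_te = train_test_split(F.T, labels, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```

In this sketch the sample mode is kept last so that, after the sweep, the remaining factor F directly plays the role of a 2D feature matrix; any standard classifier can then be trained on its columns.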