Multi-view Discriminative Learning via Joint Non-negative Matrix Factorization

Multi-view learning attempts to build a classifier with better performance by exploiting the relationships among multiple views. Existing approaches often focus on learning the consistency and/or complementarity among different views. However, not all consistent or complementary information is useful for learning; only class-specific discriminative information is essential. In this paper, we propose a new robust multi-view learning algorithm, called DICS, which explores the Discriminative and non-discriminative Information existing in the Common and view-Specific parts of different views via joint non-negative matrix factorization. The basic idea is to learn a latent common subspace and view-specific subspaces and, more importantly, to further extract the discriminative and non-discriminative information from all subspaces to support better classification. Extensive experiments on seven real-world data sets demonstrate the effectiveness of DICS and show its superiority over many state-of-the-art algorithms.
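
To make the common/view-specific factorization structure concrete, below is a minimal sketch, assuming a partially shared joint NMF of the form X_v ≈ W_v [H_c; H_v] with standard multiplicative updates. The function name joint_nmf, the dimensions k_common and k_specific, and the update rules are illustrative assumptions; the discriminative versus non-discriminative decomposition and any additional regularization terms used by DICS are not modeled here.

```python
# A minimal sketch (not the authors' exact formulation) of joint NMF with a
# shared ("common") coefficient matrix H_c and per-view specific matrices
# H_s[v], i.e.  X_v ~= W_v @ vstack([H_c, H_s[v]]).  The discriminative /
# non-discriminative split performed by DICS is omitted; the dimensions and
# update rules below follow standard multiplicative-update NMF.
import numpy as np

def joint_nmf(views, k_common=10, k_specific=5, n_iter=200, eps=1e-9, seed=0):
    """views: list of non-negative arrays, each of shape (d_v, n),
    all sharing the same number of columns (samples)."""
    rng = np.random.default_rng(seed)
    n = views[0].shape[1]
    H_c = rng.random((k_common, n))                      # shared coefficients
    H_s = [rng.random((k_specific, n)) for _ in views]   # view-specific coefficients
    W = [rng.random((X.shape[0], k_common + k_specific)) for X in views]

    for _ in range(n_iter):
        # Update each view's basis W_v with the standard multiplicative rule.
        for v, X in enumerate(views):
            H_full = np.vstack([H_c, H_s[v]])
            W[v] *= (X @ H_full.T) / (W[v] @ H_full @ H_full.T + eps)

        # Update the view-specific coefficients H_s[v].
        for v, X in enumerate(views):
            H_full = np.vstack([H_c, H_s[v]])
            W_spec = W[v][:, k_common:]
            H_s[v] *= (W_spec.T @ X) / (W_spec.T @ W[v] @ H_full + eps)

        # Update the shared coefficients H_c by aggregating over all views.
        num = np.zeros_like(H_c)
        den = np.zeros_like(H_c)
        for v, X in enumerate(views):
            H_full = np.vstack([H_c, H_s[v]])
            W_com = W[v][:, :k_common]
            num += W_com.T @ X
            den += W_com.T @ W[v] @ H_full
        H_c *= num / (den + eps)

    return W, H_c, H_s

# Example: two random non-negative views of the same 100 samples.
if __name__ == "__main__":
    X1 = np.random.rand(50, 100)
    X2 = np.random.rand(80, 100)
    W, H_c, H_s = joint_nmf([X1, X2])
    print(H_c.shape, H_s[0].shape, W[0].shape)  # (10, 100) (5, 100) (50, 15)
```

Because H_c is shared while each H_s[v] is free, the shared block captures information consistent across views and the specific blocks capture what each view contributes on its own; a classifier could then be trained on the concatenated coefficients.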
