One-step Multi-view Clustering with Diverse Representation

Multi-view clustering has attracted broad attention due to its capacity to exploit consistent and complementary information across views. Although tremendous progress has been made recently, most existing methods suffer from high computational complexity, preventing them from being applied to large-scale tasks. Multi-view clustering via matrix factorization is a representative approach to address this issue. However, most such methods map the data matrices into a fixed dimension, limiting the model's expressiveness. Moreover, many methods rely on a two-step process, i.e., multimodal learning followed by $k$-means, which inevitably yields sub-optimal clustering results. In light of this, we propose a one-step multi-view clustering method with diverse representation, which incorporates multi-view learning and $k$-means into a unified framework. Specifically, we first project the original data matrices into latent spaces of varying dimensions to capture comprehensive information and auto-weight them in a self-supervised manner. We then use these representation matrices of diverse dimensions directly to obtain consensus discrete clustering labels. The joint optimization of representation learning and clustering boosts the quality of the final results. Furthermore, we develop an efficient optimization algorithm with proven convergence to solve the resulting problem. Comprehensive experiments on various datasets demonstrate the promising clustering performance of the proposed method.
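
To make the overall pipeline concrete, below is a minimal sketch, not the authors' exact algorithm, of one-step multi-view clustering over diverse-dimension representations: each view is projected into its own latent space, views are auto-weighted, and a single set of discrete labels is optimized jointly across all views. The latent dimensions, the truncated-SVD projection, the inverse-residual weighting, and the alternating update order are all illustrative assumptions, not the formulation derived in the paper.

```python
# A hedged sketch of one-step multi-view clustering with diverse representations.
# Assumptions: each view is a dense (n_samples, n_features_v) matrix; projection,
# weighting, and update rules below are illustrative stand-ins for the paper's.
import numpy as np

def one_step_multiview_kmeans(views, latent_dims, n_clusters, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    n = views[0].shape[0]

    # Project each view into its own latent dimension via truncated SVD
    # (a stand-in for the paper's matrix-factorization projection).
    reps = []
    for X, d in zip(views, latent_dims):
        Xc = X - X.mean(axis=0)
        U, s, _ = np.linalg.svd(Xc, full_matrices=False)
        reps.append(U[:, :d] * s[:d])  # n x d_v latent representation

    # Initialize consensus labels randomly and view weights uniformly.
    labels = rng.integers(0, n_clusters, size=n)
    weights = np.full(len(views), 1.0 / len(views))

    for _ in range(n_iter):
        # Update per-view centroids given the shared discrete labels.
        centroids = []
        for H in reps:
            C = np.stack([H[labels == k].mean(axis=0) if np.any(labels == k)
                          else H[rng.integers(0, n)] for k in range(n_clusters)])
            centroids.append(C)

        # Update the consensus labels by minimizing the weighted sum of
        # squared distances over all views (the "one-step" fusion).
        total = np.zeros((n, n_clusters))
        for w, H, C in zip(weights, reps, centroids):
            d2 = ((H[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
            total += w * d2
        labels = total.argmin(axis=1)

        # Auto-weight views inversely to their clustering residual
        # (a self-supervised heuristic; the paper derives its own weights).
        residuals = np.array([((H - C[labels]) ** 2).sum()
                              for H, C in zip(reps, centroids)])
        inv = 1.0 / (residuals + 1e-12)
        weights = inv / inv.sum()

    return labels, weights
```

Because the labels are refined inside the same loop that fits the per-view representations' centroids and the view weights, no separate $k$-means post-processing step is needed, which is the property the one-step formulation is designed to provide.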
