Dual hybrid manifold regularized non-negative matrix factorization with discriminability for image clustering

Matrix factorization techniques are widely used in computer vision and data mining. Among them, Non-negative Matrix Factorization (NMF) has received considerable attention due to its parts-based representation and sparsity. Recent research has shown that not only are the data points sampled from a data manifold, but the features are also sampled from a feature manifold. To exploit the duality between the data space and the feature space, researchers have proposed various dual graph regularized matrix factorization methods. However, the existing methods share two shortcomings: 1) they use only one type of graph, the k-nearest-neighbors (KNN) graph, to approximate the complicated manifolds in the data space and the feature space; and 2) most of them ignore the discriminative information in image data. In this paper, we propose Dual Hybrid Manifold Regularized Non-negative Matrix Factorization with Discriminability (DHNMFD), a novel nonnegative representation learning algorithm for image clustering. On the one hand, a KNN graph and a sparse subspace clustering (SSC) based graph are linearly combined to maximally approximate the intrinsic manifolds in the data space and the feature space. On the other hand, approximate orthogonality constraints are imposed to capture the discriminative information of the data. We derive an iterative multiplicative updating rule to optimize DHNMFD. Experiments on two image datasets demonstrate the superiority of DHNMFD over state-of-the-art related methods.
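The general recipe the abstract describes — linearly combining a KNN graph with a second affinity graph, then using the resulting graph Laplacian as a manifold regularizer inside multiplicative NMF updates — can be sketched as follows. This is a minimal illustration, not the paper's DHNMFD algorithm: it regularizes only the data-side graph (the paper additionally regularizes the feature side and adds orthogonality constraints), `hybrid_graph` and `graph_reg_nmf` are hypothetical helper names, and a true SSC affinity matrix would come from solving a sparse self-representation problem rather than being supplied directly.

```python
import numpy as np

def knn_affinity(P, k=3):
    # Binary k-nearest-neighbor graph over the rows of P (one row per point).
    n = P.shape[0]
    dist = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist[i])[1:k + 1]   # nearest neighbors, excluding self
        W[i, nbrs] = 1.0
    return np.maximum(W, W.T)                 # symmetrize

def hybrid_graph(W_knn, W_ssc, mu=0.5):
    # Convex combination of two affinity graphs, plus its degree matrix.
    # The Laplacian of the hybrid graph is L = D - W.
    W = mu * W_knn + (1.0 - mu) * W_ssc
    D = np.diag(W.sum(axis=1))
    return W, D

def graph_reg_nmf(X, W, D, r=2, lam=0.1, iters=200, seed=0):
    # Multiplicative updates for the graph-regularized NMF objective
    #   min_{U,V >= 0} ||X - U V^T||_F^2 + lam * tr(V^T (D - W) V),
    # in the style of Cai et al.'s GNMF. X is (m features x n samples);
    # the graph (W, D) lives on the n samples.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, r)) + 0.1
    V = rng.random((n, r)) + 0.1
    eps = 1e-10                               # avoid division by zero
    for _ in range(iters):
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V
```

A dual-graph variant would additionally build a graph over the rows (features) of X and add a symmetric `tr(U^T L_feat U)` term to the update for U; the hybrid weighting `mu` then controls how much each graph trusts KNN locality versus the sparse self-representation structure.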
