Partial Multi-Label Learning via Multi-Subspace Representation

Partial Multi-Label Learning (PML) aims to learn from training data in which each instance is associated with a set of candidate labels, only a subset of which are relevant. Existing PML methods focus mainly on label disambiguation but overlook noise in the feature space. To tackle this problem, we propose a novel framework named partial multi-label learning via MUlti-SubspacE Representation (MUSER), which jointly accounts for redundant labels and noisy features during training. Specifically, we first decompose the original label space into a latent label subspace and a label correlation matrix to reduce the negative effect of redundant labels; we then exploit the correlations among features to map the original noisy feature space to a feature subspace that is robust to feature noise. Afterwards, we introduce a graph Laplacian regularization that constrains the label subspace to preserve the intrinsic structure among features, and we impose an orthogonality constraint on the feature correlations to guarantee the discriminability of the feature subspace. Extensive experiments conducted on various datasets demonstrate the superiority of the proposed method.
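To make the description above concrete, the components can be combined into one optimization problem. The following is a minimal illustrative sketch in our own notation, not necessarily the paper's exact objective: X (n x d) is the feature matrix, Y (n x l) the candidate label matrix, P (d x k) the feature-correlation projection onto the feature subspace, Q (n x c) the latent label subspace, C (c x l) the label correlation matrix, W (k x c) the predictor from the feature subspace to the label subspace, L_X a graph Laplacian built from feature similarities, and alpha, beta, gamma hypothetical trade-off weights:

\min_{Q,\,C,\,P,\,W}\; \|Y - QC\|_F^2 \;+\; \alpha\,\|XPW - Q\|_F^2 \;+\; \beta\,\mathrm{tr}\!\left(Q^{\top} L_X Q\right) \;+\; \gamma\left(\|C\|_F^2 + \|W\|_F^2\right) \quad \text{s.t.}\; P^{\top}P = I.

Under these assumptions, the first term performs the label-space decomposition, the second fits the latent labels from the projected features, the Laplacian term keeps the feature-manifold structure in the label subspace, and the orthogonality constraint keeps the feature subspace discriminative; problems of this form are typically solved by alternating minimization over Q, C, W, and P.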
