Multi-label feature selection with missing labels

Abstract. The steady increase in feature dimensionality imposes heavy time complexity and storage burdens on multi-label learning. Numerous multi-label feature selection techniques have been developed to alleviate the effects of high dimensionality. Existing multi-label feature selection algorithms assume that the labels of the training data are complete. However, this assumption does not always hold, because labeling data is costly and there is ambiguity among classes. Hence, in real-world applications, the available data usually carry an incomplete set of labels. In this paper, we present a novel multi-label feature selection model for the setting of missing labels. The proposed algorithm selects the most discriminative features and recovers the missing labels simultaneously. To remove irrelevant and noisy features, the effective $\ell_{2,p}$-norm ($0 < p \le 1$) regularization is imposed on the feature selection matrix.
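For context, the row-wise $\ell_{2,p}$-norm mentioned above has a standard definition; the following is a minimal sketch, assuming $W \in \mathbb{R}^{d \times c}$ is the feature selection matrix with one row $w^{i}$ per feature:

$$\|W\|_{2,p} = \Bigg(\sum_{i=1}^{d} \big\|w^{i}\big\|_{2}^{p}\Bigg)^{1/p}, \qquad 0 < p \le 1.$$

With $p \le 1$ the penalty drives entire rows of $W$ toward zero, so a feature whose row vanishes contributes to no label and can be discarded; smaller values of $p$ yield sparser but less convex problems. A short Python sketch of how such a penalty is typically evaluated and how features are then ranked by their row norms is given below; the matrix `W`, the function names, and the toy data are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def l2p_norm(W, p=0.5):
    """Row-wise l2,p-norm: (sum_i ||w^i||_2^p)^(1/p), with 0 < p <= 1."""
    row_norms = np.linalg.norm(W, ord=2, axis=1)  # ||w^i||_2 for each feature i
    return np.sum(row_norms ** p) ** (1.0 / p)

def rank_features(W):
    """Rank features by the l2-norm of their rows in W (descending).
    Rows with near-zero norm correspond to features the regularizer
    has effectively removed."""
    row_norms = np.linalg.norm(W, ord=2, axis=1)
    return np.argsort(-row_norms)

# Toy example: 5 features, 3 labels. In practice W would be learned by
# minimizing a multi-label loss plus the l2,p-norm penalty.
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 3))
W[[1, 3]] *= 1e-3  # pretend the regularizer zeroed out features 1 and 3
print(l2p_norm(W, p=0.5))
print(rank_features(W))  # features 1 and 3 are ranked last
```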
