A Multi-label Ensemble Method Based on Minimum Ranking Margin Maximization

Multi-label classification is the task of predicting a set of target labels for a given example. In this paper, we propose an ensemble method for multi-label classification that is designed to optimize a novel minimum ranking margin objective function. A boosting-style strategy is adopted to construct an accurate multi-label ensemble from multiple weak base classifiers. Experiments on several real-world multi-label classification tasks show that the proposed method achieves better performance than other well-established methods.
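To make the objective concrete, a common definition of an example's ranking margin in multi-label learning is the smallest score gap between any relevant label and any irrelevant label; the "minimum ranking margin" over a dataset would then be the smallest such value across examples. The sketch below illustrates this per-example quantity under that assumed definition (the function name and signature are illustrative, not the paper's):

```python
import numpy as np

def ranking_margin(scores, relevant):
    """Per-example ranking margin: the minimum score of a relevant
    label minus the maximum score of an irrelevant label.

    scores:   1-D array of per-label scores f(x, l) for one example
    relevant: boolean mask marking the example's true labels
    """
    rel = scores[relevant]
    irr = scores[~relevant]
    if rel.size == 0 or irr.size == 0:
        # Margin is undefined without both relevant and irrelevant labels.
        return np.inf
    return float(rel.min() - irr.max())

# Example: labels 0 and 2 are relevant; the worst-ranked relevant
# label (score 1.0) still beats the best irrelevant one (score 0.5).
scores = np.array([2.0, 0.5, 1.0, -1.0])
relevant = np.array([True, False, True, False])
margin = ranking_margin(scores, relevant)  # 1.0 - 0.5 = 0.5
```

A positive margin means every relevant label is ranked above every irrelevant label for that example; maximizing the minimum margin over the training set pushes the classifier toward correct rankings with the largest possible separation.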

[1]  Jason Weston,et al.  A kernel method for multi-labelled classification , 2001, NIPS.

[2]  Grigorios Tsoumakas,et al.  Mining Multi-label Data , 2010, Data Mining and Knowledge Discovery Handbook.

[3]  Yang Yu,et al.  Multi-label hypothesis reuse , 2012, KDD.

[4]  J. Kittler Image processing for remote sensing , 1983, Philosophical Transactions of the Royal Society of London. Series A, Mathematical and Physical Sciences.

[5]  Tibério S. Caetano,et al.  Reverse Multi-Label Learning , 2010, NIPS.

[6]  Xin Li,et al.  Multi-label Image Classification with A Probabilistic Label Enhancement Model , 2014, UAI.

[7]  Saso Dzeroski,et al.  An extensive experimental comparison of methods for multi-label learning , 2012, Pattern Recognit..

[8]  Grigorios Tsoumakas,et al.  Dealing with Concept Drift and Class Imbalance in Multi-Label Stream Classification , 2011, IJCAI.

[9]  Lior Rokach,et al.  Data Mining And Knowledge Discovery Handbook , 2005 .

[10]  Jiebo Luo,et al.  Learning multi-label scene classification , 2004, Pattern Recognit..

[11]  Tian Xia,et al.  A multi-class boosting method with direct optimization , 2014, KDD.

[12]  Kun Zhang,et al.  Multi-label learning by exploiting label dependency , 2010, KDD.

[13]  Grigorios Tsoumakas,et al.  Multi-Label Classification: An Overview , 2007, Int. J. Data Warehous. Min..

[14]  Tian Xia,et al.  Direct 0-1 Loss Minimization and Margin Maximization with Boosting , 2013, NIPS.

[15]  Alex M. Andrew,et al.  Boosting: Foundations and Algorithms , 2012 .

[16]  Wei Xue,et al.  Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence Probabilistic Multi-Label Classification with Sparse Feature Learning , 2022 .

[17]  Yoram Singer,et al.  BoosTexter: A Boosting-based System for Text Categorization , 2000, Machine Learning.

[18]  Grigorios Tsoumakas,et al.  Random K-labelsets for Multilabel Classification , 2022 .

[19]  Philip S. Yu,et al.  Multi-label Ensemble Learning , 2011, ECML/PKDD.

[20]  Zhi-Hua Zhou,et al.  ML-KNN: A lazy learning approach to multi-label learning , 2007, Pattern Recognit..

[21]  P. Tseng Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization , 2001 .

[22]  Geoff Holmes,et al.  Multi-label Classification Using Ensembles of Pruned Sets , 2008, 2008 Eighth IEEE International Conference on Data Mining.

[23]  Yoram Singer,et al.  Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers , 2000, J. Mach. Learn. Res..

[24]  Geoff Holmes,et al.  Classifier chains for multi-label classification , 2009, Machine Learning.

[25]  Min-Ling Zhang,et al.  A Review on Multi-Label Learning Algorithms , 2014, IEEE Transactions on Knowledge and Data Engineering.

[26]  Philip S. Yu,et al.  Multi-label Feature Selection for Graph Classification , 2010, 2010 IEEE International Conference on Data Mining.