Multi-label Classification Based on Adaptive Resonance Theory

This paper proposes a multi-label classification algorithm that follows an algorithm adaptation approach, combining Adaptive Resonance Theory (ART) with a Bayesian approach to label association. In the proposed algorithm, the prior probability and the likelihood are updated sequentially. Moreover, an ART-based clustering algorithm continually extracts information useful for multi-label classification and stores it on the prototype nodes that the clustering algorithm generates. Thanks to these properties, the proposed algorithm can learn multi-label data continually. Experimental results show that the proposed algorithm achieves better classification performance than typical multi-label classification algorithms.
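
To make the mechanism described above concrete, the following Python sketch maintains ART-style prototype nodes together with per-node label counts, and combines a sequentially updated prior (how often each node wins) with a likelihood (similarity of the presented sample to each prototype) to associate labels. It is a minimal illustration under assumed design choices (Gaussian similarity, a fixed vigilance threshold, a prototype learning rate, and a 0.5 decision threshold), not the authors' actual formulation.

```python
# Hypothetical sketch of ART-based clustering with Bayesian label association.
# All numeric choices below (vigilance, learning rate, thresholds) are assumptions.
import numpy as np


class ARTMultiLabelSketch:
    def __init__(self, vigilance=0.8, learning_rate=0.1):
        self.vigilance = vigilance          # similarity threshold for resonance
        self.learning_rate = learning_rate  # prototype update rate
        self.prototypes = []                # prototype (weight) vectors
        self.label_counts = []              # per-node counts of observed labels
        self.node_counts = []               # how many samples each node has won

    def _similarity(self, x, w):
        # Illustrative choice: Gaussian similarity between sample and prototype.
        return np.exp(-np.linalg.norm(x - w) ** 2)

    def partial_fit(self, x, labels):
        """Sequentially learn one sample x with its binary label vector."""
        x = np.asarray(x, dtype=float)
        labels = np.asarray(labels, dtype=float)
        if self.prototypes:
            sims = [self._similarity(x, w) for w in self.prototypes]
            j = int(np.argmax(sims))
            if sims[j] >= self.vigilance:
                # Resonance: move the winning prototype toward x and
                # accumulate the label information on that node.
                self.prototypes[j] += self.learning_rate * (x - self.prototypes[j])
                self.label_counts[j] += labels
                self.node_counts[j] += 1
                return
        # Otherwise generate a new prototype node holding this sample's information.
        self.prototypes.append(x.copy())
        self.label_counts.append(labels.copy())
        self.node_counts.append(1)

    def predict(self, x, threshold=0.5):
        """Associate labels with x via a simple Bayesian-style estimate."""
        x = np.asarray(x, dtype=float)
        sims = np.array([self._similarity(x, w) for w in self.prototypes])
        # Prior: relative frequency of each node; likelihood: similarity to x.
        prior = np.array(self.node_counts, dtype=float)
        prior /= prior.sum()
        posterior = sims * prior
        posterior /= posterior.sum()
        # Label probabilities: posterior-weighted label frequencies of the nodes.
        label_freq = np.array([c / n for c, n in zip(self.label_counts, self.node_counts)])
        label_prob = posterior @ label_freq
        return (label_prob >= threshold).astype(int)
```

Calling partial_fit on a stream of (feature vector, binary label vector) pairs and then predict on new samples mimics the continual multi-label learning setting described in the abstract.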
