Entropic Feature Discrimination Ability for Pattern Classification Based on Neural IAL

Incremental Attribute Learning (IAL) is a novel machine learning strategy in which features are gradually trained, one by one or group by group, according to some ordering. In IAL, feature ordering is a special preprocessing step. Apart from time-consuming contribution-based feature ordering methods, feature orderings can also be derived from filter criteria. In this paper, a novel criterion that combines Discriminability, a distribution-based metric, with Entropy is presented to rank features for ordering; it has been validated with neural networks on both two-category and multi-class classification problems. Experimental results show that the new metric is not only applicable to IAL but also achieves better performance in the form of lower error rates.
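
The abstract does not give the exact formulas for the Discriminability and Entropy terms, nor for how they are combined into a single rank, so the following Python sketch is only illustrative: it uses a Fisher-style between/within-class variance ratio as the discriminability score, the Shannon entropy of a histogram discretisation as the entropy term, and their product as the ranking criterion. All of these choices are assumptions for illustration, not the paper's definitions.

```python
# Hypothetical sketch of an entropy-weighted discriminability ranking for
# IAL feature ordering. The discriminability measure below is a Fisher-style
# class-separation score and the entropy term is computed on a histogram
# discretisation; both are illustrative assumptions.
import numpy as np

def discriminability(x, y):
    """Fisher-style between/within-class variance ratio for one feature."""
    classes = np.unique(y)
    overall_mean = x.mean()
    between = sum((y == c).sum() * (x[y == c].mean() - overall_mean) ** 2
                  for c in classes)
    within = sum(((x[y == c] - x[y == c].mean()) ** 2).sum() for c in classes)
    return between / (within + 1e-12)

def entropy(x, bins=10):
    """Shannon entropy of a histogram discretisation of one feature."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def feature_ordering(X, y):
    """Rank features by the product of discriminability and entropy
    (one plausible way of combining the two criteria), best first."""
    scores = [discriminability(X[:, j], y) * entropy(X[:, j])
              for j in range(X.shape[1])]
    return np.argsort(scores)[::-1]

# Example: order the features of a small synthetic two-class dataset.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X[:, 2] + 0.1 * rng.normal(size=200) > 0).astype(int)  # feature 2 is informative
    print(feature_ordering(X, y))  # feature 2 should rank first
```

In an IAL setting, the resulting ordering would then determine the sequence in which input attributes are introduced to the incrementally growing neural network.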
