Search and global minimization in similarity-based methods

The class of similarity-based methods (SBMs) covers most neural models and many other classifiers. The performance of such methods improves significantly when irrelevant features are removed and feature weights are introduced to scale each feature's influence on the similarity calculation. Several methods for feature selection and weighting are described. As an alternative to global minimization procedures, computationally efficient best-first search methods are advocated. Although these methods can be used with any SBM classifier, they have been tested with the k-NN method, since it is relatively fast and gives excellent results on some databases. A few illustrative examples show significant improvements due to feature weighting and selection.
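The combination described in the abstract, a feature-weighted k-NN classifier whose feature subset is chosen by a greedy best-first search rather than global minimization, can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the helper names (`weighted_knn_predict`, `loo_accuracy`, `best_first_selection`) and the use of leave-one-out accuracy as the search criterion are assumptions made for the example, and binary 0/1 weights are used to model pure selection (continuous weights would work the same way in the distance).

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, weights, k=3):
    """Classify x by majority vote among its k nearest training points,
    using a feature-weighted Euclidean distance."""
    diffs = (X_train - x) * weights            # weights scale each feature's influence
    dists = np.sqrt((diffs ** 2).sum(axis=1))
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

def loo_accuracy(X, y, weights, k=3):
    """Leave-one-out accuracy of the weighted k-NN on (X, y)."""
    n = len(X)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i               # hold out sample i
        if weighted_knn_predict(X[mask], y[mask], X[i], weights, k) == y[i]:
            correct += 1
    return correct / n

def best_first_selection(X, y, k=3):
    """Greedy best-first forward selection: starting from the empty feature
    set, repeatedly add the single feature that most improves leave-one-out
    accuracy; stop when no addition helps."""
    n_features = X.shape[1]
    selected = np.zeros(n_features)            # 0/1 weights = feature selection
    best_acc = 0.0
    improved = True
    while improved:
        improved = False
        for j in np.flatnonzero(selected == 0):
            trial = selected.copy()
            trial[j] = 1.0
            acc = loo_accuracy(X, y, trial, k)
            if acc > best_acc:
                best_acc, best_j = acc, j
                improved = True
        if improved:
            selected[best_j] = 1.0
    return selected, best_acc
```

Each step of the search evaluates at most one classifier per remaining feature, so the cost grows roughly quadratically with the number of features, far cheaper than global minimization over all weight vectors, at the price of possibly stopping in a local optimum.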
