Feature Selection Using a Hybrid Associative Classifier with Masking Techniques

Performance in most pattern classifiers improves when redundant or irrelevant features are removed; however, existing methods mainly achieve this by constructing a new classifier for each candidate feature subset. In this paper, hybrid classification and masking techniques are combined into a new feature selection approach. The algorithm uses a hybrid associative classifier to produce a mask that identifies an optimal subset of features without computing a new classifier at each step, allowing irrelevant or redundant features to be identified for classification purposes. Our results suggest that this method is a feasible way to identify an optimal subset of features.
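The masking idea can be illustrated with a minimal sketch. Here a simple correlation-matrix associative memory stands in for the paper's hybrid associative classifier, and a single greedy pass over the features builds the mask; the memory is trained once and only masked recalls are evaluated, so no classifier is retrained per step. All function names, the greedy loop, and the use of a correlation-matrix memory are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def train_memory(X, y, n_classes):
    # Correlation-matrix associative memory (Kohonen-style):
    # M accumulates the outer product of one-hot labels and inputs.
    M = np.zeros((n_classes, X.shape[1]))
    for xk, yk in zip(X, y):
        M[yk] += xk
    return M

def recall(M, X, mask):
    # Classify a masked input by the class row with maximum response.
    return np.argmax((X * mask) @ M.T, axis=1)

def select_features(X, y, n_classes):
    # Train the memory once; afterwards only masked recalls are scored.
    M = train_memory(X, y, n_classes)
    mask = np.ones(X.shape[1], dtype=bool)
    base = np.mean(recall(M, X, mask) == y)
    for j in range(X.shape[1]):
        trial = mask.copy()
        trial[j] = False  # tentatively mask out feature j
        # Keep feature j masked out if accuracy does not drop.
        if np.mean(recall(M, X, trial) == y) >= base:
            mask = trial
    return mask
```

On a toy set where only the first feature carries class information (e.g. `X = [[1,0],[1,1],[0,0],[0,1]]`, `y = [1,1,0,0]`), the returned mask retains feature 0 and discards the irrelevant feature 1, mirroring how the paper's mask flags redundant or irrelevant features.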
