Closed-loop optimization of features for neural classifiers

The selection and preprocessing of features are crucial to the success of a classifier in pattern recognition applications. Preprocessing often involves filtering, transformation and non-linear processing of the raw data. Since the amount of training data required grows exponentially with the number of features, reducing or transforming the feature set is essential. While it is frequently possible to select reasonable values for these parameters heuristically, an automated approach could be of great value in many application areas. Various factors relating to the optimization process are discussed, and results of continuous-wavelet-based feature optimization for seismic buffer recognition are presented.
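As a minimal illustration of the closed-loop idea, the sketch below searches over a single feature-extraction parameter (a wavelet-like filter scale) and scores each candidate by the validation accuracy of a simple classifier. The toy data, the cosine "wavelet" feature and the nearest-centroid classifier are all illustrative assumptions, not the paper's actual method or code.

```python
# Hypothetical sketch: an outer (closed) loop tunes a feature-extraction
# parameter by feeding classifier validation accuracy back into the search.
import math
import random

random.seed(0)

def make_signal(freq, n=64):
    """Sine of a class-specific frequency plus Gaussian noise (toy data)."""
    return [math.sin(2 * math.pi * freq * t / n) + random.gauss(0, 0.3)
            for t in range(n)]

def feature(signal, scale):
    """Energy of the signal correlated with a cosine filter of period `scale`."""
    n = len(signal)
    c = sum(s * math.cos(2 * math.pi * t / scale) for t, s in enumerate(signal))
    q = sum(s * math.sin(2 * math.pi * t / scale) for t, s in enumerate(signal))
    return (c * c + q * q) / n

# Two classes of signals dominated by different frequencies (4 vs 8 cycles).
train = ([(make_signal(4), 0) for _ in range(20)]
         + [(make_signal(8), 1) for _ in range(20)])
val = ([(make_signal(4), 0) for _ in range(10)]
       + [(make_signal(8), 1) for _ in range(10)])

def accuracy(scale):
    # Nearest-centroid classifier on the single extracted feature.
    cents = {}
    for lbl in (0, 1):
        feats = [feature(sig, scale) for sig, l in train if l == lbl]
        cents[lbl] = sum(feats) / len(feats)
    correct = sum(
        1 for sig, lbl in val
        if min(cents, key=lambda l: abs(feature(sig, scale) - cents[l])) == lbl)
    return correct / len(val)

# Closed loop: keep the scale whose features classify the validation set best.
best_scale = max(range(2, 33, 2), key=accuracy)
print(best_scale, accuracy(best_scale))
```

The key design point is that the feature parameter is evaluated through the classifier itself (a wrapper criterion) rather than by a fixed heuristic, which is what makes the loop "closed".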
