Pruning with interval arithmetic perceptron

Abstract — In this paper we present a training algorithm for the interval arithmetic perceptron (IAP), i.e. a perceptron that uses interval weights, and describe its use in input pruning. The algorithm rests on the observation that a weight whose interval has a negative lower bound and a positive upper bound (i.e. an interval containing zero) can be set to zero, so the corresponding input can be pruned. Our procedure has been tested on the Iris, Breast Cancer and Sonar databases, showing that many input features are unnecessary for satisfactory classification performance. Comparison with a well-established feature screening method showed good agreement, but also revealed some differences, reflecting the fact that the IAP is specifically tailored to classification problems.
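The pruning criterion stated in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: `prunable_inputs` and the sample weight intervals are invented names and values, and each interval weight is simply represented as a `(lower, upper)` pair.

```python
# Sketch of the IAP pruning rule: an interval weight [lo, hi] with
# lo < 0 < hi contains zero, so zero is an admissible value for that
# weight and the corresponding input can be pruned.

def prunable_inputs(interval_weights):
    """Return indices of inputs whose weight interval contains zero."""
    return [i for i, (lo, hi) in enumerate(interval_weights) if lo < 0.0 < hi]

# Hypothetical trained interval weights, one (lower, upper) pair per input.
weights = [(-0.2, 0.3), (0.5, 0.9), (-0.7, -0.1), (-0.05, 0.02)]
print(prunable_inputs(weights))  # intervals at indices 0 and 3 straddle zero
```

Inputs 0 and 3 would be candidates for removal here, since setting their weights to zero stays consistent with the learned intervals.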
