Improving the generalization ability of neural networks by interval arithmetic

Interval-arithmetic-based neural networks have recently been proposed for handling interval inputs to multilayer feedforward networks. This paper demonstrates that interval arithmetic can also be used to improve the generalization ability of neural networks on pattern classification problems. We examine two approaches: one applied in the classification phase for new patterns, the other in the learning phase. In the first approach, an interval input vector is generated from a new pattern by adding a certain width to each of its attribute values. In the second, neural networks are trained on interval input vectors generated from the training patterns. Both approaches are illustrated on a two-dimensional pattern classification problem, and their effectiveness is examined through computer simulations on a commonly used benchmark data set.
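The first approach (widening a new pattern into an interval input and propagating it through the network) can be sketched with standard interval arithmetic, where each weight contributes its minimizing or maximizing endpoint depending on its sign. This is a minimal illustration, not the paper's implementation; the single-hidden-layer architecture, the sigmoid activation, and all function names are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_interval_input(x, width):
    """Widen a pattern x into the interval vector [x - width, x + width]."""
    return x - width, x + width

def interval_dense(lo, hi, W, b):
    # Interval matrix-vector product (Moore's interval arithmetic):
    # positive weights take like endpoints, negative weights swap them.
    W_pos = np.maximum(W, 0.0)
    W_neg = np.minimum(W, 0.0)
    out_lo = W_pos @ lo + W_neg @ hi + b
    out_hi = W_pos @ hi + W_neg @ lo + b
    return out_lo, out_hi

def interval_forward(lo, hi, layers):
    # Propagate an interval vector through layers [(W, b), ...].
    # The sigmoid is monotone, so it maps endpoints to endpoints.
    for W, b in layers:
        lo, hi = interval_dense(lo, hi, W, b)
        lo, hi = sigmoid(lo), sigmoid(hi)
    return lo, hi
```

A pattern could then be classified by comparing the output intervals of the class units, e.g. assigning the class whose interval dominates; with zero width the computation reduces to the ordinary forward pass.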
