Neural networks have proven very useful in pattern classification, mapping input patterns into one of several categories. One widely used paradigm is the multi-layer perceptron trained by back-propagation of errors, often called a back-propagation network (BPN). Rather than being explicitly programmed, a BPN "learns" this mapping by exposure to a training set: a collection of input pattern samples paired with their corresponding output classifications. Proper construction of this training set is crucial to successful training of a BPN. One criterion for proper construction is that each class must be adequately represented: a class that appears less often in the training data may not be learned as completely or correctly, impairing the network's discrimination ability. This is due to the implicit setting of a priori probabilities that results from unequal sample sizes. The degree of impairment is a function of (among other factors) the relative number of samples of each class used for training. This paper addresses the problem of unequal representation in training sets by proposing two alternative learning methods. One adjusts the learning rate for each class to achieve user-specified goals. The other uses a genetic algorithm, with a fitness function based on these same goals, to set the connection weights. Both methods are tested on artificial and real-world training data.
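The first proposed method can be illustrated with a minimal sketch: scale the learning rate for each class inversely to its share of the training set, so that an under-represented class contributes as much total weight change as a frequent one. Everything below (the toy two-class data, the single logistic unit, and the specific scaling formula) is an assumption for illustration, not the paper's actual network or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Imbalanced two-class toy data (assumed for illustration):
# 90 samples of class 0 around (-1, -1), 10 of class 1 around (+1, +1).
X0 = rng.normal(loc=-1.0, size=(90, 2))
X1 = rng.normal(loc=+1.0, size=(10, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 90 + [1] * 10)

# Per-class learning rates: base rate divided by class frequency, so each
# class contributes equally to the cumulative weight update. The exact
# scaling rule here is a plausible choice, not the one from the paper.
base_eta = 0.05
counts = np.bincount(y)
eta_per_class = base_eta * (len(y) / (len(counts) * counts))

# A single logistic unit trained sample-by-sample with the delta rule,
# using the learning rate of the sample's class.
w = np.zeros(2)
b = 0.0
for _ in range(200):
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + np.exp(-(w @ xi + b)))
        grad = p - yi
        w -= eta_per_class[yi] * grad * xi
        b -= eta_per_class[yi] * grad

# Check how well the rare class is recognized after training.
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
acc_minority = float((preds[y == 1] == 1).mean())
print(acc_minority)
```

With this scaling the minority class receives a learning rate nine times larger than the majority class (0.25 vs. about 0.028 here), counteracting the implicit prior that unequal sample sizes would otherwise impose.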