A local approach for a fuzzy error function used in multilayer perceptron training through genetic algorithm
We previously (1995) proposed a new error function tailored to training a one-hidden-layer perceptron for classification problems, using a two-stage mechanism: the input space is split into regions, which are then combined according to the most representative class of the patterns they enclose. This crisp error function relies on the number of patterns of different classes falling in the same region. In this paper, we enhance the function with a fuzzy component that lets the region boundaries tend toward an optimal Bayesian separation. Genetic algorithms were previously used to compute the network weights; here we introduce a genetic enhancement based on a phenotypic operator, implemented through "fresh blood" strategies. This method exploits coarse a priori knowledge about the problem to drive the population evolution more efficiently. Numerous tests of the new approach show a dramatic additional improvement over the already efficient former one. Promising perspectives emerge from these performance results, together with the relevant knowledge conveyed by the region boundaries.
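The abstract does not give the exact error function or genetic operators, so the sketch below is only an illustration of the general scheme it describes: a one-hidden-layer perceptron whose hidden hyperplanes split the input space into regions, a region-based error counting patterns outside their region's majority class, a fuzzy membership that softens that count near region boundaries, and a GA with a "fresh blood" step. The region coding (sign pattern of hidden activations), the sigmoid membership, the random-immigrant fresh-blood operator, and all parameter values are assumptions, not the authors' published formulation.

```python
# Hedged sketch of a fuzzy, region-based error function optimized by a GA
# with a "fresh blood" step. All design details are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID = 2, 4                 # one-hidden-layer perceptron
CHROM_LEN = N_HID * (N_IN + 1)     # chromosome = hidden weights + biases

def regions(w, X):
    """Sign pattern of the hidden hyperplanes -> region label per pattern."""
    W = w.reshape(N_HID, N_IN + 1)
    act = X @ W[:, :N_IN].T + W[:, N_IN]
    signs = (act > 0).astype(int)
    margins = np.min(np.abs(act), axis=1)        # proxy distance to nearest boundary
    labels = signs @ (2 ** np.arange(N_HID))     # integer code of the region
    return labels, margins

def fuzzy_error(w, X, y, beta=4.0):
    """Crisp part: count patterns outside their region's majority class.
    Fuzzy part: weight each error by a membership that fades near region
    boundaries, so boundaries can drift toward a better separation."""
    labels, margins = regions(w, X)
    membership = 1.0 / (1.0 + np.exp(-beta * margins))   # ~0.5 on a boundary
    err = 0.0
    for r in np.unique(labels):
        idx = labels == r
        majority = np.bincount(y[idx]).argmax()          # most representative class
        err += np.sum(membership[idx] * (y[idx] != majority))
    return err

def evolve(X, y, pop_size=60, gens=200, fresh_every=20, fresh_frac=0.2):
    pop = rng.normal(size=(pop_size, CHROM_LEN))
    for g in range(gens):
        fit = np.array([fuzzy_error(ind, X, y) for ind in pop])
        pop = pop[np.argsort(fit)]                       # best individuals first
        children = []
        while len(children) < pop_size - 1:
            a, b = pop[rng.integers(0, pop_size // 2, 2)]  # parents from best half
            cut = rng.integers(1, CHROM_LEN)               # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(scale=0.1, size=CHROM_LEN) * (rng.random(CHROM_LEN) < 0.1)
            children.append(child)
        pop = np.vstack([pop[0], children])              # keep the current best (elitism)
        if g % fresh_every == 0:                         # "fresh blood": replace the worst
            k = int(fresh_frac * pop_size)               # individuals with new random ones
            pop[-k:] = rng.normal(size=(k, CHROM_LEN))
    return pop[0], fuzzy_error(pop[0], X, y)

if __name__ == "__main__":
    X = rng.normal(size=(200, N_IN))
    y = (X[:, 0] * X[:, 1] > 0).astype(int)              # toy XOR-like problem
    best, err = evolve(X, y)
    print(f"best fuzzy error: {err:.2f} over {len(y)} patterns")
```

Note that the output stage is implicit here, in line with the abstract's two-stage mechanism: each region is simply assigned the majority class of the patterns it encloses, and the GA only evolves the hidden-layer hyperplanes that define the regions.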