Evolutionary selection of neural networks satisfying leave-one-out criteria

Nonparametric inference error, the error arising from estimating the regression function from a labeled set of training examples, can be divided into two main contributions: bias and variance. Neural networks are one family of nonparametric models in which the bias/variance trade-off is hidden in the network architecture. In recent years, new and powerful tools for neural network selection have been developed to address the bias/variance dilemma, and the implemented solutions have given satisfying results [11,12]. We exploit the measures introduced in these works to implement a genetic algorithm for training neural networks. This method provides a reliable estimate of the generalization error of a neural model; estimating this error makes it possible to drive the genetic evolution correctly toward a model with the desired characteristics. After a brief description of the estimation technique, we test the genetic algorithm implementation on artificial data. Finally, we present the results of the fully automatic algorithm for neural network training and model selection applied to the investigation of the defect structure of semi-insulating materials, based on photo-induced transient spectroscopy experiments.
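The selection loop the abstract describes, a genetic algorithm whose fitness is a leave-one-out error estimate, can be sketched as follows. This is a minimal illustration, not the paper's implementation: a k-nearest-neighbour regressor stands in for the neural model so the example stays self-contained, and all data, parameter ranges, and names are invented for the sketch.

```python
import random

# Toy 1-D regression data: a smooth function plus noise (fixed seed).
random.seed(0)
xs = [i / 20 for i in range(40)]
ys = [x * (2 - x) + random.gauss(0, 0.05) for x in xs]

def knn_predict(k, train, x):
    """Predict y at x as the mean target of the k nearest training points."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return sum(p[1] for p in nearest) / k

def loo_error(k):
    """Leave-one-out mean squared error for a k-NN regressor:
    each point is predicted from all the others, then the errors are averaged."""
    data = list(zip(xs, ys))
    err = 0.0
    for i, (x, y) in enumerate(data):
        held_out = data[:i] + data[i + 1:]   # refit without point i
        err += (knn_predict(k, held_out, x) - y) ** 2
    return err / len(data)

def evolve(pop_size=8, generations=10):
    """Tiny GA: each individual is a model-complexity parameter k,
    and the fitness driving selection is the LOO error estimate."""
    pop = [random.randint(1, 20) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loo_error)               # lower LOO error = fitter
        parents = pop[:pop_size // 2]         # truncation selection
        children = [max(1, min(20, p + random.choice([-2, -1, 1, 2])))
                    for p in parents]         # mutate the complexity
        pop = parents + children
    return min(pop, key=loo_error)

best_k = evolve()
print(best_k)
```

In the paper, the individuals would encode network architectures and the fitness would be the analytic leave-one-out estimate of [11,14] rather than an explicit refit per point; the structure of the loop, however, is the same.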

[1] P. Kamiński et al., Implementation of Neural Network Method to Investigate Defect Centers in Semi-Insulating Materials, 2002.

[2] James L. McClelland, Parallel Distributed Processing, 2005.

[3] Gaetan Monari, Sélection de modèles non linéaires par "leave-one-out": étude théorique et application des réseaux de neurones au procédé de soudage par points, 1999.

[4] Hak-Keung Lam et al., Tuning of the structure and parameters of a neural network using an improved genetic algorithm, IEEE Trans. Neural Networks, 2003.

[5] J. S. F. Barker et al., Simulation of Genetic Systems by Automatic Digital Computers, 1958.

[6] Bruce Curry et al., Model selection in Neural Networks: Some difficulties, Eur. J. Oper. Res., 2006.

[7] Léon Personnaz et al., Jacobian Conditioning Analysis for Model Validation, Neural Computation, 2004.

[8] Ken-ichi Funahashi et al., On the approximate realization of continuous mappings by neural networks, Neural Networks, 1989.

[9] Darrell Whitley et al., A genetic algorithm tutorial, Statistics and Computing, 1994.

[10] Léon Personnaz et al., Nonlinear internal model control using neural networks: application to processes with delay and design issues, IEEE Trans. Neural Networks Learn. Syst., 2000.

[11] Gérard Dreyfus et al., Local Overfitting Control via Leverages, Neural Computation, 2002.

[12] G. S. May et al., Optimization of neural network structure and learning parameters using genetic algorithms, Proceedings Eighth IEEE International Conference on Tools with Artificial Intelligence, 1996.

[13] Léon Personnaz et al., Neural-network construction and selection in nonlinear modeling, IEEE Trans. Neural Networks, 2003.

[14] Gérard Dreyfus et al., Withdrawing an example from the training set: An analytic estimation of its effect on a non-linear parameterised model, Neurocomputing, 2000.

[15] Elie Bienenstock et al., Neural Networks and the Bias/Variance Dilemma, Neural Computation, 1992.

[16] Nils Aall Barricelli et al., Numerical testing of evolution theories, 1963.