Robust design of multilayer feedforward neural networks: an experimental approach

Abstract: Artificial neural networks (ANNs) have been successfully applied to a wide variety of problems. However, determining a suitable set of structural and learning parameter values for an ANN remains a difficult task. This article is concerned with the robust design of multilayer feedforward neural networks trained by the backpropagation algorithm (backpropagation net, BPN) and develops a systematic, experimental strategy that emphasizes the simultaneous optimization of BPN parameters under various noise conditions. Unlike previous work, the robust design problem is formulated here as a Taguchi dynamic parameter design problem, together with a fine-tuning of the BPN output when necessary. A series of computational experiments is conducted using data sets from various sources. From the computational results, statistically significant effects of the BPN parameters on the robustness measure (i.e., the signal-to-noise ratio) are identified, based on which an economical experimental strategy is derived. It is also shown that fine-tuning the BPN output is effective in improving the signal-to-noise ratio. Finally, the step-by-step procedure for implementing the proposed approach is illustrated with an example.
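For context, a minimal sketch of the signal-to-noise ratio commonly used as the robustness measure in Taguchi's dynamic parameter design is given below; the exact formulation adopted in the article may differ. Assuming the ideal function relating the signal factor M to the response y is y = βM, the dynamic S/N ratio is

\[
\eta \;=\; 10\,\log_{10}\!\left(\frac{\beta^{2}}{\sigma^{2}}\right),
\]

where β is the least-squares estimate of the slope of the ideal function and σ² is the error variance around the fitted line; a larger η indicates a parameter setting that is less sensitive to noise conditions.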
