An active learning approach for radial basis function neural networks

This paper presents a new Active Learning algorithm to train Radial Basis Function (RBF) Artificial Neural Networks (ANNs) for model reduction problems. The new approach is based on the assumption that the unobserved training data y, at input x, lies within a set F(x) that is known from experience or prior knowledge of the problem. The approach finds the location of the new sample such that the worst-case error between the output of the resulting RBF ANN and the bounds of the unknown data, as specified by F(x), is minimized. This paper illustrates the new approach for the case when .
It was found that the suggested approach can find a good location for the new data sample in certain cases, compared with samples obtained from existing methods. A comparative study is also included, indicating that the new experiment design approach is a good complement to existing ones such as cross-validation design and maximum-minimum distance design. Key words: Artificial neural networks, radial basis functions, model reduction, active learning, experiment design, metamodeling
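The abstract's idea can be illustrated with a small sketch: fit an interpolating RBF model to the observed samples, then greedily pick the next input where the gap between the model's output and the assumed bounds on the unknown response is largest. This is an illustrative greedy variant under assumed Gaussian basis functions and an assumed band-shaped set F(x), not the paper's exact algorithm; all function names and the toy bounds are hypothetical.

```python
import numpy as np

def rbf_fit(X, y, width=1.0):
    """Fit an interpolating Gaussian-RBF model with centres at the samples.
    Returns a callable that evaluates the model at new 1-D inputs."""
    d = X[:, None] - X[None, :]
    Phi = np.exp(-(d ** 2) / (2 * width ** 2))   # kernel (design) matrix
    w = np.linalg.solve(Phi, y)                  # exact interpolation weights
    return lambda x: np.exp(-((x[:, None] - X[None, :]) ** 2)
                            / (2 * width ** 2)) @ w

def next_sample(model, lower, upper, candidates):
    """Pick the candidate input whose worst-case error is largest, i.e. where
    the model output is furthest from the assumed bounds [lower, upper]."""
    pred = model(candidates)
    worst = np.maximum(np.abs(pred - lower(candidates)),
                       np.abs(pred - upper(candidates)))
    return candidates[np.argmax(worst)]

# Toy usage: unknown response sin(x), assumed to lie within sin(x) +/- 0.5.
X = np.array([0.0, 2.0, 4.0, 6.0])
model = rbf_fit(X, np.sin(X))
cands = np.linspace(0.0, 6.0, 61)
x_new = next_sample(model,
                    lambda x: np.sin(x) - 0.5,
                    lambda x: np.sin(x) + 0.5,
                    cands)
```

At the observed samples the interpolant matches the data exactly, so the worst-case error there collapses to the half-width of the band; the selected point is therefore driven toward regions where the current model could disagree most with any response consistent with F(x).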

[1] E. M. Freeman, et al. A comparison of two generalized response surface methods for optimisation in electromagnetics, 2001.

[2] M. E. Johnson, et al. Minimax and maximin distance designs, 1990.

[3] P. Laycock, et al. Optimum Experimental Designs, 1995.

[4] Jay D. Martin, et al. Use of adaptive metamodeling for design optimization, 2002.

[5] Tatsuo Itoh, et al. An unconditionally stable extended (USE) finite-element time-domain solution of active nonlinear microwave circuits using perfectly matched layers, 2001, IMS 2001.

[6] Kenneth W. Bauer, et al. Metamodelling techniques in multidimensional optimality analysis for linear programming, 1996.

[7] J. Kleijnen. Statistical tools for simulation practitioners, 1986.

[8] Leonard G. C. Hamey, et al. Minimisation of data collection by active learning, 1995, Proceedings of ICNN'95 - International Conference on Neural Networks.

[9] Anthony C. Atkinson, et al. Optimum Experimental Designs, 1992.

[10] Douglas C. Montgomery, et al. Response Surface Methodology: Process and Product Optimization Using Designed Experiments, 1995.

[11] Byoung-Tak Zhang, et al. Neural networks that teach themselves through genetic discovery of novel examples, 1991, [Proceedings] 1991 IEEE International Joint Conference on Neural Networks.

[12] David J. C. MacKay, et al. Information-Based Objective Functions for Active Data Selection, 1992, Neural Computation.

[13] David A. Cohn, et al. Neural Network Exploration Using Optimal Experiment Design, 1993, NIPS.

[14] Shahrum Shah bin Abdullah. Experiment design for deterministic model reduction and neural network training, 2003.

[15] Ruichen Jin, et al. On Sequential Sampling for Global Metamodeling in Engineering Design, 2002, DAC 2002.

[16] Ivica Kostanic, et al. Principles of Neurocomputing for Science and Engineering, 2000.

[17] M. H. Choueiki, et al. Training data development with the D-optimality criterion, 1999, IEEE Trans. Neural Networks.

[18] Heekuck Oh, et al. Neural Networks for Pattern Recognition, 1993, Adv. Comput.

[19] Kenji Fukumizu, et al. Statistical active learning in multilayer perceptrons, 2000, IEEE Trans. Neural Networks Learn. Syst.