Additional learning and forgetting by support vector machine and RBF networks

Radial basis function networks (RBFNs) have been widely applied to practical classification problems. In recent years, support vector machines (SVMs) have attracted researchers' interest as a promising method for classification. In this paper, we compare the two methods from the viewpoint of additional learning and forgetting. The authors have previously reported that additional learning and active forgetting in RBFNs give good classification performance in changing environments. First, a method for additional learning and forgetting in SVMs is proposed. Next, a comparative simulation of RBFNs and SVMs on a portfolio problem is presented.
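
The abstract does not describe the proposed SVM procedure in detail. As a rough, hedged illustration of what "additional learning and forgetting" could look like for an SVM, the sketch below retrains a scikit-learn `SVC` on the retained support vectors plus newly arrived samples, and "forgets" everything else after each round. The class name `ForgetfulSVM`, the RBF-kernel settings, and the retain-only-support-vectors rule are assumptions made for illustration, not the authors' method.

```python
import numpy as np
from sklearn.svm import SVC


class ForgetfulSVM:
    """Illustrative sketch (assumption, not the paper's algorithm):
    additional learning = retrain on retained samples + new samples;
    forgetting = keep only the current support vectors afterwards."""

    def __init__(self, C=1.0, gamma="scale"):
        self.svm = SVC(C=C, kernel="rbf", gamma=gamma)
        self.X = None  # samples currently retained in memory
        self.y = None

    def learn(self, X_new, y_new):
        # Additional learning: merge the new chunk with retained samples.
        if self.X is None:
            self.X, self.y = np.asarray(X_new), np.asarray(y_new)
        else:
            self.X = np.vstack([self.X, X_new])
            self.y = np.concatenate([self.y, y_new])
        self.svm.fit(self.X, self.y)
        # Forgetting: discard samples that no longer shape the boundary,
        # keeping only the support vectors for the next round.
        keep = self.svm.support_
        self.X, self.y = self.X[keep], self.y[keep]
        return self

    def predict(self, X):
        return self.svm.predict(X)


# Usage: feed data in chronological chunks, mimicking a changing environment
# such as the portfolio setting mentioned in the abstract (synthetic data here).
rng = np.random.default_rng(0)
model = ForgetfulSVM()
for _ in range(3):
    X_chunk = rng.normal(size=(50, 4))
    y_chunk = (X_chunk[:, 0] + X_chunk[:, 1] > 0).astype(int)
    model.learn(X_chunk, y_chunk)
print(model.predict(X_chunk[:5]))
```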
