Active Learning for Optimal Generalization in Trigonometric Polynomial Models

In this paper, we consider the problem of active learning and derive a necessary and sufficient condition on sample points for achieving the optimal generalization capability. By utilizing the properties of pseudo-orthogonal bases, we clarify the mechanism by which the optimal generalization capability is achieved. We also show that the condition not only provides the optimal generalization capability but also reduces the computational complexity and memory required for calculating the learning result functions. Based on this optimality condition, we give design methods for optimal sample points in trigonometric polynomial models. Finally, the effectiveness of the proposed active learning method is demonstrated through computer simulations.
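To make the setting concrete, the following is a minimal sketch, not the paper's exact algorithm: it fits an order-N trigonometric polynomial by least squares and compares an equally spaced sample-point design against a random one. Equispaced sampling makes the design-matrix columns orthogonal, which is the kind of property the optimal sample-point designs exploit and which also simplifies the least-squares computation. The model order, noise level, and number of samples below are illustrative assumptions.

```python
import numpy as np

def design_matrix(x, order):
    """Columns: 1, cos(x), sin(x), ..., cos(order*x), sin(order*x)."""
    cols = [np.ones_like(x)]
    for n in range(1, order + 1):
        cols.append(np.cos(n * x))
        cols.append(np.sin(n * x))
    return np.column_stack(cols)

rng = np.random.default_rng(0)
order = 3                                  # model order N (hypothetical choice)
true_coef = rng.normal(size=2 * order + 1)  # ground-truth coefficients
target = lambda x: design_matrix(x, order) @ true_coef

M = 2 * order + 1                          # minimum number of sample points
x_equi = np.linspace(0.0, 2.0 * np.pi, M, endpoint=False)  # equispaced design
x_rand = rng.uniform(0.0, 2.0 * np.pi, M)                   # random design
noise = 0.1

for name, x in [("equispaced", x_equi), ("random", x_rand)]:
    A = design_matrix(x, order)
    y = target(x) + noise * rng.normal(size=M)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    # Approximate the generalization error on a dense test grid.
    x_test = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
    err = np.mean((design_matrix(x_test, order) @ coef - target(x_test)) ** 2)
    print(f"{name:10s}  test MSE = {err:.4f}")
```

Running the sketch typically shows a lower test error for the equispaced design; in the equispaced case the normal equations also become (nearly) diagonal, which is one way to see the reduction in computational cost mentioned in the abstract.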
