Convergence and application of online active sampling using orthogonal pillar vectors

A convergence analysis and an application are presented for the Active Sampling-at-the-Boundary method applied to multidimensional spaces using orthogonal pillar vectors. Active learning facilitates identifying an optimal decision boundary for pattern classification in machine learning. The method is compared with a standard active learning method that uses random sampling on the decision-boundary hyperplane, both through simulation and through application to real-world data from the UCI benchmark repository. The boundary is modeled as a nonseparable linear decision hyperplane in multidimensional space with a stochastic oracle.
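To make the comparison concrete, the following is a minimal sketch of the two query strategies being contrasted: selecting the unlabeled point closest to the current decision hyperplane versus selecting a pool point uniformly at random. This is not the paper's pillar-vector method; it is a generic boundary-sampling loop with a plain gradient-descent logistic regression, and all data, parameters, and function names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic two-class data: two Gaussian clouds in 2-D.
n = 200
X = np.vstack([rng.normal(-2, 1, (n, 2)), rng.normal(2, 1, (n, 2))])
y = np.hstack([np.zeros(n), np.ones(n)])

# Shuffle, then split into a pool (with a small labeled seed) and a test set.
idx = rng.permutation(2 * n)
X, y = X[idx], y[idx]
X_pool, y_pool = X[:300], y[:300]
X_test, y_test = X[300:], y[300:]
seed_labeled = list(range(10))       # initially labeled pool indices
seed_pool = list(range(10, 300))     # indices still unlabeled

def fit_logreg(X, y, lr=0.1, epochs=200):
    """Plain gradient-descent logistic regression; returns (w, b)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def run(strategy, queries=20):
    lab, unl = seed_labeled.copy(), seed_pool.copy()
    for _ in range(queries):
        w, b = fit_logreg(X_pool[lab], y_pool[lab])
        if strategy == "boundary":
            # Query the pool point with the smallest distance-like margin
            # |w.x + b| to the current hyperplane.
            margins = np.abs(X_pool[unl] @ w + b)
            pick = unl[int(np.argmin(margins))]
        else:
            # Baseline: random sampling from the unlabeled pool.
            pick = unl[int(rng.integers(len(unl)))]
        unl.remove(pick)
        lab.append(pick)   # the oracle reveals the label of the queried point
    w, b = fit_logreg(X_pool[lab], y_pool[lab])
    preds = (X_test @ w + b) > 0
    return (preds == y_test).mean()

acc_boundary = run("boundary")
acc_random = run("random")
print(f"boundary sampling: {acc_boundary:.3f}, random sampling: {acc_random:.3f}")
```

On well-separated data both strategies converge to a good classifier; the interest of boundary sampling is that each query is spent where the current model is least certain, which is the intuition the convergence analysis formalizes for the noisy (stochastic-oracle) setting.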
