Machine Learning and Data Mining in Pattern Recognition

Pattern recognition problems often arise in which the influence of a hidden context leads to more or less radical changes in the target concept. This paper proposes a mathematical and algorithmic framework for handling such concept drift in pattern recognition. The probabilistic foundation rests on a Bayesian approach to estimating the parameters of the decision rule. The pattern recognition procedure derived from this approach exploits the general principle of dynamic programming and has linear computational complexity, in contrast to the polynomial complexity of the general pattern recognition procedure.
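
To illustrate how a dynamic-programming pass over a sequence of observations yields complexity linear in the sequence length, the sketch below runs a generic Viterbi-style recursion in which the class label acts as a simple stand-in for the drifting hidden context. All names, the Markov transition model, and the Gaussian observation model are assumptions introduced for illustration only; this is not the paper's actual Bayesian decision rule.

```python
# Minimal sketch (assumed model, not the paper's algorithm): MAP labelling of a
# sequence whose labels form a Markov chain, computed by dynamic programming.
import numpy as np

def map_label_sequence(observations, means, variances, transition, prior):
    """MAP label sequence via a Viterbi-style dynamic program.

    observations : (T,) array of scalar features x_1..x_T
    means, variances : (K,) per-class Gaussian parameters (assumed known here)
    transition : (K, K) matrix, transition[i, j] = P(y_t = j | y_{t-1} = i)
    prior : (K,) initial label distribution P(y_1)

    Cost is O(T * K^2): linear in the sequence length T, in contrast to
    enumerating all K**T label sequences.
    """
    T, K = len(observations), len(prior)
    # Log-likelihood of each observation under each class (Gaussian assumption).
    log_lik = (-0.5 * (observations[:, None] - means[None, :]) ** 2 / variances
               - 0.5 * np.log(2 * np.pi * variances))
    log_delta = np.log(prior) + log_lik[0]      # best log-score ending in each class
    backptr = np.zeros((T, K), dtype=int)

    for t in range(1, T):                       # one DP step per observation
        scores = log_delta[:, None] + np.log(transition)   # (K, K)
        backptr[t] = scores.argmax(axis=0)
        log_delta = scores.max(axis=0) + log_lik[t]

    # Backtrack the optimal label sequence.
    labels = np.empty(T, dtype=int)
    labels[-1] = int(log_delta.argmax())
    for t in range(T - 1, 0, -1):
        labels[t - 1] = backptr[t, labels[t]]
    return labels
```

The design point the sketch is meant to convey is only the complexity argument: because each step of the recursion depends on the previous step alone, the optimal decision over the whole sequence is found in one forward pass plus a backtrack, rather than by searching the exponentially many label sequences.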
