Optimal Combination of Feature Weight Learning and Classification Based on Local Approximation

Currently, most feature weight estimation methods are independent of the classification algorithm, and the combination of discriminant analysis and classifiers for effective pattern classification remains heuristic. The present study addresses the learning of feature weights by using a recently reported classification algorithm, K-Local Hyperplane Distance Nearest Neighbor (HKNN) [18], in which the data are modeled as embedded in a linear hyperplane. Motivated by the encouraging performance of Learning Discriminative Projections and Prototypes, the feature weights are estimated by minimizing the leave-one-out cross-validation error of the HKNN classifier, and an approximate explicit solution is obtained for the weight estimates. The feature weighting and the classification are therefore matched to each other. The performance of the combined model is extensively assessed via experiments on both synthetic and benchmark datasets, and the favorable results demonstrate that the method is competitive with several state-of-the-art models.
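
To make the construction concrete, the following is a minimal sketch, not the authors' implementation, of HKNN classification with a diagonal feature-weight vector, written in Python/NumPy. The names w (feature weights), K (neighborhood size), and lam (ridge regularizer on the hyperplane coefficients) are illustrative assumptions, not quantities fixed by the paper.

import numpy as np

def local_hyperplane_distance(x, neighbors, lam=1e-3):
    # Squared distance from x to the affine hull of `neighbors` (K x d rows),
    # with a small ridge penalty stabilizing the combination coefficients.
    mean = neighbors.mean(axis=0)
    V = (neighbors - mean).T                      # d x K basis spanning the local hyperplane
    G = V.T @ V + lam * np.eye(V.shape[1])        # regularized Gram matrix
    a = np.linalg.solve(G, V.T @ (x - mean))      # projection coefficients
    r = x - mean - V @ a                          # residual orthogonal to the hyperplane
    return float(r @ r)

def hknn_predict(x, X_train, y_train, w, K=5, lam=1e-3):
    # Classify x by the nearest class-wise local hyperplane, computed in the
    # feature-weighted space (each dimension scaled by the weight vector w).
    xw, Xw = x * w, X_train * w
    best_label, best_dist = None, np.inf
    for c in np.unique(y_train):
        Xc = Xw[y_train == c]
        idx = np.argsort(np.linalg.norm(Xc - xw, axis=1))[:K]  # K nearest same-class points
        d = local_hyperplane_distance(xw, Xc[idx], lam)
        if d < best_dist:
            best_label, best_dist = c, d
    return best_label

Under this reading, the feature weight learning described in the abstract amounts to choosing w so that the leave-one-out error of this decision rule on the training set is minimized.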

[1] Pascal Vincent, et al. K-Local Hyperplane and Convex Distance Nearest Neighbor Algorithms, 2001, NIPS.

[2] Jian Yang, et al. From classifiers to discriminators: A nearest neighbor rule induced discriminant analysis, 2011, Pattern Recognit.

[3] Vojislav Kecman, et al. Adaptive local hyperplane classification, 2008, Neurocomputing.

[4] Yijun Sun, et al. Iterative RELIEF for Feature Weighting: Algorithms, Theories, and Applications, 2007, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[5] Shuicheng Yan, et al. Graph Embedding and Extensions: A General Framework for Dimensionality Reduction, 2007.

[6] Hwann-Tzong Chen, et al. Local discriminant embedding and its variants, 2005, IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05).

[7] Sinisa Todorovic, et al. Local-Learning-Based Feature Selection for High-Dimensional Data Analysis, 2010, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[8] Kun Zhou, et al. Locality Sensitive Discriminant Analysis, 2007, IJCAI.

[9] Keinosuke Fukunaga, et al. Introduction to Statistical Pattern Recognition (2nd ed.), 1990.

[10] Roberto Paredes, et al. Simultaneous learning of a discriminative projection and prototypes for Nearest-Neighbor classification, 2008, IEEE Conference on Computer Vision and Pattern Recognition.

[11] Keinosuke Fukunaga, et al. Introduction to Statistical Pattern Recognition, 1972.

[12] G. Baudat, et al. Generalized Discriminant Analysis Using a Kernel Approach, 2000, Neural Computation.

[13] Yuxiao Hu, et al. Face recognition using Laplacianfaces, 2005, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[14] Roberto Paredes, et al. Dimensionality reduction by minimizing nearest-neighbor classification error, 2011, Pattern Recognit. Lett.

[15] Dapeng Wu, et al. A RELIEF Based Feature Extraction Algorithm, 2008, SDM.

[16] Bernhard Schölkopf, et al. A Local Learning Approach for Clustering, 2006, NIPS.

[17] Yiu-ming Cheung, et al. Feature Selection and Kernel Learning for Local Learning-Based Clustering, 2011, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[18] Masashi Sugiyama, et al. Dimensionality Reduction of Multimodal Labeled Data by Local Fisher Discriminant Analysis, 2007, J. Mach. Learn. Res.

[19] Shinichi Nakajima, et al. Semi-supervised local Fisher discriminant analysis for dimensionality reduction, 2009, Machine Learning.

[20] Jiawei Han, et al. Semi-supervised Discriminant Analysis, 2007, IEEE 11th International Conference on Computer Vision.

[21] B. Schölkopf, et al. Fisher discriminant analysis with kernels, 1999, Neural Networks for Signal Processing IX: Proceedings of the 1999 IEEE Signal Processing Society Workshop.

[22] Josef Kittler, et al. Locally linear discriminant analysis for multimodally distributed classes for face recognition with a single model image, 2005, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[23] Jian Yang, et al. KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition, 2005, IEEE Transactions on Pattern Analysis and Machine Intelligence.