Automatic Adjustment of Discriminant Adaptive Nearest Neighbor

K-nearest neighbors relies on the definition of a global metric. In contrast, discriminant adaptive nearest neighbor (DANN) computes a different metric at each query point based on a local linear discriminant analysis. In this paper, we propose a technique to automatically adjust the hyper-parameters of DANN by optimizing two quality criteria. The first measures the quality of discrimination, while the second maximizes local class homogeneity. We use a Bayesian formulation to prevent over-fitting.
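
For context, the sketch below illustrates the standard DANN local-metric computation whose hyper-parameters (the size of the metric-estimation neighborhood, the softening parameter epsilon, and the number of classification neighbors k) the paper proposes to adjust automatically. It is a minimal NumPy illustration of the basic scheme, without the automatic adjustment or the original kernel weighting; function names and default values are illustrative, not the paper's.

```python
import numpy as np

def dann_metric(X, y, x0, n_neighbors=50, epsilon=1.0):
    """Estimate a local DANN metric at query point x0.

    n_neighbors and epsilon are two of the hyper-parameters the paper
    seeks to tune automatically (defaults here are illustrative only).
    """
    # Euclidean neighborhood around the query point
    d = np.linalg.norm(X - x0, axis=1)
    idx = np.argsort(d)[:n_neighbors]
    Xn, yn = X[idx], y[idx]

    # Local within-class (W) and between-class (B) covariance matrices
    mu = Xn.mean(axis=0)
    p = X.shape[1]
    W = np.zeros((p, p))
    B = np.zeros((p, p))
    for c in np.unique(yn):
        Xc = Xn[yn == c]
        mc = Xc.mean(axis=0)
        W += (Xc - mc).T @ (Xc - mc)
        B += len(Xc) * np.outer(mc - mu, mc - mu)
    W /= len(Xn)
    B /= len(Xn)

    # Sigma = W^{-1/2} (W^{-1/2} B W^{-1/2} + epsilon * I) W^{-1/2}
    evals, evecs = np.linalg.eigh(W)
    W_inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    B_star = W_inv_sqrt @ B @ W_inv_sqrt
    return W_inv_sqrt @ (B_star + epsilon * np.eye(p)) @ W_inv_sqrt

def dann_predict(X, y, x0, k=5, **metric_kwargs):
    """Classify x0 by majority vote among the k nearest neighbors
    under the locally estimated metric."""
    Sigma = dann_metric(X, y, x0, **metric_kwargs)
    diff = X - x0
    d2 = np.einsum('ij,jk,ik->i', diff, Sigma, diff)  # (x - x0)^T Sigma (x - x0)
    nearest = np.argsort(d2)[:k]
    vals, counts = np.unique(y[nearest], return_counts=True)
    return vals[np.argmax(counts)]
```

In this simplified form, the discrimination and homogeneity criteria mentioned in the abstract would be evaluated as functions of n_neighbors, epsilon, and k, with the Bayesian formulation regularizing their choice.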
