Iterative Nearest Neighbors for classification and dimensionality reduction

Representing data in terms of a set of selected samples is of interest for various machine learning applications, e.g., dimensionality reduction and classification. Probably the best-known technique is still k-Nearest Neighbors (kNN) and its variants. Recently, richer representations have become popular. Examples are methods based on l1-regularized least squares (Sparse Representation, SR), on l2-regularized least squares (Collaborative Representation, CR), or on l1-constrained least squares (Locally Linear Embedding, LLE). We propose Iterative Nearest Neighbors (INN), a novel sparse representation that combines the power of SR and LLE with the computational simplicity of kNN. We evaluate our method on dimensionality reduction and classification, using standard benchmarks such as faces (AR), traffic signs (GTSRB), and PASCAL VOC 2007. INN performs better than NN and comparably to CR and SR, while being orders of magnitude faster than the latter two.
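To make the idea of a sparse, neighbor-driven representation concrete, the following is a minimal illustrative sketch: a plain matching-pursuit-style coder that builds a sparse code for a query by repeatedly selecting the training sample (atom) nearest, i.e. most correlated with, the current residual, then classifies by class-wise reconstruction error. This is NOT the exact INN update rule defined in the paper; the atom-selection rule, the fixed iteration count `n_iter`, and the classification-by-reconstruction step are assumptions made for the sketch.

```python
import numpy as np

def nn_matching_pursuit(q, X, n_iter=5):
    """Greedy residual coding of a query q over the rows of X.

    Hedged sketch only: standard matching pursuit with the training
    samples as atoms, used to illustrate building a sparse code from
    repeated nearest-neighbor selections. It is not the INN scheme
    itself, which the paper defines.
    """
    A = X / np.linalg.norm(X, axis=1, keepdims=True)  # unit-norm atoms
    w = np.zeros(len(X))
    r = q.astype(float).copy()
    for _ in range(n_iter):
        corr = A @ r                      # for unit-norm atoms, the most
        j = int(np.argmax(np.abs(corr)))  # correlated atom is the nearest
        w[j] += corr[j]
        r -= corr[j] * A[j]               # remove its contribution
    return w

def classify(q, X, y, n_iter=5):
    """Assign q to the class whose weighted atoms best reconstruct it."""
    A = X / np.linalg.norm(X, axis=1, keepdims=True)
    w = nn_matching_pursuit(q, X, n_iter)
    errs = {c: np.linalg.norm(q - (w * (y == c)) @ A) for c in set(y)}
    return min(errs, key=errs.get)

# Toy usage: two well-separated classes in 3-D.
rng = np.random.default_rng(0)
X = np.vstack([np.array([1.0, 0.0, 0.0]) + rng.normal(0, 0.05, (5, 3)),
               np.array([0.0, 1.0, 0.0]) + rng.normal(0, 0.05, (5, 3))])
y = np.array([0] * 5 + [1] * 5)
pred = classify(np.array([0.02, 0.98, 0.01]), X, y)  # query near class 1
```

Like kNN, each iteration is a single nearest-neighbor lookup, which is where the computational simplicity comes from; unlike kNN, the output is a weighted sparse code over the samples rather than a bare neighbor set.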
