Laplacian Affinity Propagation for Semi-Supervised Object Classification

We address semi-supervised multi-class object classification with a graph-based learning algorithm called Laplacian affinity propagation (LAP). The idea is to model both labeled and unlabeled data by constructing a local neighborhood affinity graph and encoding smoothness through its Laplacian matrix, in the spirit of graph mincuts and harmonic energy minimization. The unknown labels of the unlabeled data are inferred by an optimized graph embedding procedure constrained by the labeled data. This label-to-unlabeled propagation scheme admits a closed-form solution within a learning framework that is flexible enough to accommodate new designs. LAP integrates the embedding and the classifier, yielding labels that are smooth with respect to the manifold structure underlying the training data. Object classification experiments on the COIL database demonstrate the effectiveness and applicability of the algorithm.
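For concreteness, the sketch below illustrates the kind of closed-form, Laplacian-based label propagation the abstract describes, in the harmonic-function style of Gaussian fields. It is a minimal illustrative baseline, not the authors' exact LAP construction: the function name, the Gaussian affinity with bandwidth sigma, and the dense-matrix formulation are all assumptions made for the example.

import numpy as np

def harmonic_label_propagation(X, y_labeled, labeled_idx, n_classes, sigma=1.0):
    # Illustrative sketch (assumed names/parameters), not the paper's exact LAP algorithm.
    n = X.shape[0]

    # Gaussian (RBF) affinity graph over all points, zero self-affinity
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Combinatorial graph Laplacian L = D - W
    D = np.diag(W.sum(axis=1))
    L = D - W

    unlabeled_idx = np.setdiff1d(np.arange(n), labeled_idx)

    # One-hot class indicator matrix for the labeled points
    F_l = np.zeros((len(labeled_idx), n_classes))
    F_l[np.arange(len(labeled_idx)), y_labeled] = 1.0

    # Closed-form harmonic solution: F_u = -L_uu^{-1} L_ul F_l
    L_uu = L[np.ix_(unlabeled_idx, unlabeled_idx)]
    L_ul = L[np.ix_(unlabeled_idx, labeled_idx)]
    F_u = np.linalg.solve(L_uu, -L_ul @ F_l)

    # Each unlabeled point takes the class with the largest propagated score
    return unlabeled_idx, F_u.argmax(axis=1)

The closed form arises because fixing the labeled values and minimizing the quadratic smoothness energy over the unlabeled values reduces to solving a linear system in L_uu, which is what the abstract refers to as label-to-unlabeled propagation with a closed-form solution.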
