Semi-Supervised Regression using Spectral Techniques

Graph-based approaches to semi-supervised learning have received increasing interest in recent years. Despite their good performance, many purely graph-based algorithms do not produce explicit decision functions and cannot predict the labels of unseen data. Graph regularization is a recently proposed framework that incorporates the intrinsic geometrical structure of the data as a regularization term; when unlabeled samples are available, it can be applied as semi-supervised learning. However, our theoretical analysis shows that this approach may not be optimal for multi-class problems. In this paper, we propose a novel method called Spectral Regression (SR). Using spectral techniques, we first compute a set of responses for each sample that respects both the label information and the geometrical structure. Once the responses are obtained, ordinary ridge regression can be applied to find the regression functions. The proposed algorithm is particularly designed for multi-class problems. Experimental results on two real-world classification problems arising in visual and speech recognition demonstrate the effectiveness of our algorithm.
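The two-stage procedure outlined above (spectral responses over a data graph, followed by ridge regression) can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the k-nearest-neighbour affinity graph, the `gamma` weight used to inject label information into the graph, and all parameter names are our own assumptions.

```python
import numpy as np

def spectral_regression(X, labels, k=5, n_responses=2, ridge=0.01, gamma=1.0):
    """Hedged sketch of a Spectral Regression-style procedure.

    X        : (n, d) array of all samples, labeled and unlabeled.
    labels   : length-n integer array; -1 marks unlabeled samples.
    Returns  : (d, n_responses) matrix A; a new point x maps to x @ A.
    """
    n = X.shape[0]
    # Step 1: k-NN affinity graph over all samples (geometrical structure).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(d2[i])[1:k + 1]   # skip self at position 0
        W[i, nn] = 1.0
    W = np.maximum(W, W.T)                # symmetrise the graph
    # Step 2: strengthen edges between same-class labeled pairs
    # (one simple way to make the responses respect label information).
    for i in range(n):
        for j in range(n):
            if i != j and labels[i] >= 0 and labels[i] == labels[j]:
                W[i, j] = gamma
    # Step 3: responses = top non-trivial eigenvectors of the
    # symmetrically normalised affinity matrix D^{-1/2} W D^{-1/2}.
    D = W.sum(axis=1)
    Dm = 1.0 / np.sqrt(np.maximum(D, 1e-12))
    S = Dm[:, None] * W * Dm[None, :]
    vals, vecs = np.linalg.eigh(S)        # eigenvalues in ascending order
    Y = vecs[:, -(n_responses + 1):-1]    # drop the trivial top eigenvector
    # Step 4: ordinary ridge regression from features to responses.
    A = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)
    return A
```

Because the output is an explicit linear map, unseen samples can be projected directly (the out-of-sample prediction that purely graph-based methods lack), and the eigen-decomposition in Step 3 could be replaced by an iterative sparse solver for large graphs.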
