Semi-Supervised Learning Using Random Walk Limiting Probabilities

The semi-supervised learning paradigm allows a large amount of unlabeled data to be classified using only a few labeled instances. To compensate for the scarce a priori label knowledge, the information carried by the unlabeled data is also exploited in the classification process. This paper describes a semi-supervised technique that uses random walk limiting probabilities to propagate label information. Each label is propagated through a network of unlabeled instances via a biased random walk, and the probability of a vertex receiving a label is expressed in terms of the limiting probabilities of the walk process. Simulations show that the proposed technique is competitive with benchmark techniques.
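
As a rough illustration of this idea, the Python sketch below (not the authors' exact formulation) builds an RBF affinity graph over all instances, biases the walk's transition matrix toward each class's labeled vertices, approximates the limiting (stationary) probabilities by power iteration, and assigns every unlabeled vertex to the class whose biased walk gives it the highest limiting probability. The affinity function, the bias factor, and the arg-max assignment rule are assumptions made for illustration only.

    import numpy as np

    def rbf_affinity(X, sigma=1.0):
        # Dense RBF (Gaussian) affinity matrix over the instances (rows of X).
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        W = np.exp(-d2 / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)  # no self-loops
        return W

    def limiting_probabilities(P, n_iter=1000, tol=1e-10):
        # Limiting (stationary) distribution of the row-stochastic matrix P,
        # approximated by power iteration from the uniform distribution.
        pi = np.full(P.shape[0], 1.0 / P.shape[0])
        for _ in range(n_iter):
            nxt = pi @ P
            if np.abs(nxt - pi).sum() < tol:
                break
            pi = nxt
        return pi

    def classify(X, y, sigma=1.0, bias=5.0):
        # y holds a class index for each labeled instance and -1 for unlabeled ones.
        # For each class, the walk is biased toward that class's labeled vertices
        # (assumed bias: edges into the seeds are scaled by `bias`); each vertex
        # is scored by its limiting probability and assigned the arg-max class.
        W = rbf_affinity(X, sigma)
        classes = np.unique(y[y >= 0])
        scores = np.zeros((len(X), len(classes)))
        for j, c in enumerate(classes):
            Wc = W.copy()
            Wc[:, y == c] *= bias
            P = Wc / Wc.sum(axis=1, keepdims=True)
            scores[:, j] = limiting_probabilities(P)
        pred = y.copy()
        pred[y < 0] = classes[np.argmax(scores[y < 0], axis=1)]
        return pred

For example, calling classify(X, y) with y holding class indices for the few labeled points and -1 elsewhere returns a completed label vector for all instances.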
