Leave-one-out manifold regularization

Manifold regularization (MR) based semi-supervised learning can exploit structural relationships in both labeled and unlabeled data. However, model selection strongly affects the predictive performance of MR because of the additional geometry regularizer built from the labeled and unlabeled data. In this paper, two continuous and two inherently discrete hyperparameters are selected as optimization variables, and a leave-one-out cross-validation (LOOCV) based Predicted REsidual Sum of Squares (PRESS) criterion is presented, for the first time, for the model selection of MR, choosing appropriate regularization coefficients and kernel parameters. Because two of the hyperparameters are inherently discrete, the minimization is carried out with an improved Nelder-Mead simplex algorithm that handles the mixed set of discrete and continuous variables. The manifold regularization and model selection algorithm are applied to six synthetic and real-life benchmark datasets. By effectively exploiting the intrinsic geometric manifolds embedded in the data together with the unbiased LOOCV estimate, the proposed approach outperforms both the original MR and purely supervised learning approaches in the empirical study.
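
To make the objects of the model selection concrete, the following is a minimal sketch, assuming the usual Laplacian-regularized least-squares instantiation of MR with squared loss; the identification of the ambient coefficient \(\gamma_A\) and the intrinsic coefficient \(\gamma_I\) as two of the tuned hyperparameters is an illustrative assumption, not a detail stated in the abstract.

```latex
% Laplacian-regularized least squares (LapRLS): the standard MR objective.
% l labeled and u unlabeled points; L is the graph Laplacian built on all
% of them; \gamma_A and \gamma_I weight the ambient and intrinsic terms.
f^{*} = \arg\min_{f \in \mathcal{H}_K}
        \frac{1}{l}\sum_{i=1}^{l} \bigl(y_i - f(x_i)\bigr)^{2}
        + \gamma_A \lVert f \rVert_{K}^{2}
        + \frac{\gamma_I}{(l+u)^{2}}\, \mathbf{f}^{\top} L\, \mathbf{f}

% LOOCV-based PRESS criterion. For an estimator linear in the labels,
% each leave-one-out residual is available in closed form from the hat
% matrix H, so the criterion is computed without refitting l models.
\mathrm{PRESS} = \sum_{i=1}^{l} \bigl(y_i - \hat{y}_{(-i)}\bigr)^{2},
\qquad
y_i - \hat{y}_{(-i)} = \frac{y_i - \hat{y}_i}{1 - H_{ii}}
```

The closed-form residual is what makes PRESS attractive as a model-selection objective: its cost is dominated by a single matrix factorization per candidate hyperparameter setting rather than l separate refits.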

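As a rough illustration of how such a criterion can be minimized, the sketch below feeds a LOOCV-PRESS objective to SciPy's stock Nelder-Mead. It does not reproduce the paper's improved simplex for mixed discrete/continuous variables; rounding the discrete coordinate inside the objective is a common stand-in, and taking the graph neighbor count k as the (single, here) discrete hyperparameter is a hypothetical choice made for illustration.

```python
# Hypothetical sketch: tuning LapRLS hyperparameters by minimizing a
# LOOCV-PRESS surface with Nelder-Mead. Not the paper's improved simplex;
# the discrete coordinate is simply rounded inside the objective.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def press(theta, X, y, X_unlab):
    """PRESS for LapRLS. theta = (log gamma_A, log gamma_I,
    log kernel width, log2 neighbor count); the last entry is discrete."""
    ga, gi, sigma = np.exp(theta[:3])
    k = max(1, int(round(2.0 ** theta[3])))   # discrete: graph neighbors
    Xall = np.vstack([X, X_unlab])
    n, l = len(Xall), len(X)

    # Gaussian kernel on all points and an unweighted k-NN graph Laplacian.
    D2 = cdist(Xall, Xall, "sqeuclidean")
    K = np.exp(-D2 / (2.0 * sigma ** 2))
    W = np.zeros((n, n))
    nn = np.argsort(D2, axis=1)[:, 1:k + 1]   # skip self at column 0
    for i, js in enumerate(nn):
        W[i, js] = W[js, i] = 1.0
    L = np.diag(W.sum(axis=1)) - W

    # Closed-form LapRLS coefficients: A @ alpha = padded label vector.
    J = np.zeros((n, n)); J[:l, :l] = np.eye(l)
    A = J @ K + l * ga * np.eye(n) + (l * gi / n ** 2) * (L @ K)
    y_pad = np.concatenate([y, np.zeros(n - l)])
    alpha = np.linalg.solve(A, y_pad)

    # Hat matrix on the labeled block gives closed-form LOO residuals.
    H = (K @ np.linalg.inv(A))[:l, :l]
    r = y - (K @ alpha)[:l]
    return float(np.sum((r / (1.0 - np.diag(H))) ** 2))

# Usage: res = minimize(press, x0=np.zeros(4), args=(X, y, X_unlab),
#                       method="Nelder-Mead")
```

Log-scaled coordinates keep the simplex steps multiplicative in the original hyperparameters, which is the usual practice when searching over regularization coefficients and kernel widths.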