Locally Weighted Regression
1 NN in a subspace

A common pre-processing step is to project the data into a lower-dimensional subspace before applying the k-NN estimator. One example is the Eigenfaces algorithm for face recognition. PCA is applied to a database of face images (aligned, of fixed dimension) to obtain a principal subspace of much lower dimensionality than the original space, whose dimension is the number of pixels in an image. For some fixed $m$, this means taking the $m$ eigenvectors $U = [u_1, \ldots, u_m]$ of $XX^\top$ with the largest eigenvalues. Each face image $x_i$ is then represented by the vector of coefficients obtained by projecting it onto the principal directions, $\tilde{x}_i = U^\top x_i$. Given a test image $x_0$, its coefficient vector $\tilde{x}_0 = U^\top x_0$ is computed and the image is classified by k-NN in this new $m$-dimensional representation.

2 Parametric vs nonparametric regression methods

We now turn to the regression problem. We assume the observations are generated by a process $y_i = f(x_i) + \varepsilon$, where the noise $\varepsilon$ is independent of the data and has zero mean and variance $\sigma^2$. Two "global" approaches to regression:

Parametric: assume a parametric form $y = f(x; \theta)$, fit the parameters to the training set,
$\theta^* = \arg\min_\theta \sum_{i=1}^n L(y_i, f(x_i; \theta))$
(for instance, by the least-squares procedure when $L$ is the squared loss), and then estimate $\hat{y}_0 = f(x_0; \theta^*)$.
• Pros: Once trained, cheap to apply to any new data point. For many forms of $f$, a closed-form solution for $\theta^*$ exists.
• Cons: A fairly strong assumption about the parametric form of $f$.
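A minimal sketch of the Eigenfaces-style pre-processing described in section 1, in Python with NumPy only. The random arrays standing in for aligned face images, the image size, and the helper names pca_subspace and knn_classify are illustrative assumptions rather than part of the original notes; the sketch also centers the images before the eigendecomposition, a detail the notes above do not spell out.

import numpy as np

def pca_subspace(X, m):
    """Return the m leading eigenvectors of X X^T as columns of U (d x m), plus the mean image."""
    # Center the images so the principal directions capture variance rather than the mean face
    # (an assumption; the notes write the eigendecomposition of X X^T directly).
    mean = X.mean(axis=1, keepdims=True)
    Xc = X - mean
    # Eigendecomposition of the d x d matrix X X^T; for d >> n one would
    # instead diagonalize the smaller n x n Gram matrix X^T X.
    eigvals, eigvecs = np.linalg.eigh(Xc @ Xc.T)
    order = np.argsort(eigvals)[::-1][:m]      # largest eigenvalues first
    return eigvecs[:, order], mean

def knn_classify(Z_train, y_train, z0, k=1):
    """Majority vote among the k nearest training coefficient vectors."""
    dists = np.linalg.norm(Z_train - z0[:, None], axis=0)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy usage with random data standing in for vectorized face images.
rng = np.random.default_rng(0)
d, n, m = 400, 50, 10                          # 20x20 "images", 50 of them, 10 eigenfaces
X = rng.normal(size=(d, n))
y = rng.integers(0, 5, size=n)                 # 5 identities
U, mean = pca_subspace(X, m)
Z = U.T @ (X - mean)                           # training coefficients: U^T x_i
x0 = rng.normal(size=(d, 1))                   # test image
z0 = (U.T @ (x0 - mean)).ravel()               # its coefficient vector U^T x_0
print(knn_classify(Z, y, z0, k=3))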
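For the parametric route in section 2, a sketch under the assumption that $f(x; \theta)$ is linear in a fixed polynomial feature map, so that the squared-loss minimizer $\theta^*$ has a closed form via least squares. The feature map, the degree, and the toy data are illustrative choices, not taken from the notes.

import numpy as np

def poly_features(x, degree):
    """phi(x) = [1, x, x^2, ..., x^degree] for 1-D inputs x."""
    return np.vander(x, N=degree + 1, increasing=True)

def fit_least_squares(x, y, degree):
    """theta* = argmin_theta sum_i (y_i - phi(x_i)^T theta)^2."""
    Phi = poly_features(x, degree)
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return theta

def predict(x0, theta, degree):
    """y_hat_0 = f(x_0; theta*)."""
    return poly_features(np.atleast_1d(x0), degree) @ theta

# Toy data generated as y_i = f(x_i) + eps with zero-mean noise.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 30)
y = np.sin(np.pi * x) + rng.normal(scale=0.1, size=x.shape)

theta_star = fit_least_squares(x, y, degree=3)
print(predict(0.5, theta_star, degree=3))      # cheap to evaluate at any new x_0 once trained

Once $\theta^*$ is found, prediction costs only a feature-map evaluation and a dot product, which illustrates the "cheap to apply on new data points" advantage; the price is the strong assumption that the chosen parametric family contains a good approximation to $f$.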