k-NN Regression Adapts to Local Intrinsic Dimension
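This page carries only the paper's title and its reference list, so for orientation here is a minimal sketch of the plain k-NN regression estimate the title refers to: the prediction at a query point is the average response over its k nearest training points. The function name `knn_regress`, the Euclidean metric, and the toy data are illustrative assumptions, not the paper's construction; the paper's contribution is an analysis of how this estimator's error, with k chosen suitably, scales with the local intrinsic dimension of the data rather than the ambient dimension, and that analysis is not reproduced here.

```python
import numpy as np

def knn_regress(X_train, y_train, x_query, k):
    """Vanilla k-NN regression estimate at x_query: the mean of y over the
    k nearest training points (Euclidean distance assumed for illustration)."""
    dists = np.linalg.norm(X_train - x_query, axis=1)  # distance to every training point
    nn_idx = np.argpartition(dists, k - 1)[:k]         # indices of the k closest points
    return y_train[nn_idx].mean()

# Toy example: data lying on a one-dimensional curve embedded in R^3,
# so the intrinsic dimension (1) is much smaller than the ambient dimension (3).
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 1.0, size=500)
X = np.column_stack([t, np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
y = t**2 + 0.05 * rng.standard_normal(500)
print(knn_regress(X, y, X[0], k=10))  # estimate of the regression function near X[0]
```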
[1] Vladimir Vapnik and Alexey Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities, 1971.
[2] Kenneth L. Clarkson. Nearest-Neighbor Searching and Metric Space Dimensions, in Nearest-Neighbor Methods in Learning and Vision (T. Darrell et al., eds.), 2006.
[3] Adam Krzyzak et al. A Distribution-Free Theory of Nonparametric Regression, 2002, Springer Series in Statistics.
[4] Samory Kpotufe et al. Fast, smooth and adaptive regression in metric spaces, 2009, NIPS.
[6] C. J. Stone. Optimal Rates of Convergence for Nonparametric Estimators, 1980.
[7] R. Cao-Abad et al. Rate of Convergence for the Wild Bootstrap in Nonparametric Regression, 1991.
[8] E. Saksman et al. Every complete doubling metric space carries a doubling measure, 1998.
[9] Samory Kpotufe et al. Escaping the Curse of Dimensionality with a Tree-based Regressor, 2009, COLT.
[10] H. Federer. Geometric Measure Theory, 1969.
[11] C. J. Stone. Optimal Global Rates of Convergence for Nonparametric Estimators, 1982.
[12] J. Staniswalis. Local Bandwidth Selection for Kernel Estimates, 1989.
[13] P. Bickel et al. Local polynomial regression on unknown manifolds, 2007, arXiv:0708.0983.
[14] C. D. Cutler. A Review of the Theory and Estimation of Fractal Dimension, 1993.
[15] I. Holopainen. Riemannian Geometry.
[16] Sanjeev R. Kulkarni et al. Rates of convergence of nearest neighbor estimation under arbitrary sampling, 1995, IEEE Trans. Inf. Theory.
[17] Mikhail Belkin et al. Laplacian Eigenmaps for Dimensionality Reduction and Data Representation, 2003, Neural Computation.
[18] S. T. Roweis et al. Nonlinear dimensionality reduction by locally linear embedding, 2000, Science.
[19] J. Tenenbaum et al. A global geometric framework for nonlinear dimensionality reduction, 2000, Science.