Bias of Nearest Neighbor Error Estimates

The bias of the finite-sample nearest neighbor (NN) error from its asymptotic value is examined. Expressions are obtained which relate the bias of the NN and 2-NN errors to sample size, dimensionality, metric, and distributions. These expressions isolate the effect of sample size from that of the distributions, giving an explicit relation showing how the bias changes as the sample size is increased. Experimental results are given which suggest that the expressions accurately predict the bias. It is shown that when the dimensionality of the data is high, it may not be possible to estimate the asymptotic error simply by increasing the sample size. A new procedure is suggested to alleviate this problem. This procedure involves measuring the mean NN errors at several sample sizes and using our derived relationship between the bias and the sample size to extrapolate an estimate of the asymptotic NN error. The results are extended to the multiclass problem. The choice of an optimal metric to minimize the bias is also discussed.
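The extrapolation procedure described above can be sketched as follows. This is a minimal illustration, not the paper's exact derivation: it assumes the finite-sample NN error decays toward its asymptotic value roughly as e(N) ≈ e_inf + c·N^(−2/d), a bias/sample-size form of the kind the paper derives (the exponent used here is an illustrative assumption), and fits e_inf and c by linear least squares on errors measured at several sample sizes. The function name `extrapolate_nn_error` is hypothetical.

```python
import numpy as np

def extrapolate_nn_error(sample_sizes, measured_errors, d):
    """Estimate the asymptotic NN error from finite-sample measurements.

    Assumes the model e(N) = e_inf + c * N**(-2/d) (illustrative form).

    sample_sizes    -- training-set sizes N at which the error was measured
    measured_errors -- mean NN error observed at each N
    d               -- dimensionality of the data
    """
    N = np.asarray(sample_sizes, dtype=float)
    e = np.asarray(measured_errors, dtype=float)
    # Linear design matrix for e(N) = e_inf * 1 + c * N**(-2/d)
    A = np.column_stack([np.ones_like(N), N ** (-2.0 / d)])
    (e_inf, c), *_ = np.linalg.lstsq(A, e, rcond=None)
    return e_inf, c

if __name__ == "__main__":
    # Synthetic check: errors generated from the assumed model are
    # extrapolated back to the true asymptotic value.
    N = np.array([100, 200, 400, 800, 1600])
    true_e_inf, true_c, d = 0.10, 0.5, 8
    errs = true_e_inf + true_c * N ** (-2.0 / d)
    e_inf, c = extrapolate_nn_error(N, errs, d)
    print(round(e_inf, 4))
```

In practice the measured errors would come from averaging NN classification error over repeated random training sets at each size N, and the fitted intercept e_inf serves as the estimate of the asymptotic error that, as the abstract notes, may be unreachable by simply increasing N in high dimensions.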
