Estimation of Kullback–Leibler Divergence by Local Likelihood
[1] S. Kullback, R. A. Leibler. On Information and Sufficiency, 1951.
[2] H. Akaike, et al. Information Theory and an Extension of the Maximum Likelihood Principle, 1973.
[3] H. Akaike. A new look at the statistical model identification, 1974.
[4] M. Stone. An Asymptotic Equivalence of Choice of Model by Cross-Validation and Akaike's Criterion, 1977.
[5] B. Efron, et al. The Jackknife, the Bootstrap and Other Resampling Plans, 1983.
[6] A. Azzalini. A class of distributions which includes the normal ones, 1985.
[7] B. Efron. The jackknife, the bootstrap, and other resampling plans, 1987.
[8] Yoshua Bengio, et al. Pattern Recognition and Neural Networks, 1995.
[9] J. B. Copas, et al. Local Likelihood Based on Kernel Censoring, 1995.
[10] M. C. Jones, et al. Locally parametric nonparametric density estimation, 1996.
[11] G. Kitagawa, et al. Generalised information criteria in model selection, 1996.
[12] Shinto Eguchi, et al. A class of local likelihood methods and near-parametric asymptotics, 1998.
[13] On local likelihood density estimation, 2002.
[14] Shinto Eguchi, et al. Local likelihood method: a bridge over parametric and nonparametric regression, 2003.
[15] Tae Yoon Kim, et al. On local likelihood density estimation when the bandwidth is large, 2006.