Nonparametric Estimation of Conditional Information and Divergences
[1] Alfred O. Hero, et al. Asymptotic theory of greedy approximations to minimal k-point random graphs, 1999, IEEE Trans. Inf. Theory.
[2] R. Matthews. Storks Deliver Babies (p = 0.008), 2000.
[3] Barnabás Póczos, et al. Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs, 2010, NIPS.
[4] William Feller, et al. An Introduction to Probability Theory and Its Applications, 1967.
[5] Martin J. Wainwright, et al. Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization, 2008, IEEE Transactions on Information Theory.
[6] Barnabás Póczos, et al. REGO: Rank-based Estimation of Rényi Information Using Euclidean Graph Optimization, 2010, AISTATS.
[7] C. Quesenberry, et al. A nonparametric estimate of a multivariate density function, 1965.
[8] M. N. Goria, et al. A new class of random vector entropy estimators and its applications in testing statistical hypotheses, 2005.
[9] D. W. Scott, et al. Multivariate Density Estimation: Theory, Practice and Visualization, 1992.
[10] Margaret J. Robertson, et al. Design and Analysis of Experiments, 2006, Handbook of Statistics.
[11] Martin J. Wainwright, et al. On surrogate loss functions and f-divergences, 2005, math/0510521.
[12] Alexander J. Smola, et al. The kernel mutual information, 2003, IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03).
[13] Sanjeev R. Kulkarni, et al. Universal Estimation of Information Measures for Analog Sources, 2009, Found. Trends Commun. Inf. Theory.
[14] J. Rombouts, et al. Nonparametric Copula-Based Test for Conditional Independence with Applications to Granger Causality, 2012.
[15] H. White, et al. A Nonparametric Hellinger Metric Test for Conditional Independence, 2008, Econometric Theory.
[16] D. Edwards. Introduction to Graphical Modelling, 1995.
[17] J. Pearl. Why there is no statistical test for confounding, why many think there is, and why they are almost right, 1998.
[18] Bernhard Schölkopf, et al. Kernel Measures of Conditional Dependence, 2007, NIPS.
[19] Alfred O. Hero, et al. Applications of entropic spanning graphs, 2002, IEEE Signal Process. Mag.
[20] B. Schweizer, et al. On Nonparametric Measures of Dependence for Random Variables, 1981.
[21] Qing Wang, et al. Divergence Estimation for Multidimensional Densities via k-Nearest-Neighbor Distances, 2009, IEEE Transactions on Information Theory.
[22] Maria L. Rizzo, et al. Measuring and testing dependence by correlation of distances, 2007, arXiv:0803.4101.
[23] P. A. P. Moran, et al. An Introduction to Probability Theory, 1968.
[24] A. Hero, et al. Empirical estimation of entropy functionals with confidence, 2010, arXiv:1012.4188.
[25] Nir Friedman, et al. Probabilistic Graphical Models: Principles and Techniques, 2009.
[26] J. Steele. Probability Theory and Combinatorial Optimization, 1987.
[27] Barnabás Póczos, et al. On the Estimation of alpha-Divergences, 2011, AISTATS.
[28] Tom Burr, et al. Causation, Prediction, and Search, 2003, Technometrics.
[29] Bernhard Schölkopf, et al. Kernel-based Conditional Independence Test and Application in Causal Discovery, 2011, UAI.
[30] Maya R. Gupta, et al. Parametric Bayesian Estimation of Differential Entropy and Relative Entropy, 2010, Entropy.
[31] Fernando Pérez-Cruz, et al. Estimation of Information Theoretic Measures for Continuous Random Variables, 2008, NIPS.
[32] J. Yukich. Probability Theory of Classical Euclidean Optimization Problems, 1998.