An estimator of the mutual information based on a criterion for independence
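The paper builds a mutual-information estimator on an adaptive partitioning of the observation space (see [4] below). For orientation only, here is a minimal Python sketch of the simpler fixed-grid plug-in estimator that such adaptive schemes refine; the function name `mutual_information_plugin` and the bin count are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def mutual_information_plugin(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in nats from paired samples.

    Bins both variables on a fixed equal-width grid and applies
    I = sum_{i,j} p(i,j) * log( p(i,j) / (p(i) * p(j)) ).
    A simplified stand-in for the paper's adaptive partitioning.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()             # joint cell probabilities
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Usage: a strongly dependent pair gives a large estimate,
# an independent pair an estimate near zero.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
print(mutual_information_plugin(x, x + 0.1 * rng.normal(size=x.size)))
print(mutual_information_plugin(x, rng.normal(size=x.size)))
```

A fixed grid is known to bias the estimate when the sample is small or the dependence is concentrated; that is precisely the weakness the adaptive partitioning of [4] addresses.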
[1] C. E. Shannon. A mathematical theory of communication, 1948, Bell Syst. Tech. J.
[2] Igor Vajda, et al. Entropy expressions for multivariate continuous distributions, 2000, IEEE Trans. Inf. Theory.
[3] Georges A. Darbellay, et al. Forecasting the short-term demand for electricity: Do neural networks stand a better chance?, 2000.
[4] Igor Vajda, et al. Estimation of the Information by an Adaptive Partitioning of the Observation Space, 1999, IEEE Trans. Inf. Theory.
[5] A. Prochazka, et al. Signal Analysis and Prediction, 1998.
[6] Georges A. Darbellay, et al. Predictability: An Information-Theoretic Perspective, 1998.
[7] L. Györfi, et al. Nonparametric entropy estimation: An overview, 1997.
[8] Pierre L'Ecuyer, et al. Combined Multiple Recursive Random Number Generators, 1995, Oper. Res.
[9] C. Granger, et al. Modelling Nonlinear Economic Relationships, 1995.
[10] A. M. Fraser, et al. Independent coordinates for strange attractors from mutual information, 1986, Phys. Rev. A.
[11] I. D. Hill, et al. An Efficient and Portable Pseudo-Random Number Generator, 1982.
[12] H. Lilliefors. On the Kolmogorov-Smirnov Test for Normality with Mean and Variance Unknown, 1967.
[13] G. W. Snedecor. Statistical Methods, 1967.
[14] A. Rényi. On measures of dependence, 1959.
[15] Claude E. Shannon, et al. The mathematical theory of communication, 1950.