MDL/Bayesian Criteria Based on Universal Coding/Measure
[1] C. S. Wallace, et al. Statistical and Inductive Inference by Minimum Message Length (Information Science and Statistics), 2005.
[2] Ray J. Solomonoff, et al. A Formal Theory of Inductive Inference. Part II, 1964, Inf. Control.
[3] Joe Suzuki, et al. On Strong Consistency of Model Selection in Classification, 2006, IEEE Transactions on Information Theory.
[4] David L. Dowe, et al. Foreword re C. S. Wallace, 2008, Comput. J.
[5] Joe Suzuki, et al. A Construction of Bayesian Networks from Databases Based on an MDL Principle, 1993, UAI.
[6] Raphail E. Krichevsky, et al. The performance of universal encoding, 1981, IEEE Trans. Inf. Theory.
[7] A. Barron. The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem, 1985.
[8] R. A. Leibler, et al. On Information and Sufficiency, 1951.
[9] Wray L. Buntine, et al. Learning classification trees, 1992.
[10] Joe Suzuki. The Universal Measure for General Sources and Its Application to MDL/Bayesian Criteria, 2011, 2011 Data Compression Conference.
[11] David L. Dowe, et al. MML, hybrid Bayesian network graphical models, statistical consistency, invariance, 2010.
[12] Boris Ryabko, et al. Compression-Based Methods for Nonparametric Prediction and Estimation of Some Characteristics of Time Series, 2009, IEEE Transactions on Information Theory.
[13] Gregory F. Cooper, et al. A Bayesian method for the induction of probabilistic networks from data, 1992, Machine Learning.
[14] C. S. Wallace, et al. An Information Measure for Classification, 1968, Comput. J.
[15] J. Rissanen, et al. Modeling by Shortest Data Description, 1978, Autom.
[16] Thomas M. Cover, et al. Elements of Information Theory, 2005.