An Information-Theoretic Approach to Optimally Calibrate Approximate Models
[1] D. Lindley. On a Measure of the Information Provided by an Experiment, 1956.
[2] K. Chaloner et al. Bayesian Experimental Design: A Review, 1995.
[3] A. Kraskov et al. Estimating mutual information, 2003, Physical Review E: Statistical, Nonlinear, and Soft Matter Physics.
[4] W. J. Studden et al. Theory of Optimal Experiments, 1972.
[5] A. O'Hagan et al. Bayesian calibration of computer models, 2001.
[6] S. Saigal et al. Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data, 2007, Physical Review E: Statistical, Nonlinear, and Soft Matter Physics.
[7] Thomas M. Cover et al. Elements of Information Theory, 2005.
[8] Sandia Report. Statistical Validation of Engineering and Scientific Models: Bounds, Calibration, and Extrapolation, 2005.
[9] Henry P. Wynn. Maximum entropy sampling, 1987.
[10] Shapour Azarm et al. A Sequential Information-Theoretic Approach to Design of Computer Experiments, 2002.
[11] C. E. Shannon. A mathematical theory of communication, 1948, Bell System Technical Journal.
[12] Bayesian Adaptive Exploration, 2004, astro-ph/0409386.
[13] Sai Hung Cheung et al. New Bayesian Updating Methodology for Model Validation and Robust Predictions Based on Data from Hierarchical Subsystem Tests, 2008.
[14] Gabriel Terejanu et al. Bayesian experimental design for the active nitridation of graphite by atomic nitrogen, 2011, arXiv.
[15] H. Wynn et al. Maximum entropy sampling and optimal Bayesian experimental design, 2000.
[16] Liam Paninski. Asymptotic Theory of Information-Theoretic Experimental Design, 2005, Neural Computation.
[17] Renato Vicente et al. An information-theoretic approach to statistical dependence: Copula information, 2009, arXiv.