A fast method for prior probability selection based on maximum entropy principle and Gibbs sampler
[1] Song-Chun Zhu et al., "Prior Learning and Gibbs Reaction-Diffusion," IEEE Trans. Pattern Anal. Mach. Intell., 1997.
[2] J. Bernardo, "Reference Posterior Distributions for Bayesian Inference," 1979.
[3] Song-Chun Zhu et al., "Minimax Entropy Principle and Its Application to Texture Modeling," Neural Computation, 1997.
[4] Song-Chun Zhu, "Filters, Random Fields and Maximum Entropy (FRAME): Towards a Unified Theory for Texture Modeling," 1998.
[5] J. Bernardo et al., "An Introduction to Bayesian Reference Analysis: Inference on the Ratio of Multinomial Parameters," 1998.
[6] "Markov Chain Monte Carlo and Gibbs Sampling," 2002.
[7] G. Potamianos et al., "Partition Function Estimation of Gibbs Random Fields," 1993.
[8] J. Laurie Snell et al., "Markov Random Fields and Their Applications," 1980.
[9] Robert E. Kass et al., "Formal Rules for Selecting Prior Distributions: A Review and Annotated Bibliography," 1993.
[10] Donald Geman et al., "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images," IEEE Trans. Pattern Anal. Mach. Intell., 1984.
[11] Grahame B. Smith, comment on Stuart Geman and Donald Geman, "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images," 1987.