On Consistency of Bayesian Inference with Mixtures of Logistic Regression
[1] Robert A. Jacobs, et al. Hierarchical Mixtures of Experts and the EM Algorithm, 1993, Neural Computation.
[2] Geoffrey E. Hinton, et al. Adaptive Mixtures of Local Experts, 1991, Neural Computation.
[3] M. Nakazawa, et al. Devroye, L., Györfi, L. and Lugosi, G.: A Probabilistic Theory of Pattern Recognition, Springer (1996), 1997.
[4] T. Choi. Convergence of posterior distribution in the mixture of regressions, 2008.
[5] Wenxin Jiang, et al. On the Approximation Rate of Hierarchical Mixtures-of-Experts for Generalized Linear Models, 1999, Neural Computation.
[6] Michael I. Jordan, et al. Convergence results for the EM approach to mixtures of experts architectures, 1995, Neural Networks.
[7] Fengchun Peng, et al. Bayesian Inference in Mixtures-of-Experts and Hierarchical Mixtures-of-Experts Models With an Application to Speech Recognition, 1996.
[8] László Györfi, et al. A Probabilistic Theory of Pattern Recognition, 1996, Stochastic Modelling and Applied Probability.
[9] Jiming Jiang, et al. Conditional inference about generalized linear mixed models, 1999.
[10] Wenxin Jiang. On the Consistency of Bayesian Variable Selection for High Dimensional Binary Regression and Classification, 2006, Neural Computation.
[11] M. Tanner, et al. Hierarchical mixtures-of-experts for exponential family regression models: approximation and maximum likelihood estimation, 1999.
[12] Herbert K. H. Lee. Consistency of posterior distributions for neural networks, 2000, Neural Networks.