Parallel Tempering for Training of Restricted Boltzmann Machines
Guillaume Desjardins | Aaron C. Courville | Yoshua Bengio | Pascal Vincent | Olivier Delalleau
[1] Geoffrey E. Hinton, et al. Using fast weights to improve persistent contrastive divergence, 2009, ICML '09.
[2] Radford M. Neal. Sampling from multimodal distributions using tempered transitions, 1996, Stat. Comput.
[3] L. Younes. On the convergence of Markovian stochastic algorithms with rapidly decreasing ergodicity rates, 1999.
[4] Geoffrey E. Hinton, et al. Exponential Family Harmoniums with an Application to Information Retrieval, 2004, NIPS.
[5] Yee Whye Teh, et al. A Fast Learning Algorithm for Deep Belief Nets, 2006, Neural Computation.
[6] Yoshua Bengio, et al. Justifying and Generalizing Contrastive Divergence, 2009, Neural Computation.
[7] Ruslan Salakhutdinov, et al. Learning in Markov Random Fields using Tempered Transitions, 2009, NIPS.
[8] Geoffrey E. Hinton. Training Products of Experts by Minimizing Contrastive Divergence, 2002, Neural Computation.
[9] Paul Smolensky, et al. Information processing in dynamical systems: foundations of harmony theory, 1986.
[10] Nicolas Le Roux, et al. Representational Power of Restricted Boltzmann Machines and Deep Belief Networks, 2008, Neural Computation.
[11] Yoshua Bengio, et al. Learning Deep Architectures for AI, 2007, Found. Trends Mach. Learn.
[12] Yukito Iba. Extended Ensemble Monte Carlo, 2001.
[13] Geoffrey E. Hinton. Products of experts, 1999.
[14] David Haussler, et al. Unsupervised learning of distributions on binary vectors using two layer networks, 1991, NIPS.
[15] Tijmen Tieleman, et al. Training restricted Boltzmann machines using approximations to the likelihood gradient, 2008, ICML '08.
[16] Miguel Á. Carreira-Perpiñán, et al. On Contrastive Divergence Learning, 2005, AISTATS.