Pseudo-Labeling Using Gaussian Process for Semi-Supervised Deep Learning
Zhun Li | ByungSoo Ko | Ho-Jin Choi