Learning structural correspondences across different linguistic domains with synchronous neural language models