Learning Text Pair Similarity with Context-sensitive Autoencoders
Hadi Amiri | Philip Resnik | Jordan Boyd-Graber | Hal Daumé
[1] Daniel Zwillinger,et al. CRC Standard Probability and Statistics Tables and Formulae, Student Edition , 1999 .
[2] Noah A. Smith,et al. What is the Jeopardy Model? A Quasi-Synchronous Grammar for QA , 2007, EMNLP.
[3] Noah A. Smith,et al. Tree Edit Models for Recognizing Textual Entailments, Paraphrases, and Answers to Questions , 2010, NAACL.
[4] Chih-Jen Lin,et al. Projected Gradient Methods for Nonnegative Matrix Factorization , 2007, Neural Computation.
[5] Ming Zhou,et al. Hierarchical Recurrent Neural Network for Document Modeling , 2015, EMNLP.
[6] Carina Silberer,et al. Learning Grounded Meaning Representations with Autoencoders , 2014, ACL.
[7] Hal Daumé,et al. Deep Unordered Composition Rivals Syntactic Methods for Text Classification , 2015, ACL.
[8] Eduard H. Hovy,et al. A Model of Coherence Based on Distributed Sentence Representation , 2014, EMNLP.
[9] Michael I. Jordan,et al. Latent Dirichlet Allocation , 2001, J. Mach. Learn. Res..
[10] David Buttler,et al. Exploring Topic Coherence over Many Models and Many Topics , 2012, EMNLP.
[11] Jeffrey Dean,et al. Distributed Representations of Words and Phrases and their Compositionality , 2013, NIPS.
[12] Florent Perronnin,et al. Aggregating Continuous Word Embeddings for Information Retrieval , 2013, CVSM@ACL.
[13] Jeffrey Pennington,et al. GloVe: Global Vectors for Word Representation , 2014, EMNLP.
[14] Yoshua Bengio,et al. A Neural Probabilistic Language Model , 2003, J. Mach. Learn. Res..
[15] Pascal Vincent,et al. Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion , 2010, J. Mach. Learn. Res..
[16] Matthew D. Zeiler. ADADELTA: An Adaptive Learning Rate Method , 2012, ArXiv.
[17] Yoshua Bengio,et al. Extracting and composing robust features with denoising autoencoders , 2008, ICML '08.
[18] Dat Quoc Nguyen,et al. Improving Topic Models with Latent Feature Word Representations , 2015, TACL.
[19] Alessandro Moschitti,et al. Automatic Feature Engineering for Answer Selection and Extraction , 2013, EMNLP.
[20] Cícero Nogueira dos Santos,et al. Learning Hybrid Representations to Retrieve Semantically Equivalent Questions , 2015, ACL.
[21] Hinrich Schütze,et al. AutoExtend: Extending Word Embeddings to Embeddings for Synsets and Lexemes , 2015, ACL.
[22] Ming-Wei Chang,et al. Question Answering Using Enhanced Lexical Semantic Models , 2013, ACL.
[23] Lei Yu,et al. Deep Learning for Answer Sentence Selection , 2014, ArXiv.
[24] Jeffrey Dean,et al. Efficient Estimation of Word Representations in Vector Space , 2013, ICLR.
[25] Chris Dyer,et al. Document Context Language Models , 2015, ICLR.
[26] David M. Blei,et al. Syntactic Topic Models , 2008, NIPS.
[27] Andrew Y. Ng,et al. Improving Word Representations via Global Context and Multiple Word Prototypes , 2012, ACL.
[28] Alessandro Moschitti,et al. Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks , 2015, SIGIR.
[29] Pascal Vincent,et al. Representation Learning: A Review and New Perspectives , 2012, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[30] Geoffrey E. Hinton,et al. Reducing the Dimensionality of Data with Neural Networks , 2006, Science.
[31] Andrew McCallum,et al. Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space , 2014, EMNLP.
[32] Chris Callison-Burch,et al. Answer Extraction as Sequence Tagging with Tree Edit Distance , 2013, NAACL.
[33] Zhiyuan Liu,et al. Topical Word Embeddings , 2015, AAAI.
[34] Quoc V. Le,et al. Distributed Representations of Sentences and Documents , 2014, ICML.
[35] Kyunghyun Cho,et al. Larger-Context Language Modelling , 2015, ArXiv.
[36] Zhiyuan Liu,et al. A Unified Model for Word Sense Representation and Disambiguation , 2014, EMNLP.
[37] Lukáš Burget,et al. Recurrent neural network based language model , 2010, INTERSPEECH.
[38] Geoffrey Zweig,et al. Context dependent recurrent neural network language model , 2012, 2012 IEEE Spoken Language Technology Workshop (SLT).
[39] Jeffrey Pennington,et al. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection , 2011, NIPS.