[1] Nan Hua et al. Universal Sentence Encoder, 2018, ArXiv.
[2] Matteo Pagliardini et al. Unsupervised Learning of Sentence Embeddings Using Compositional n-Gram Features, 2017, NAACL.
[3] Iryna Gurevych et al. Concatenated Power Mean Word Embeddings as Universal Cross-Lingual Sentence Representations, 2018, ArXiv:1803.01400.
[4] Bo Pang et al. Seeing Stars: Exploiting Class Relationships for Sentiment Categorization with Respect to Rating Scales, 2005, ACL.
[5] M. Marelli et al. SemEval-2014 Task 1: Evaluation of Compositional Distributional Semantic Models on Full Sentences through Semantic Relatedness and Textual Entailment, 2014, SemEval.
[6] Kevin Gimpel et al. Towards Universal Paraphrastic Sentence Embeddings, 2015, ICLR.
[7] Jeffrey Dean et al. Distributed Representations of Words and Phrases and their Compositionality, 2013, NIPS.
[8] Marco Idiart et al. Matrix Factorization using Window Sampling and Negative Sampling for Improved Word Representations, 2016, ACL.
[9] Sanja Fidler et al. Skip-Thought Vectors, 2015, NIPS.
[10] Karen Spärck Jones. A statistical interpretation of term specificity and its application in retrieval, 1972, Journal of Documentation.
[11] Ray Kurzweil et al. Learning Semantic Textual Similarity from Conversations, 2018, Rep4NLP@ACL.
[12] Kevin Gimpel et al. From Paraphrase Database to Compositional Paraphrase Model and Back, 2015, TACL.
[13] Lukasz Kaiser et al. Attention Is All You Need, 2017, NIPS.
[14] Ellen M. Voorhees et al. Overview of the TREC 2004 Novelty Track, 2005.
[15] Sanjeev Arora et al. A Simple but Tough-to-Beat Baseline for Sentence Embeddings, 2017, ICLR.
[16] Jeffrey Pennington et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[17] Chris Quirk et al. Unsupervised Construction of Large Paraphrase Corpora: Exploiting Massively Parallel News Sources, 2004, COLING.
[18] Kevin Gimpel et al. Revisiting Recurrent Networks for Paraphrastic Sentence Embeddings, 2017, ACL.
[19] Ming-Wei Chang et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[20] Mikhail Khodak et al. A La Carte Embedding: Cheap but Effective Induction of Semantic Feature Vectors, 2018, ACL.
[21] Christopher D. Manning et al. Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, 2015, ACL.
[22] Eneko Agirre et al. SemEval-2017 Task 1: Semantic Textual Similarity Multilingual and Crosslingual Focused Evaluation, 2017, SemEval.
[23] Bo Pang et al. A Sentimental Education: Sentiment Analysis Using Subjectivity Summarization Based on Minimum Cuts, 2004, ACL.
[24] Ewan Klein et al. Natural Language Processing with Python, 2009.
[25] Kevin Gimpel et al. Pushing the Limits of Paraphrastic Sentence Embeddings with Millions of Machine Translations, 2017, ArXiv.
[26] Christopher Joseph Pal et al. Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning, 2018, ICLR.
[27] Felix Hill et al. Learning Distributed Representations of Sentences from Unlabelled Data, 2016, NAACL.
[28] Honglak Lee et al. An efficient framework for learning sentence representations, 2018, ICLR.
[29] Kawin Ethayarajh et al. Unsupervised Random Walk Sentence Embeddings: A Strong but Simple Baseline, 2018, Rep4NLP@ACL.
[30] Bing Liu et al. Mining and summarizing customer reviews, 2004, KDD.
[31] Christopher Potts et al. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank, 2013, EMNLP.
[32] Alessandro Moschitti et al. KeLP at SemEval-2017 Task 3: Learning Pairwise Patterns in Community Question Answering, 2017, SemEval.
[33] Delphine Charlet et al. SimBow at SemEval-2017 Task 3: Soft-Cosine Semantic Similarity between Questions for Community Question Answering, 2017, SemEval.
[34] Holger Schwenk et al. Supervised Learning of Universal Sentence Representations from Natural Language Inference Data, 2017, EMNLP.