Analyzing the Limitations of Cross-lingual Word Embedding Mappings
Aitor Ormazabal | Mikel Artetxe | Gorka Labaka | Aitor Soroa | Eneko Agirre