R. Thomas McCoy | Tal Linzen | Ewan Dunbar | Paul Smolensky
[1] Sanja Fidler, et al. Skip-Thought Vectors, 2015, NIPS.
[2] Shujian Huang, et al. Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder, 2017, ACL.
[3] Allen Newell, et al. Physical Symbol Systems, 1980, Cogn. Sci.
[4] Eran Yahav, et al. Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples, 2017, ICML.
[5] Ellie Pavlick, et al. Compositional Lexical Semantics in Natural Language Inference, 2017.
[6] Jürgen Schmidhuber, et al. Learning to Reason with Third-Order Tensor Products, 2018, NeurIPS.
[7] Dawn Xiaodong Song, et al. Tree-to-tree Neural Networks for Program Translation, 2018, NeurIPS.
[8] Virginia R. de Sa, et al. Learning Distributed Representations of Symbolic Structure Using Binding and Unbinding Operations, 2018, ArXiv.
[9] Li Deng, et al. Question-Answering with Grammatically-Interpretable Representations, 2017, AAAI.
[10] Richard Socher, et al. Pointer Sentinel Mixture Models, 2016, ICLR.
[11] Krystian Mikolajczyk, et al. Higher-Order Occurrence Pooling for Bags-of-Words: Visual Concept Detection, 2017, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[12] P. Smolensky. Tensor Product Variable Binding and the Representation of Symbolic Structures in Connectionist Systems, 1991.
[13] Yonatan Belinkov, et al. What do Neural Machine Translation Models Learn about Morphology?, 2017, ACL.
[14] Christopher D. Manning, et al. Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks, 2010.
[15] Xing Shi, et al. Does String-Based Neural MT Learn Source Syntax?, 2016, EMNLP.
[16] Sean M. Polyn, et al. Beyond mind-reading: multi-voxel pattern analysis of fMRI data, 2006, Trends in Cognitive Sciences.
[17] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[18] Chris Quirk, et al. Unsupervised Construction of Large Paraphrase Corpora: Exploiting Massively Parallel News Sources, 2004, COLING.
[19] Christopher Potts, et al. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank, 2013, EMNLP.
[20] Tony A. Plate, et al. Holographic reduced representations, 1995, IEEE Trans. Neural Networks.
[21] Li Deng, et al. Tensor Product Generation Networks for Deep NLP Modeling, 2017, NAACL.
[22] Douwe Kiela, et al. SentEval: An Evaluation Toolkit for Universal Sentence Representations, 2018, LREC.
[23] Jakob Uszkoreit, et al. A Decomposable Attention Model for Natural Language Inference, 2016, EMNLP.
[24] Wayne A. Wickelgren. Context-sensitive coding, associative memory, and serial order in (speech) behavior, 1969.
[25] Anima Anandkumar, et al. Tensor decompositions for learning latent variable models, 2012, J. Mach. Learn. Res.
[26] Jordan B. Pollack, et al. Recursive Distributed Representations, 1990, Artif. Intell.
[27] Christopher D. Manning, et al. Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, 2015, ACL.
[28] Yoshua Bengio, et al. Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, 2014, EMNLP.
[29] Eneko Agirre, et al. SemEval-2017 Task 1: Semantic Textual Similarity Multilingual and Crosslingual Focused Evaluation, 2017, SemEval.
[30] P. Smolensky. Symbolic functions from neural computation, 2012, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences.
[31] Yonatan Belinkov, et al. On the Evaluation of Semantic Phenomena in Neural Machine Translation Using Natural Language Inference, 2018, NAACL.
[32] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[33] Demetri Terzopoulos, et al. Multilinear image analysis for facial recognition, 2002, ICPR.
[34] Richard Montague, et al. English as a Formal Language, 1975.
[35] Christopher Potts, et al. A Fast Unified Model for Parsing and Sentence Understanding, 2016, ACL.
[36] L. Sirovich, et al. Low-dimensional procedure for the characterization of human faces, 1987, Journal of the Optical Society of America A: Optics and Image Science.
[37] C. Lee Giles, et al. Extraction of rules from discrete-time recurrent neural networks, 1996, Neural Networks.
[38] Allyson Ettinger, et al. Assessing Composition in Sentence Vector Representations, 2018, COLING.
[39] Razvan Pascanu, et al. Relational inductive biases, deep learning, and graph networks, 2018, ArXiv.
[40] Joshua B. Tenenbaum, et al. Separating Style and Content with Bilinear Models, 2000, Neural Computation.
[41] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[42] Christopher Potts, et al. A large annotated corpus for learning natural language inference, 2015, EMNLP.
[43] Geoffrey Zweig, et al. Linguistic Regularities in Continuous Space Word Representations, 2013, NAACL.
[44] M. Turk, et al. Eigenfaces for Recognition, 1991, Journal of Cognitive Neuroscience.
[45] Holger Schwenk, et al. Supervised Learning of Universal Sentence Representations from Natural Language Inference Data, 2017, EMNLP.
[46] J. Fodor, et al. Connectionism and cognitive architecture: A critical analysis, 1988, Cognition.
[47] Emmanuel Dupoux, et al. Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies, 2016, TACL.
[48] Grzegorz Chrupala, et al. Representation of Linguistic Form and Function in Recurrent Neural Networks, 2016, CL.
[49] Demetri Terzopoulos, et al. Multilinear independent components analysis, 2005, IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05).
[50] Noah D. Goodman, et al. Evaluating Compositionality in Sentence Embeddings, 2018, CogSci.
[51] Kuldip K. Paliwal, et al. Bidirectional recurrent neural networks, 1997, IEEE Trans. Signal Process.
[52] Brenda Rapp, et al. Representation of letter position in spelling: Evidence from acquired dysgraphia, 2010, Cognition.
[53] Quoc V. Le, et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.