A crossover code for high-dimensional composition

We present a novel way to encode compositional information in high-dimensional (HD) vectors. Inspired by chromosomal crossover, the scheme recursively interweaves random HD vectors: a fraction of one vector's components is masked out and replaced by the corresponding components of another, with the mask chosen in a context-dependent way. Unlike the codes produced by many HD computing schemes, "crossover" codes overlap strongly with the codes of their base elements and sub-structures without sacrificing relational information, permitting fast element readout and decoding by greedy reconstruction. Crossover is mathematically tractable and has several properties desirable for robust, flexible representation.
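To make the idea concrete, here is a minimal Python/NumPy sketch of a crossover-style binding, under stated assumptions: bipolar (+1/-1) codes, a replacement fraction p = 0.5, and a context-dependent mask derived by hashing the parent vector. The mask construction and the helper names (`context_mask`, `crossover`) are hypothetical illustrations, not the paper's exact scheme.

```python
import hashlib
import numpy as np

D = 10_000  # dimensionality; HD codes typically use thousands of components
rng = np.random.default_rng(0)

def random_hd(d: int = D) -> np.ndarray:
    """Draw a random bipolar (+1/-1) HD vector for a base symbol."""
    return rng.choice(np.array([-1, 1], dtype=np.int8), size=d)

def context_mask(context: np.ndarray, p: float) -> np.ndarray:
    """Derive a reproducible boolean mask from the context vector.
    (Hypothetical mask scheme: hash the context to seed an RNG so the
    same context always selects the same components.)"""
    seed = int.from_bytes(hashlib.sha256(context.tobytes()).digest()[:4], "little")
    return np.random.default_rng(seed).random(context.size) < p

def crossover(parent: np.ndarray, child: np.ndarray, p: float = 0.5) -> np.ndarray:
    """Interweave two codes: mask out a fraction p of the parent's
    components and replace them with the child's, crossover-style."""
    mask = context_mask(parent, p)
    out = parent.copy()
    out[mask] = child[mask]
    return out

def sim(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized correlation; near 0 for unrelated random codes."""
    return float(a.astype(np.int32) @ b.astype(np.int32)) / a.size

A, B = random_hd(), random_hd()
AB = crossover(A, B)
print(sim(AB, A), sim(AB, B))   # both near 0.5: the composite overlaps its elements
print(sim(AB, random_hd()))     # near 0: unrelated code stays dissimilar
```

Because the composite code remains correlated with each constituent, element readout reduces to comparing the composite against the stored codebook; applying the same idea recursively is what would let a greedy decoder peel off one element at a time.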
