Emergent Language Generalization and Acquisition Speed are not tied to Compositionality

Studies of the discrete languages that emerge when neural agents communicate to solve a joint task often look for evidence of compositional structure. This stems from the expectation that such structure would allow the languages to be acquired faster by the agents and would enable them to generalize better. We argue that these beneficial properties are only loosely connected to compositionality. In two experiments, we demonstrate that, depending on the task, non-compositional languages can match or exceed compositional ones in both generalization performance and acquisition speed. Further research in the area should be clearer about which benefits are expected from compositionality, and how the latter would bring them about.
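
To make the comparison concrete, below is a minimal, illustrative sketch (not the code used in our experiments) of how acquisition speed and generalization can be measured for a fixed language. It assumes a toy world of two attributes with six values each, a compositional code that dedicates one symbol to each attribute value, a holistic code that assigns an arbitrary unique message to each input, and a small GRU listener implemented in PyTorch; all names and hyperparameters are illustrative. Acquisition speed is proxied by the first epoch at which the listener reconstructs every training input, and generalization by accuracy on held-out attribute combinations.

import itertools
import random

import torch
import torch.nn as nn

N_ATTR, N_VAL, MSG_LEN, VOCAB = 2, 6, 2, 36          # hypothetical toy setting
inputs = list(itertools.product(range(N_VAL), repeat=N_ATTR))

def compositional_message(inp):
    # One dedicated symbol per attribute value: fully compositional.
    return [a * N_VAL + v for a, v in enumerate(inp)]

random.seed(0)
_codes = random.sample(list(itertools.product(range(VOCAB), repeat=MSG_LEN)), len(inputs))
holistic_map = dict(zip(inputs, _codes))

def holistic_message(inp):
    # An arbitrary, structure-free message per input: non-compositional.
    return list(holistic_map[inp])

class Listener(nn.Module):
    # Embeds message symbols, reads them with a GRU, and predicts every attribute.
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, 32)
        self.rnn = nn.GRU(32, 64, batch_first=True)
        self.out = nn.Linear(64, N_ATTR * N_VAL)

    def forward(self, msg):
        _, h = self.rnn(self.emb(msg))
        return self.out(h.squeeze(0)).view(-1, N_ATTR, N_VAL)

def run(language, seed=0):
    torch.manual_seed(seed)
    rng = random.Random(seed)
    held_out = set(rng.sample(inputs, 6))             # unseen attribute combinations
    train = [i for i in inputs if i not in held_out]
    listener = Listener()
    opt = torch.optim.Adam(listener.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    def batch(items):
        return torch.tensor([language(i) for i in items]), torch.tensor(items)

    def accuracy(items):
        msgs, tgts = batch(items)
        with torch.no_grad():
            return (listener(msgs).argmax(-1) == tgts).all(-1).float().mean().item()

    first_solved = None
    for epoch in range(300):
        msgs, tgts = batch(train)
        loss = loss_fn(listener(msgs).reshape(-1, N_VAL), tgts.reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()
        if first_solved is None and accuracy(train) >= 0.99:
            first_solved = epoch                      # proxy for acquisition speed
    return first_solved, accuracy(list(held_out))     # speed, generalization

for name, lang in [("compositional", compositional_message), ("holistic", holistic_message)]:
    solved, heldout_acc = run(lang)
    print(f"{name:14s} solved training at epoch {solved}, held-out accuracy {heldout_acc:.2f}")

Under these assumptions, the script reports, for each language, when the listener first fits the training set and how accurately it labels unseen attribute combinations; the point of the experiments summarized above is that the ranking of the two languages on these measures depends on the task, not on compositionality alone.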
