Solving Bidirectional Tasks using MTRNN

In this paper we study the learning of bidirectional tasks in a Recurrent Neural Network (RNN). Most such models handle a flow of information in only one direction, either generating outputs or encoding inputs; however, using a single network to do both tasks simultaneously would be more efficient and more biologically plausible. We use a Multiple Timescales Recurrent Neural Network (MTRNN) to solve these tasks. The network proves capable of handling this bidirectional flow of information simply by being trained in both directions, with outputs becoming inputs and vice versa. We showcase this behaviour on two tasks, using the same network: the first is a sentence-learning task, akin to a classification problem; the second is a motor-trajectory-learning task, akin to a regression problem. The data used in these tasks was generated with an iCub robot. We present the results of these experiments and show that the model maintains its properties on the bidirectional tasks. We discuss possible future work using this ability to solve more complex scenarios such as action and language grounding.
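The dynamics underlying an MTRNN are those of a standard continuous-time RNN (leaky integrator), with each unit assigned its own time constant so that fast units track rapid changes while slow units integrate over longer spans. The NumPy sketch below illustrates one discrete-time update step of these dynamics; the layer sizes, time constants, and random weights are illustrative assumptions, not parameters reported in this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and time constants (assumptions, not the paper's values):
# fast units (small tau) react quickly, slow units (large tau) integrate
# over longer spans -- the "multiple timescales" of the MTRNN.
N_FAST, N_SLOW, N_IN = 20, 10, 4
N = N_FAST + N_SLOW
tau = np.concatenate([np.full(N_FAST, 2.0), np.full(N_SLOW, 30.0)])

W_rec = rng.normal(scale=0.1, size=(N, N))    # recurrent weights
W_in = rng.normal(scale=0.1, size=(N, N_IN))  # input weights

def mtrnn_step(u, x):
    """One discrete-time leaky-integrator update.

    u : (N,) internal states of the context units
    x : (N_IN,) current input frame
    """
    y = np.tanh(u)  # unit activations
    return (1.0 - 1.0 / tau) * u + (1.0 / tau) * (W_rec @ y + W_in @ x)

# Rolling the same cell forward over a sequence. In the bidirectional
# setting described above, the one network is trained on both pairings
# (e.g. sentence -> trajectory and trajectory -> sentence), so the
# input and target sequences swap roles between training directions.
u = np.zeros(N)
for x in rng.normal(size=(50, N_IN)):  # dummy 50-step input sequence
    u = mtrnn_step(u, x)
```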
