Emergent Multilingual Language Acquisition Using Developmental Networks

There has been prior work on language acquisition, but it relied on symbolic representations and non-incremental learning. Neural networks are meant for incremental learning, but their performance has been weak, mainly due to a "lack of logic" in such networks. By language acquisition we mean incremental learning from lifetime experience. Since developmental networks (DNs) have a clearly understandable emergent "logic" in terms of finite automata and Turing machines, this is the first work on language acquisition based on a clearly understandable emergent Turing machine. We show how words are represented by patterns instead of handcrafted symbols, simulating naturally grounded and emergent inputs; contexts, as states/actions, are likewise represented by patterns. Our work demonstrates that handcrafted symbolic state features can be fully replaced by emergent input-context pattern pairs. This is a step toward fully automated acquisition of language by a grounded robot, but we are not there yet.
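
To make the idea concrete, here is a minimal, hypothetical sketch (not the authors' DN implementation) of an emergent finite automaton: each hidden "Y" neuron memorizes one input-context pattern pair through one-pass, Hebbian-style learning, and top-1 competition among these neurons drives the next state/action pattern, so no handcrafted symbolic transition table is needed. All names (EmergentFA, observe, step) and the one-hot encodings are illustrative assumptions, not from the paper.

```python
# A toy emergent finite automaton: transitions are stored as
# (input pattern, state pattern) pairs, not as symbolic rules.
# This is an illustrative assumption-laden sketch, not DN code.
import numpy as np


class EmergentFA:
    def __init__(self):
        self.y_weights = []  # one (input, state) pattern pair per Y neuron
        self.z_weights = []  # per-neuron next state/action pattern

    def observe(self, x, z, z_next):
        """One-pass learning of a single transition (x, z) -> z_next."""
        pattern = np.concatenate([x, z])
        for i, w in enumerate(self.y_weights):
            if np.array_equal(w, pattern):   # existing winner: update its output
                self.z_weights[i] = z_next.copy()
                return
        self.y_weights.append(pattern)       # otherwise recruit a new Y neuron
        self.z_weights.append(z_next.copy())

    def step(self, x, z):
        """Top-1 competition: the best-matching Y neuron drives the Z output.
        Assumes at least one transition has been observed."""
        pattern = np.concatenate([x, z])
        scores = [float(w @ pattern) for w in self.y_weights]
        winner = int(np.argmax(scores))
        return self.z_weights[winner]


def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v


# Hypothetical usage: words from two languages as input patterns,
# contexts (start/English/French) as state patterns.
fa = EmergentFA()
start, en, fr = (one_hot(i, 3) for i in range(3))
hello, bonjour = one_hot(0, 2), one_hot(1, 2)
fa.observe(hello, start, en)     # "hello" in the start context -> English state
fa.observe(bonjour, start, fr)   # "bonjour" in the start context -> French state
assert np.array_equal(fa.step(hello, start), en)
assert np.array_equal(fa.step(bonjour, start), fr)
```

In a real DN the patterns would be high-dimensional sensory and motor vectors learned in place, with top-k competition rather than exact matching; the sketch only illustrates how transitions can emerge from stored input-context pattern pairs.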
