Semantic-Based Interaction for Teaching Robot Behavior Compositions Using Spoken Language

By enabling users to teach behaviors to robots, social robots become more adaptable, and therefore more acceptable. We improved an application for teaching behaviors so that it supports conditions closer to the real world: it accepts spoken instructions and remains compatible with the robot's other purposes. We introduce a novel architecture that enables 5 distinct algorithms to compete with each other, and a novel teaching algorithm that remains robust under these constraints: using linguistics and semantics, it can recognize when the dialogue context is adequate. We carry out an adaptation of a previous experiment in order to produce comparable results, demonstrate that all participants managed to teach new behaviors, and partially verify our hypotheses about how users naturally break down the teaching instructions.
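The abstract describes an architecture in which several algorithms compete to handle an utterance. As an illustration only (the paper does not publish this code), a minimal sketch of such an arbitration scheme might score each candidate interpreter on the incoming utterance and select the most confident one; all names below (`Interpretation`, `keyword_parser`, `fallback_chat`, `arbitrate`) are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Interpretation:
    action: str       # behavior the robot would trigger
    confidence: float # self-reported score in [0, 1]

def keyword_parser(utterance: str) -> Optional[Interpretation]:
    # Toy interpreter: detects a teaching request by keyword.
    if "teach" in utterance.lower():
        return Interpretation("start_teaching", 0.9)
    return None

def fallback_chat(utterance: str) -> Optional[Interpretation]:
    # Always applicable, so other purposes of the robot stay reachable,
    # but with low confidence so it loses to a specific match.
    return Interpretation("small_talk", 0.1)

def arbitrate(utterance, parsers):
    # Let every algorithm propose an interpretation, keep the best one.
    candidates = [p(utterance) for p in parsers]
    candidates = [c for c in candidates if c is not None]
    return max(candidates, key=lambda c: c.confidence)

best = arbitrate("Robot, I will teach you a new behavior",
                 [keyword_parser, fallback_chat])
```

In this sketch, `arbitrate` would return the `start_teaching` interpretation, while an unrelated utterance would fall through to the low-confidence chat handler, keeping the teaching mode compatible with the robot's other behaviors.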

[1] Guido Bugmann et al. Training Personal Robots Using Natural Language Instruction, 2001, IEEE Intell. Syst.

[2] Peter Ford Dominey et al. Improving quality of life with a narrative companion, 2017, 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN).

[3] Song-Chun Zhu et al. Jointly Learning Grounded Task Structures from Language Instruction and Visual Demonstration, 2016, EMNLP.

[4] Daniele Nardi et al. Teaching Robots Parametrized Executable Plans Through Spoken Interaction, 2015, AAMAS.

[5] Maya Cakmak et al. Power to the People: The Role of Humans in Interactive Machine Learning, 2014, AI Mag.

[6] Matthias Scheutz et al. Spoken Instruction-Based One-Shot Object and Action Learning in a Cognitive Robotic Architecture, 2017, AAMAS.

[7] Mohamed Chetouani et al. Semantic-based interaction for teaching robot behavior compositions, 2017, 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN).

[8] Manuela M. Veloso et al. Interactive robot task training through dialog and demonstration, 2007, 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[9] John R. Anderson et al. Interactive Task Learning, 2017, IEEE Intelligent Systems.

[10] Moritz Tenorth et al. CRAM — A Cognitive Robot Abstract Machine for everyday manipulation in human environments, 2010, 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[11] Giorgio Metta et al. iCub-HRI: A Software Framework for Complex Human–Robot Interaction Scenarios on the iCub Humanoid Robot, 2018, Front. Robot. AI.

[12] Peter Ford Dominey et al. Cognitive Robotics: Command, Interrogation and Teaching in Robot Coaching, 2006, RoboCup.

[13] Andrea Lockerd Thomaz et al. Robot Learning from Human Teachers, 2014, Robot Learning from Human Teachers.

[14] Matthias Scheutz et al. Recursive Spoken Instruction-Based One-Shot Object and Action Learning, 2018, IJCAI.

[15] Matthias Scheutz et al. DIARC: A Testbed for Natural Human-Robot Interaction, 2006, AAAI.

[16] Guillaume Gibert et al. Proof of concept for a user-centered system for sharing cooperative plan knowledge over extended periods and crew changes in space-flight operations, 2015, 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN).

[17] Guido Bugmann et al. Personal Robot Training via Natural-Language Instructions, 2001.

[18] Moritz Tenorth et al. Understanding and executing instructions for everyday manipulation tasks from the World Wide Web, 2010, 2010 IEEE International Conference on Robotics and Automation.

[19] J. Gregory Trafton et al. ACT-R/E, 2013, HRI 2013.

[20] Pierre-Yves Oudeyer et al. Pragmatic Frames for Teaching and Learning in Human–Robot Interaction: Review and Challenges, 2016, Front. Neurorobot.

[21] Charles J. Fillmore et al. Frames and the semantics of understanding, 1985.

[22] Jörg Conradt et al. Serendipitous Offline Learning in a Neuromorphic Robot, 2016, Front. Neurorobot.