Synthesizing Robot Programs with Interactive Tutor Mode

With the rapid development of the robotics industry, domestic robots have become increasingly popular. Because domestic robots are expected to serve as personal assistants, it is important to develop a natural-language human-robot interaction system for end users who do not necessarily have much programming knowledge. To build such a system, we developed an interactive tutoring framework, named “Holert”, which automatically translates natural-language task descriptions into machine-interpretable logical forms. Compared with previous work, Holert allows users to teach the robot by further explaining their intentions in an interactive tutor mode. Furthermore, Holert introduces a semantic dependency model that enables the robot to “understand” similar task descriptions. We have deployed Holert on an open-source robot platform, Turtlebot 2. Experimental results show that the tutor mode improves system accuracy significantly, by 163.9%. The system is also efficient: even the longest task session, containing 10 sentences, is handled within 0.7 s.
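To make the overall idea concrete, the following is a minimal, hypothetical sketch of the kind of pipeline the abstract describes: a parser that maps natural-language task clauses to logical-form predicates and, when a clause is not understood, falls back to an interactive clarification step analogous to the tutor mode. The names (`parse_task`, `LOGICAL_FORMS`, the toy lexicon) are illustrative assumptions, not Holert's actual implementation, which relies on a semantic dependency model rather than this keyword matching.

```python
# Hypothetical sketch only: the lexicon and function names below are
# illustrative and are not taken from the Holert system itself.
import re

# Toy lexicon mapping verb phrases to logical-form predicates.
LOGICAL_FORMS = {
    "go to": "go_to",
    "pick up": "pick_up",
    "put down": "put_down",
}


def parse_clause(clause: str):
    """Map one clause (e.g. 'go to the kitchen') to a logical form
    such as go_to(kitchen); return None if the verb is unknown."""
    for phrase, predicate in LOGICAL_FORMS.items():
        if clause.startswith(phrase):
            argument = clause[len(phrase):].strip()
            argument = re.sub(r"^(the|a|an)\s+", "", argument)
            return f"{predicate}({argument.replace(' ', '_')})"
    return None


def parse_task(description: str, ask_user) -> list:
    """Translate a task description into a list of logical forms.
    Unparsed clauses trigger an interactive callback that asks the
    user to rephrase, loosely mirroring an interactive tutor mode."""
    forms = []
    for clause in re.split(r",|\band\b|\bthen\b", description.lower()):
        clause = clause.strip()
        if not clause:
            continue
        form = parse_clause(clause)
        while form is None:
            # Tutor-mode analogue: request an explanation in known terms.
            clause = ask_user(
                f"I don't understand '{clause}'. Can you rephrase it?"
            ).lower().strip()
            form = parse_clause(clause)
        forms.append(form)
    return forms


if __name__ == "__main__":
    # Example: prints ['go_to(kitchen)', 'pick_up(cup)']
    print(parse_task("go to the kitchen and pick up the cup", input))
```

In a deployed system the resulting logical forms would be handed to the robot's task executor (e.g. ROS nodes on the Turtlebot 2), and the clarification dialogue would also update the lexicon so that similar descriptions are understood later; this sketch omits both steps for brevity.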
