Recovering from failure by asking for help

Robots inevitably fail, often without the ability to recover autonomously. We demonstrate an approach for enabling a robot to recover from failures by communicating its need for specific help to a human partner using natural language. Our approach automatically detects failures, then generates targeted spoken-language requests for help such as "Please give me the white table leg that is on the black table." Once the human partner has repaired the failure condition, the system resumes full autonomy. We present a novel inverse semantics algorithm for generating effective help requests. In contrast to forward semantic models that interpret natural language in terms of robot actions and perception, our inverse semantics algorithm generates requests by emulating the human's ability to interpret a request using the Generalized Grounding Graph (G3) framework. To assess the effectiveness of our approach, we present a corpus-based online evaluation, as well as an end-to-end user study, demonstrating that our approach increases the effectiveness of human interventions compared to static requests for help.

Ross A. Knepper and Stefanie Tellex have contributed equally to this paper. This is one of several papers published in Autonomous Robots comprising the "Special Issue on Robotics Science and Systems".

Corresponding author: Ross A. Knepper (rak@cs.cornell.edu)

1 Department of Computer Science, Cornell University, Ithaca, USA
2 Computer Science Department, Brown University, Providence, USA
3 Department of Engineering, University of Cambridge, Cambridge, UK
4 Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, USA
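The core idea of inverse semantics — choosing the request that a listener is most likely to interpret as the desired action, as scored by a forward interpretation model — can be sketched as follows. This is a minimal, hypothetical illustration: the function names and the toy scoring model are invented for exposition and do not reproduce the paper's G3-based implementation.

```python
# Hypothetical sketch of inverse-semantics request selection.
# A real system would score candidates with a learned forward model (e.g. G3);
# here a toy word-overlap score stands in for P(correct interpretation | request).

def choose_help_request(candidate_requests, desired_action, p_correct):
    """Return the candidate request that maximizes the forward model's score,
    i.e. the request the human is most likely to interpret correctly."""
    best, best_score = None, float("-inf")
    for request in candidate_requests:
        score = p_correct(request, desired_action)
        if score > best_score:
            best, best_score = request, score
    return best

def toy_p_correct(request, desired_action):
    # Stand-in forward model: count how many words of the desired action
    # appear in the request. More specific requests score higher.
    return sum(word in request for word in desired_action.split())

requests = [
    "Help me.",
    "Please hand me the leg.",
    "Please give me the white table leg that is on the black table.",
]
print(choose_help_request(requests, "give white table leg black table", toy_p_correct))
```

Under this toy score, the most specific request wins, mirroring the paper's finding that targeted requests elicit more effective human interventions than generic ones.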
