Efficient Natural Language Interfaces for Assistive Robots

Language is a powerful tool that enables humans and robots to interact without the need for complex graphical interfaces. Statistical techniques for interpreting the meaning of utterances in complex domestic environments, however, remain computationally intensive and prone to error. Herein we present a model for language understanding that uses parse trees and environment models to infer both the structure and the meaning of the probabilistic graphical models used for symbol grounding. This model, called the Hierarchical Distributed Correspondence Graph (HDCG), exploits information about which symbols are expressed in the training corpus to learn rules for constructing graphical models that are faster to search. In a comparative experiment, we observe an order-of-magnitude improvement in the speed of probabilistic inference over the Distributed Correspondence Graph (DCG) model. We conclude with a discussion of potential applications in rehabilitation and assistive robotics, and of future directions for research.
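As a rough illustration of the inference the abstract describes, the sketch below implements a toy DCG-style search over Boolean correspondence variables between parse-tree phrases and candidate symbols, plus an HDCG-style coarse pass that prunes the symbol space before the expensive fine-grained search. This is a minimal sketch under stated assumptions, not the paper's implementation: the Phrase structure, the toy symbol strings, and the scoring factors are all hypothetical stand-ins.

```python
from dataclasses import dataclass, field
from itertools import product


@dataclass
class Phrase:
    """A node in the parse tree of the utterance."""
    text: str
    children: list["Phrase"] = field(default_factory=list)


def ground_dcg(phrase, symbols, factor):
    """DCG-style inference: one Boolean correspondence variable per
    (phrase, symbol) pair. Children are grounded first, and all
    2^|symbols| assignments for this phrase are scored exhaustively."""
    child_sets = [ground_dcg(c, symbols, factor) for c in phrase.children]
    best_score, best_set = float("-inf"), frozenset()
    for bits in product([False, True], repeat=len(symbols)):
        score = sum(factor(phrase, s, b, child_sets)
                    for s, b in zip(symbols, bits))
        if score > best_score:
            best_score = score
            best_set = frozenset(s for s, b in zip(symbols, bits) if b)
    return best_set


def ground_hdcg(phrase, symbols, coarse_factor, fine_factor):
    """HDCG-style inference: a cheap coarse pass decides which symbols
    the utterance could express, and the exponential fine-grained
    search then runs over only that pruned symbol space."""
    active = [s for s in symbols
              if coarse_factor(phrase, s, True, [])
              > coarse_factor(phrase, s, False, [])]
    return ground_dcg(phrase, active, fine_factor)


def toy_factor(phrase, symbol, expressed, child_sets):
    """Hypothetical factor: a symbol such as 'goal(kitchen)'
    corresponds to a phrase iff its argument appears in the text."""
    mentioned = symbol.split("(")[1].rstrip(")") in phrase.text
    return 1.0 if expressed == mentioned else 0.0


utterance = Phrase("go to the kitchen", [Phrase("the kitchen")])
symbols = ["goal(kitchen)", "goal(bedroom)", "path(hallway)"]
print(ground_hdcg(utterance, symbols, toy_factor, toy_factor))
# -> frozenset({'goal(kitchen)'})
```

This loosely mirrors where the reported speedup comes from: the coarse model's learned rules shrink the symbol space, so the exponential cost of the fine-grained search applies to far fewer correspondence variables.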
