Behavioral Indoor Navigation With Natural Language Directions

We describe a behavioral navigation approach that leverages the rich semantic structure of human environments to enable robots to navigate without an explicit geometric representation of the world. Building on this approach, we present our efforts to enable robots to follow navigation instructions given in natural language. Our proof-of-concept implementation translates natural language navigation commands into a sequence of behaviors that a robot can then execute to reach a desired goal.
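To make the translation step concrete, the sketch below maps a natural language command onto a sequence of behavior symbols. This is a hypothetical, rule-based illustration of the input/output only: the phrase lexicon and behavior names (`cf`, `tr`, `tl`, `io`) are assumptions for this example, whereas the actual system would use a learned translation model.

```python
# Hypothetical keyword-to-behavior lexicon (names are assumptions
# for illustration, not the paper's actual behavior vocabulary).
BEHAVIOR_LEXICON = {
    "follow the corridor": "cf",   # corridor-follow
    "turn right": "tr",            # turn-right
    "turn left": "tl",             # turn-left
    "enter the room": "io",        # enter through doorway
}

def translate(command: str) -> list[str]:
    """Greedily match lexicon phrases left-to-right in the command."""
    behaviors = []
    remaining = command.lower()
    while remaining:
        for phrase, behavior in BEHAVIOR_LEXICON.items():
            if remaining.startswith(phrase):
                behaviors.append(behavior)
                remaining = remaining[len(phrase):].lstrip(" ,.")
                break
        else:
            # Skip one word we cannot ground (e.g. "then", "and").
            remaining = remaining.split(" ", 1)[1] if " " in remaining else ""
    return behaviors

print(translate("Follow the corridor, turn right, then enter the room"))
# → ['cf', 'tr', 'io']
```

The resulting behavior sequence can then be handed to the robot's behavior executor, which runs each behavior until its termination condition is met before starting the next.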
