Language and Robotics: Complex Sentence Understanding

Existing robotic systems can take actions based on natural language commands, but those commands tend to be simple. In the field of Natural Language Processing (NLP), by contrast, complex sentences are routinely processed, yet this work rarely makes close contact with robotics. The early history of computer processing of natural language, traced back to a system such as Winograd's SHRDLU in the early 1970s, in fact aimed at Natural Language Understanding (NLU) of relatively complex sentences by a robotic system that then takes actions based on the natural language input. In the robotic context, NLU thus amounts to taking the correct actions in response to language instructions. This paper explores the use of cognitive linguistic constructs, together with other constructs such as spatial relationship constructs, to configure an NLU system that translates complex natural language instructions into actions to be taken by a robot. The work shows that two steps are necessary: first, translating the language-dependent surface sentential structure into a language-independent, deep-level predicate representation; second, translating that predicate representation into grounded real-world references and constructs that enable the robot to carry out the instructions.
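
To make the two-step pipeline concrete, the following is a minimal sketch in Python. All names here (`to_predicates`, `ground`, `Predicate`, the toy world model) are hypothetical illustrations rather than the system described in this paper; the sketch only shows the shape of the two translations: surface sentence, to language-independent predicate representation, to grounded robot actions.

```python
from dataclasses import dataclass

# Step 1 output: a language-independent, deep-level predicate representation.
@dataclass(frozen=True)
class Predicate:
    name: str    # e.g. "move", "on"
    args: tuple  # symbolic arguments, not yet tied to world objects

def to_predicates(sentence: str) -> list[Predicate]:
    """Step 1 (hypothetical): map a surface sentence to predicates.
    A real system would use a parser; a single hard-coded case
    stands in for that machinery here."""
    if sentence == "put the red block on the box":
        return [Predicate("move", ("red_block",)),
                Predicate("on", ("red_block", "box"))]
    raise ValueError("unparsed sentence: " + sentence)

# A toy world model: symbols ground to object ids with positions.
WORLD = {
    "red_block": {"id": 7, "pos": (0.2, 0.1, 0.0)},
    "box":       {"id": 3, "pos": (0.5, 0.4, 0.0)},
}

def ground(preds: list[Predicate]) -> list[str]:
    """Step 2 (hypothetical): resolve predicate arguments against the
    world model and emit executable robot actions."""
    actions = []
    for p in preds:
        if p.name == "move":
            obj = WORLD[p.args[0]]
            actions.append(f"grasp(object_{obj['id']})")
        elif p.name == "on":
            target = WORLD[p.args[1]]
            actions.append(f"place_at{target['pos']}")
    return actions

if __name__ == "__main__":
    preds = to_predicates("put the red block on the box")
    print(preds)          # language-independent representation
    print(ground(preds))  # grounded, executable actions
```

The point of the separation is that the predicate representation in step 1 carries no commitment to any particular language or robot, while step 2 is where world knowledge (object identities, positions, spatial relations) enters.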