Interpretation of Situated Human-Robot Dialogues

The development of intelligent interfaces for human-machine interaction is an active research challenge. An important requirement for mobile robot assistants in office and home environments is to support multi-modal interaction with the user. This paper presents the multi-modal dialog system for BIRON - the Bielefeld Robot Companion. The main focus of this system is the interpretation of spontaneous speech in the context of mobile robot interaction. To meet the demands of this real-world scenario, the system also handles multi-modal information such as gestures or references to objects in the scene. To this end, the dialog manager and the understanding component are connected bidirectionally, enabling the integration of this information into the process of speech understanding.