Heading toward the natural way of human-machine interaction: the NIMITEK project

Spoken human-machine interaction supported by state-of-the-art dialog systems is becoming a standard technology. Considerable effort has been invested in this kind of artificial communication interface, yet spoken dialog systems (SDS) still cannot offer the user a natural way of communicating, because existing automated dialog systems do not pay enough attention to interaction problems related to affected user behavior. This paper addresses aspects of the design and implementation of user behavior models in dialog systems aimed at making human-machine interaction more natural. We discuss a viable technique for integrating speech-based emotion classification into an SDS, both for robust automatic recognition of affected speech and for a dialog strategy correlated with the user's emotional state. First, we describe existing methods for recognizing emotion in speech and ASR methods adapted to affected speech. Second, we introduce an approach to emotion-adaptive dialog management in human-machine interaction. A multimodal human-machine interaction system with an integrated user behavior model is being developed within the project “Neurobiologically Inspired, Multimodal Intention Recognition for Technical Communication Systems” (NIMITEK). NIMITEK currently provides a technical demonstrator for studying these principles on a dedicated prototypical task, namely solving the Towers of Hanoi puzzle. In this paper, we describe the general approach NIMITEK takes to emotional man-machine interaction.
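For readers unfamiliar with the demonstrator's prototypical task, the classical recursive solution to Towers of Hanoi can be sketched as follows. This sketch is illustrative only and is not part of the NIMITEK system, which observes the user solving the puzzle rather than solving it itself:

```python
def hanoi(n, source, target, spare, moves):
    """Collect the optimal move sequence for n disks from source to target."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target, moves)  # clear the n-1 smaller disks out of the way
    moves.append((source, target))              # move the largest remaining disk
    hanoi(n - 1, spare, target, source, moves)  # restack the smaller disks on top of it

moves = []
hanoi(3, "A", "C", "B", moves)
print(len(moves))  # 2**3 - 1 = 7 moves for three disks
```

The optimal solution for n disks takes 2^n - 1 moves, which makes the task well suited for provoking user frustration in a controlled way: the system can observe how far the user's behavior deviates from the known optimal move sequence.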