Affective communication system with multimodality for a humanoid robot, AMI

Nonverbal communication plays a vital role in human interaction. To interact sociably with a human, a robot must recognize and express emotions as a human does, and it must speak and determine its autonomous behavior while taking the human's emotional state into account. In this paper, we present an affective human-robot communication system for the humanoid robot AMI, which we designed to communicate with a human multi-modally through dialogue. It understands and expresses nonverbal cues through channels such as facial expressions, voice, gestures, and postures. Human-robot interaction is enabled by the affective communication framework presented in this paper, which allows the robot to recognize the emotional state of the current user and respond appropriately. As a result, the robot engages in dialogue naturally, choosing suitable conversation topics and behaving in ways that fit the human's emotions. Moreover, the human partner perceives the robot as more human-like and friendly, which further enhances the interaction between robot and human.
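
To make the kind of pipeline the abstract describes more concrete, the following is a minimal, hypothetical sketch of multimodal affect recognition followed by response selection: per-channel emotion estimates are fused into a single dominant emotion, which then drives topic and expression choice. The abstract does not specify implementation details, so all names, weights, and the fusion rule here are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch only: channel names, emotion labels, weights, and the
# weighted-sum fusion rule are assumptions, not taken from the paper.
from dataclasses import dataclass
from typing import Dict, List

EMOTIONS = ("happy", "sad", "angry", "surprised", "neutral")

@dataclass
class ChannelEstimate:
    """Emotion scores from one input channel (face, voice, gesture, posture)."""
    channel: str
    scores: Dict[str, float]   # emotion -> confidence in [0, 1]
    weight: float = 1.0        # assumed per-channel reliability

def fuse_emotions(estimates: List[ChannelEstimate]) -> str:
    """Weighted-sum fusion of per-channel scores; returns the dominant emotion."""
    totals = {e: 0.0 for e in EMOTIONS}
    for est in estimates:
        for emotion, score in est.scores.items():
            totals[emotion] += est.weight * score
    return max(totals, key=totals.get)

def select_response(user_emotion: str) -> dict:
    """Map the recognized user emotion to a conversation topic and robot expression."""
    policy = {
        "happy":     {"topic": "share_good_news",   "expression": "smile"},
        "sad":       {"topic": "console_user",      "expression": "concerned"},
        "angry":     {"topic": "calm_down",         "expression": "apologetic"},
        "surprised": {"topic": "ask_what_happened", "expression": "curious"},
        "neutral":   {"topic": "small_talk",        "expression": "neutral"},
    }
    return policy[user_emotion]

if __name__ == "__main__":
    estimates = [
        ChannelEstimate("face",  {"happy": 0.7, "neutral": 0.3}, weight=1.0),
        ChannelEstimate("voice", {"happy": 0.4, "sad": 0.2, "neutral": 0.4}, weight=0.8),
    ]
    emotion = fuse_emotions(estimates)
    print(emotion, select_response(emotion))
```

In this sketch the fusion step stands in for whatever recognition and integration scheme the full paper actually uses; the point is only to show how multimodal emotion estimates can feed a single response-selection policy.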