Motion modification method to control affective nuances for robots

In human-robot interaction, robots often fail to elicit the intended reactions from humans because of their limited ability to express affective nuances. In this paper, we propose a motion modification method that blends affective nuances into arbitrary motions of humanoid robots, eliciting the intended human reactions by expressing affective states. The method is applicable to various humanoid robots that differ in degrees of freedom or appearance, and the affective nuances are expressed parametrically in a two-dimensional model composed of valence and arousal. The experimental results showed that the desired affective nuances could be expressed by our method, while also revealing some limitations. We believe the method will contribute to interactive systems in which robots communicate with expressions appropriate to various contexts.
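To make the parametric idea concrete, the sketch below shows one plausible way a valence–arousal pair could modulate an existing joint trajectory. This is only an illustration under assumed conventions, not the paper's actual algorithm: the mapping (arousal scales motion amplitude about the mean pose, valence biases a hypothetical `torso_pitch` joint upward or downward) and all gains are invented for the example.

```python
def modify_motion(trajectory, valence, arousal):
    """Blend an affective nuance into a joint trajectory (illustrative sketch).

    trajectory: list of frames, each a dict {joint_name: angle_in_radians}
    valence, arousal: floats in [-1.0, 1.0], the two circumplex dimensions
    """
    # Assumed mapping: arousal exaggerates or dampens motion amplitude,
    # valence shifts the torso posture up (positive) or down (negative).
    amplitude_gain = 1.0 + 0.3 * arousal   # gain of 0.3 is an arbitrary choice
    posture_bias = 0.1 * valence           # applied only to "torso_pitch" here

    # Mean pose per joint, used as the center for amplitude scaling.
    joints = trajectory[0].keys()
    mean = {j: sum(f[j] for f in trajectory) / len(trajectory) for j in joints}

    modified = []
    for frame in trajectory:
        modified.append({
            j: mean[j] + amplitude_gain * (frame[j] - mean[j])
               + (posture_bias if j == "torso_pitch" else 0.0)
            for j in frame
        })
    return modified
```

Because the modification is a pure function of the trajectory and the two affect parameters, the same routine applies to any humanoid whose motion is represented as per-joint angle sequences, regardless of its specific degrees of freedom.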
