Motion Analysis in Vocalized Surprise Expressions

Our research addresses the generation of natural, human-like motion during speech in android robots that have a highly human-like appearance. Mismatches between speech and motion are a source of unnaturalness, especially when emotional expressions are involved. Surprise expressions occur frequently in dialogue interactions and are often accompanied by verbal interjections. In this study, we analyze facial, head, and body motions during several types of vocalized surprise expressions appearing in human-human dialogue interactions, as well as the synchronization between motions and surprise utterances. The results indicate an interdependence between motion types and both the type of surprise expression (such as emotional, social, or quoted) and its degree.
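The abstract does not specify how motion-utterance synchronization was measured. As one plausible illustration only (not the authors' method), the sketch below pairs each annotated surprise-utterance onset with the nearest motion-event onset and summarizes the signed lags; all names and data here are hypothetical.

```python
# Illustrative sketch (not the paper's actual analysis): quantify
# synchronization between motion events and vocalized surprise as
# signed onset lags. Assumes onset times (seconds) are pre-annotated.

from statistics import mean, stdev

def onset_lags(utterance_onsets, motion_onsets):
    """For each utterance onset, find the nearest motion onset and
    return the signed lag (motion minus utterance); a negative lag
    means the motion began before the vocalization."""
    lags = []
    for u in utterance_onsets:
        nearest = min(motion_onsets, key=lambda m: abs(m - u))
        lags.append(nearest - u)
    return lags

# Hypothetical onsets from one dialogue session.
utterances = [12.40, 57.10, 83.25]      # vocalized surprise onsets
eyebrow_raises = [12.31, 56.95, 83.40]  # motion-event onsets

lags = onset_lags(utterances, eyebrow_raises)
print(f"mean lag: {mean(lags):+.3f} s, sd: {stdev(lags):.3f} s")
```

Nearest-onset pairing is only one possible alignment scheme; a windowed cross-correlation of continuous motion and acoustic features would be an equally plausible alternative.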
