Motion generation in android robots during laughing speech

We address the problem of generating natural, human-like motions during speech in android robots, which have human-like appearances. Automatic generation methods have so far been proposed for the lip and head motions of tele-presence robots, driven by the tele-operator's speech signal. In the present study, we aim to extend speech-driven motion generation to laughing speech, since laughter occurs frequently in natural dialogue and a mismatch between the audio and visual modalities can cause miscommunication. Based on an analysis of human behavior during laughing speech, we propose a motion generation method that takes the speech signal and the laughing speech intervals as input. Subjective experiments were conducted with our android robot using five different motion types covering several modalities. The evaluation results show the effectiveness of controlling different parts of the face, head and upper body (eyelid narrowing, lip corner/cheek raising, eye blinking, head motion and upper body motion).
