Laughter Type Recognition from Whole Body Motion

Despite the importance of laughter in social interactions, it remains little studied in affective computing. Respiratory, auditory, and facial laughter signals have been investigated, but laughter-related body movements have received almost no attention. The aim of this study is twofold. First, we investigate observers' perception of laughter states (hilarious, social, awkward, fake, and non-laughter) from body movements alone, through their categorization of avatars animated with natural and acted motion-capture data. Significant differences in torso and limb movements were found between animations perceived as containing laughter and those perceived as non-laughter. Hilarious laughter also differed from social laughter in the amount of spine bending, shoulder rotation, and hand movement. The body-movement features indicative of laughter differed between sitting and standing avatar postures. Second, building on these positive perceptual findings, we investigate whether the distributions of observers' ratings for the laughter states can be predicted automatically. The results show that automated laughter recognition rates approach human rating levels, with the Random Forest method yielding the best performance.
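The recognition task described above, predicting a distribution of observer ratings over the five laughter states rather than a single hard label, can be framed as multi-output regression. The following is a minimal sketch of that framing with a Random Forest, not the authors' pipeline: the feature set, data, and hyperparameters here are hypothetical stand-ins for the body-movement features described in the abstract.

```python
# Illustrative sketch (hypothetical features/data, not the paper's pipeline):
# predict the distribution of observer ratings over five laughter states
# from body-movement features with a multi-output Random Forest regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

STATES = ["hilarious", "social", "awkward", "fake", "non-laughter"]

# Hypothetical per-clip movement features, e.g. spine bending,
# shoulder rotation, and hand-movement energy (3 features per clip).
X = rng.normal(size=(120, 3))

# Targets: one rating distribution per clip (each row sums to 1),
# standing in for the proportions of observers choosing each state.
raw = rng.random(size=(120, len(STATES)))
Y = raw / raw.sum(axis=1, keepdims=True)

# RandomForestRegressor handles multi-output targets natively.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, Y)

# Forest predictions average simplex-valued targets, so they stay
# non-negative; renormalize to recover a proper distribution.
pred = model.predict(X[:1])[0]
pred = pred / pred.sum()
print(dict(zip(STATES, pred.round(3))))
```

Framing the problem this way preserves the disagreement among observers, which a single-label classifier would discard; the predicted distribution can then be compared against the human rating distribution directly.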
