Associating gesture expressivity with affective representations

Affective computing researchers adopt a variety of methods for analysing or synthesizing aspects of human behaviour. The choice of method depends on which behavioural cues are considered salient or straightforward to capture and interpret, as well as on the overall context of the interaction. Each approach therefore focuses on modelling certain information and results in dedicated representations. Analysis and synthesis, however, are usually driven by label-based representations, which typically map directly to a feature vector. The goal of the presented work is to introduce an intermediate representational mechanism that associates low-level gesture expressivity parameters with a high-level dimensional representation of affect. More specifically, it introduces a novel methodology for associating easily extracted, low-level gesture data with the affective dimensions of activation and evaluation. For this purpose, a user perception test was carried out to annotate a dataset: participants assessed each gesture in terms of its perceived activation (active/passive) and evaluation (positive/negative) levels. In affective behaviour modelling, the contribution of the proposed association methodology is twofold. On the one hand, when analysing affective behaviour, it enables the fusion of expressivity parameters with any other modalities coded in higher-level affective representations, leading to scalable multimodal analysis. On the other hand, it supports the synthesis of composite human behaviour (e.g. facial expressions, gestures and body posture), since it allows dimensional values of affect to be translated into synthesized expressive gestures.
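The association the abstract describes can be pictured as a function from a vector of gesture expressivity parameters to a point on the activation/evaluation plane. The sketch below is only illustrative: the parameter set follows commonly used expressivity dimensions (overall activation, spatial extent, temporal extent, fluidity, power, repetition), and the linear weights are made-up placeholders, not the learned association of the paper, which is derived from the perception-test annotations.

```python
from dataclasses import dataclass


@dataclass
class ExpressivityParams:
    # Low-level gesture expressivity parameters, each normalised to [-1, 1].
    # This parameter set is a common convention, assumed here for illustration.
    overall_activation: float
    spatial_extent: float
    temporal_extent: float
    fluidity: float
    power: float
    repetition: float


def to_affect(p: ExpressivityParams) -> tuple:
    """Map expressivity parameters to (activation, evaluation), both in [-1, 1].

    The weights are hypothetical placeholders standing in for whatever
    mapping is learned from the annotated dataset.
    """
    activation = 0.5 * p.overall_activation + 0.3 * p.power + 0.2 * p.temporal_extent
    evaluation = 0.6 * p.fluidity + 0.4 * p.spatial_extent

    def clamp(v: float) -> float:
        return max(-1.0, min(1.0, v))

    return clamp(activation), clamp(evaluation)
```

A mapping in this direction supports analysis (fusing gesture cues with other dimensionally coded modalities); the inverse mapping, from a target (activation, evaluation) point back to expressivity parameters, is what gesture synthesis would require.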