The Essential Role of Human Databases for Learning in and Validation of Affectively Competent Agents.
