Beyond the basic emotions: what should affective computing compute?

One of the primary goals of Affective Computing (AC) is to develop computer interfaces that automatically detect and respond to users' emotions. Despite significant progress, AC has emphasized the "basic emotions" (e.g., anger, disgust, sadness) at the expense of non-basic emotions. The present paper questions this emphasis by analyzing data from five studies that systematically tracked both basic and non-basic emotions. The results indicate that engagement, boredom, confusion, and frustration (all non-basic emotions) occurred at five times the rate of the basic emotions when generalizing across tasks, interfaces, and methodologies. Implications of these findings for AC are discussed.