Investigating Context Awareness of Affective Computing Systems: A Critical Approach

Abstract: Intelligent Human–Computer Interaction systems should be affect-aware, and Affective Computing systems should be context-aware. Positioned at the intersection of the research areas of Interaction Context and Affective Computing, the current paper investigates if and how context is incorporated in the automatic analysis of human affective behavior. Several related aspects are discussed, ranging from modeling, acquisition, and annotation issues in affectively enhanced corpora to issues related to incorporating context information in a multimodal fusion framework for affective analysis. These aspects are critically discussed in terms of the challenges they pose, while, in a wider framework, future directions of this recently active yet largely unexplored research area are identified. Overall, the paper aims both to document the present status of, and to comment on the evolution of, the emerging topic of Context in Affective Computing.
