MAUI: a multimodal affective user interface

Human intelligence is being increasingly redefined to include the all-encompassing effect of emotions upon what used to be considered 'pure reason'. With recent progress in computer vision, speech/prosody recognition, and bio-feedback, real-time recognition of affect will enhance human-computer interaction considerably, as well as assist further progress in the development of new emotion theories. In this article, we describe how affect, moods, and emotions closely interact with cognition, and how affect and emotion are the quintessential multimodal processes in humans. We then propose an adaptive system architecture designed to sense the user's emotional and affective states via three multimodal subsystems (V, K, A): namely (1) the Visual (from facial images and videos), (2) the Kinesthetic (from autonomic nervous system (ANS) signals), and (3) the Auditory (from speech). The results of the subsystems' sensing are integrated into a multimodal perceived user state. The multimodal anthropomorphic interface agent then adapts its interface by responding most appropriately to the current emotional states of its user, and provides intelligent multimodal feedback to the user.
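The integration step described above can be illustrated with a minimal late-fusion sketch: each of the three subsystems (Visual, Kinesthetic, Auditory) is assumed to output a probability distribution over a small set of emotion labels, and the perceived user state is their confidence-weighted combination. The function names, emotion labels, and weights below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical late fusion of the three MAUI sensing subsystems.
# Each modality supplies a dict mapping emotion label -> probability;
# the fused result is a renormalized weighted average.

EMOTIONS = ["anger", "fear", "sadness", "happiness", "neutral"]

def fuse_modalities(visual, kinesthetic, auditory, weights=(0.4, 0.3, 0.3)):
    """Combine per-modality emotion distributions into one perceived state."""
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = (weights[0] * visual.get(emotion, 0.0)
                          + weights[1] * kinesthetic.get(emotion, 0.0)
                          + weights[2] * auditory.get(emotion, 0.0))
    total = sum(fused.values())  # renormalize so the output is a distribution
    return {e: p / total for e, p in fused.items()}

def dominant_emotion(fused):
    """The label the interface agent would adapt its response to."""
    return max(fused, key=fused.get)
```

An agent built on this sketch would call `fuse_modalities` on each sensing cycle and select its interface behavior from `dominant_emotion`; weights could be tuned per user or per context.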
