Exploiting emotions to disambiguate dialogue acts

This paper describes an approach to inferring the user's intention from dialogue acts, thereby improving the effectiveness of natural interfaces to pedagogical agents. It focuses on cases where the intention is unclear from the dialogue context or utterance structure, but where it may still be identified from the user's emotional state. Emotion recognition is based on physiological user input. Our initial user study gave promising results supporting our hypothesis that physiological evidence of emotions can be used to disambiguate dialogue acts. This paper presents our approach to integrating natural language and emotions, together with our first empirical results, which may help endow interactive agents with emotional capabilities.
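
The abstract does not spell out the integration mechanism, so the following is only a minimal sketch, in Python, of one plausible realization: a toy threshold classifier maps physiological readings to an emotional state, and that state is combined probabilistically with dialogue-model priors to choose between candidate acts. Every dialogue-act label, threshold, and probability below is a hypothetical placeholder, not a value from the paper.

```python
# Hypothetical sketch: using physiological emotion evidence to
# disambiguate an ambiguous utterance such as "Can we move on?".
# All labels, thresholds, and probabilities are illustrative assumptions.

# Prior probabilities of candidate dialogue acts, assumed to come
# from a dialogue model that found the utterance ambiguous.
ACT_PRIORS = {
    "request_next_topic": 0.5,         # neutral request to continue
    "expression_of_frustration": 0.5,  # complaint about the current task
}

# Assumed likelihoods P(emotion | act).
EMOTION_LIKELIHOOD = {
    "request_next_topic": {"neutral": 0.7, "frustrated": 0.3},
    "expression_of_frustration": {"neutral": 0.2, "frustrated": 0.8},
}

def classify_emotion(skin_conductance: float, heart_rate: float) -> str:
    """Toy threshold classifier over two physiological features.

    A real system would train a model on labelled biosignal data;
    these thresholds are placeholders.
    """
    if skin_conductance > 5.0 and heart_rate > 90.0:
        return "frustrated"
    return "neutral"

def disambiguate(skin_conductance: float, heart_rate: float) -> str:
    """Return the dialogue act maximizing P(act) * P(emotion | act)."""
    emotion = classify_emotion(skin_conductance, heart_rate)
    scores = {
        act: ACT_PRIORS[act] * EMOTION_LIKELIHOOD[act][emotion]
        for act in ACT_PRIORS
    }
    return max(scores, key=scores.get)

if __name__ == "__main__":
    # Elevated arousal tips the interpretation toward frustration.
    print(disambiguate(skin_conductance=6.2, heart_rate=95.0))
```

In this toy combination the physiological evidence breaks a tie that the dialogue context alone could not; a deployed system would replace both the threshold classifier and the hand-set probabilities with models trained on real data.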
