Evaluating affective interactions: Alternatives to asking what users feel

In this paper, we advocate the use of behavior-based methods for evaluating affective interactions. We consider behavior-based measures to include both measures of bodily movements or physiological signals and task-based performance measures.

INTRODUCTION

Recent years have seen a large increase in research directed towards adding an affective component to human-computer interaction. The ability to measure user affect has become important for intelligent interfaces that aim either to establish believable interactions or to alter internal behavior based on the user's affect. Evaluating and interpreting this measure presents a challenge because of many ambiguities related to how affect is defined, communicated, and interpreted. Classical methods for evaluating affect tend to focus on questionnaires (asking you what you feel right now) or interviews, perhaps conducted after the experiment with a video of your performance in front of you, asking you instant by instant to recall what you felt at each moment during the earlier task. While such "self-report" methods are valuable, and we continue to use them in our work, this paper highlights some alternatives to self-report of feelings. The discussion below is divided into two categories: body measures (e.g. changes in muscle activity) and task measures (e.g. better ability to solve a creative problem).

BODY MEASURES OF AFFECT

The last decade has brought great strides in giving computers affective perceptual abilities, with new sensors for physiology and behavior, such as body-worn accelerometers, rubber and fabric electrodes, miniature cameras and microphones, and garment or accessory-type devices, along with new algorithms for recognizing patterns in the sensed signals, such as recognition of facial expressions from video or of stress patterns from thermal imagery of the face and other physiological measures. Body measures are not presented here as a replacement for other measures, but rather as additional information that may help combat some of the difficulties encountered with questionnaires and other more subjective methods. Possibly the biggest advantage is that body measurements can be taken in parallel with the interaction rather than interrupting the user or asking him after the task. An exhaustive list of body-based measures is beyond the scope of this paper; however, Table 1 cites a sample of existing methods (leaving out many examples of publications in each of these categories, and also leaving out entire categories, e.g. EEG- and ECG-based measures, and more). Clearly there are many possible body measures that may capture aspects of an affective state, including the combination of multiple modalities, which can reduce the uncertainty associated with relying on any single measure (Mednick et al. 1964; DeSilva et al. 1997; Huang et al. 1998; Picard et al. 2001; Kapoor et al. 2004); a minimal illustrative sketch of one such combination is given below. One benefit of these "body" measures is that they can provide additional insight into the user's emotional state without relying directly on his cognitive judgment of that state. Additionally, some of them can be used without the user's knowledge, perhaps with the goal of limiting the amount of misinformation that may arise from his feeling of being monitored. (This can also be seen as a drawback if one is concerned about privacy and about the use of sensing without a person's knowledge.)
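To make the multimodal-combination idea concrete, here is a minimal sketch, not a published procedure: it assumes each modality-specific recognizer (face, posture, pressure) independently outputs a probability distribution over the same small set of affective states, and fuses them with a naive-Bayes-style product. The state labels, probability values, and the fuse_modalities helper are illustrative assumptions, not values or an algorithm from the cited work.

```python
# A minimal late-fusion sketch (illustrative only): combine per-modality affect
# estimates by multiplying their posteriors, naive-Bayes style, and renormalizing.
# Labels, numbers, and helper names are assumptions made for this example.

def fuse_modalities(per_modality_probs, prior=None):
    """Fuse {modality: {state: P(state | that modality's evidence)}} into one
    normalized distribution over states, assuming conditional independence."""
    states = list(next(iter(per_modality_probs.values())).keys())
    if prior is None:
        prior = {s: 1.0 / len(states) for s in states}  # uniform prior over states

    fused = {}
    for s in states:
        score = prior[s]
        for probs in per_modality_probs.values():
            # Divide out the prior so it is not re-counted once per modality.
            score *= probs[s] / prior[s]
        fused[s] = score

    total = sum(fused.values())
    return {s: v / total for s, v in fused.items()}


if __name__ == "__main__":
    # Hypothetical recognizer outputs for one moment of an interaction.
    estimates = {
        "facial_video":   {"interested": 0.55, "bored": 0.25, "frustrated": 0.20},
        "chair_posture":  {"interested": 0.60, "bored": 0.30, "frustrated": 0.10},
        "mouse_pressure": {"interested": 0.30, "bored": 0.20, "frustrated": 0.50},
    }
    print(fuse_modalities(estimates))  # "interested" remains the most likely state
```

In practice the modality outputs are rarely independent, and a learned combiner (such as the probabilistic combination explored by Kapoor et al. 2004) would be preferable; the product rule simply shows how agreement across modalities sharpens the estimate while disagreement tempers it.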
Table 1. Body-based measures of affect (partial set of examples)

Facial Activity. Sensor: video (Tian et al. 2001; Bartlett et al. 1999; Donato et al. 1999; Cowie et al. 2001). Socially communicated: yes. Comments: facial expressions can differ significantly from genuinely felt feelings.

Facial Activity. Sensor: IR video, which highlights pupils and specularities (Kapoor et al. 2003). Comments: usually works better than ordinary video when the head moves (better eye detection).

Facial Activity. Sensor: thermal video (Pavlidis et al. 2002). Socially communicated: no. Comments: being explored to detect stress and other changes related to deception and frustration.

Posture Activity. Sensor: force-sensitive resistors (Smith 2000; Mota & Picard 2003; Tan et al. 2001). Socially communicated: yes, but not as pressure. Comments: good results discriminating level of interest in students in computer learning interactions.

Hand Tension & Activity. Sensor: force-sensitive resistors or sentograph (Clynes 1986; Reynolds 2001; Qi & Picard 2002; Dennerlein et al. 2003). Socially communicated: varies, depending on the gesture. Comments: can be sensed from handling of a mouse, steering wheel, etc.; pressure has been shown to be higher during a frustrating task.
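The last row of Table 1 notes that hand pressure sensed from a mouse tends to be higher during a frustrating task. As an illustration only, the sketch below flags task-period pressure that sits well above a per-user calm baseline; the readings, units, z-score threshold, and the pressure_elevation helper are all assumptions for this example, not the procedure used in the cited studies.

```python
# Illustrative sketch only: flag task-period hand/mouse pressure that is well
# above a per-user "calm" baseline. Readings, units, and threshold are assumed.

from statistics import mean, stdev

def pressure_elevation(baseline_samples, task_samples, z_threshold=2.0):
    """Return (z, flag): how many baseline standard deviations the mean task
    pressure sits above the baseline mean, and whether that exceeds the threshold."""
    mu, sigma = mean(baseline_samples), stdev(baseline_samples)
    if sigma == 0:
        return 0.0, False  # degenerate baseline; nothing meaningful to report
    z = (mean(task_samples) - mu) / sigma
    return z, z > z_threshold


if __name__ == "__main__":
    # Hypothetical force-sensitive-resistor readings (arbitrary units).
    calm_baseline = [0.21, 0.19, 0.22, 0.20, 0.23, 0.18, 0.21]
    frustrating_task = [0.35, 0.40, 0.31, 0.38, 0.36]
    print(pressure_elevation(calm_baseline, frustrating_task))
```

Comparing against a per-user baseline is one simple way to sidestep the large individual differences in resting grip pressure, which would otherwise swamp any fixed threshold.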

REFERENCES

[1] Jack T. Dennerlein, et al. Frustrating computer users increases exposure to physical factors, 2002.

[2] T. Sejnowski, et al. Measuring facial expressions by computer image analysis, 1999, Psychophysiology.

[3] Takeo Kanade, et al. Recognizing Action Units for Facial Expression Analysis, 2001, IEEE Trans. Pattern Anal. Mach. Intell.

[4] Mary Czerwinski, et al. Subjective Duration Assessment: An Implicit Probe for Software Usability, 2001.

[5] Carson Jonathan Reynolds, et al. The sensing and measurement of frustration with computers, 2001.

[6] Rosalind W. Picard, et al. A computational model for the automatic recognition of affect in speech, 2004.

[7] Kevin Larson, et al. The Aesthetics of Reading, 2005.

[8] Yuan Qi, et al. Fully automatic upper facial action recognition, 2003.

[9] A. Isen, et al. Positive affect facilitates creative problem solving, 1987, Journal of Personality and Social Psychology.

[10] Jennifer Healey, et al. Toward Machine Emotional Intelligence: Analysis of Affective Physiological State, 2001, IEEE Trans. Pattern Anal. Mach. Intell.

[11] S. Mednick, et al. Incubation of creative performance and specific associative priming, 1964, Journal of Abnormal Psychology.

[12] Teresa Marrin Nakra, et al. The "Conductor's Jacket": A Device for Recording Expressive Musical Gestures, 1998, ICMC.

[13] Yuan Qi, et al. Context-sensitive Bayesian classifiers and application to mouse pressure pattern classification, 2002.

[14] Alex Pentland, et al. A sensing chair using pressure distribution sensors, 2001.

[15] Ioannis Pavlidis, et al. Human behaviour: Seeing through the face of deception, 2002, Nature.

[16] Rosalind W. Picard. The Galvactivator: A glove that senses and communicates skin conductivity, 2001.

[17] George N. Votsis, et al. Emotion recognition in human-computer interaction, 2001, IEEE Signal Process. Mag.

[18] Yuri Ivanov, et al. Probabilistic combination of multiple modalities to detect interest, 2004, Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004).

[19] N. Schwarz. Situated cognition and the wisdom in feelings: Cognitive tuning, 2002.

[20] Atau Tanaka, et al. A Wireless, Network-based Biosensor Interface for Music, 2002, ICMC.

[21] Andreas Stolcke, et al. Prosody-based automatic detection of annoyance and frustration in human-computer dialog, 2002, INTERSPEECH.

[22] Tsutomu Miyasato, et al. Bimodal Emotion Recognition by Man and Machine, 2007.

[23] Deborah A. Small, et al. Heart Strings and Purse Strings, 2004, Psychological Science.

[24] A. Isen, et al. The Influence of Positive Affect on Clinical Problem Solving, 1991, Medical Decision Making.

[25] Karen Kay-Lynn Liu, et al. A personal, mobile system for understanding stress and interruptions, 2004.

[26] L. de Silva, et al. Facial emotion recognition using multi-modal information, 1997, Proceedings of the 1997 International Conference on Information, Communications and Signal Processing (ICICS).

[27] Marian Stewart Bartlett, et al. Classifying Facial Actions, 1999, IEEE Trans. Pattern Anal. Mach. Intell.

[28] Henry Lieberman, et al. A model of textual affect sensing using real-world knowledge, 2003, IUI '03.

[29] K. Scherer, et al. Acoustic profiles in vocal emotion expression, 1996, Journal of Personality and Social Psychology.

[30] Cynthia Breazeal, et al. Recognition of Affective Communicative Intent in Robot-Directed Speech, 2002, Auton. Robots.

[31] C. Elliott. The affective reasoner: a process model of emotions in a multi-agent system, 1992.

[32] Rosalind W. Picard, et al. Automated Posture Analysis for Detecting Learner's Interest Level, 2003, Conference on Computer Vision and Pattern Recognition Workshop.